> Why? Because Joe Biden was a pedophile whose son f~~ked underage Chinese children and took money from a Ukrainian dictator to influence US policy.
In other words, voting for Trump because "samy is my hero." :)
Could there be a class action lawsuit against the various companies whose recommendation engines hijacked people's attention to recommend and reinforce this garbage?
> Could there be a class action lawsuit against the various companies whose recommendation engines hijacked people's attention to recommend and reinforce this garbage?
People like to hand-wring about "the algorithm", but then they seem to fall short[1] of understanding that exposure, impressions, and engagement are sold to the highest bidders on social media platforms. Not only that, the platforms allow fine-grained targeting of users based on tomes of data collected on them.
These recommender systems don't just hijack people's attention as a side effect of increasing engagement; they do it by design, in a pathologically manipulative and anti-user way.
It isn't a coincidence that those with money and an agenda[2] can inject money into social media platforms and have their content spread like wildfire.
> Could there be a class action lawsuit against the various companies whose recommendation engines hijacked people's attention to recommend and reinforce this garbage?
I think this thought is spot on.
The usual defense is "but free speech!", which would boil down to "such is human nature". But I don't believe that's the problem. The problem may indeed be selection and amplification mechanisms like recommendation engines tuned to divert maximum attention to the medium, masterfully exploiting the vulnerabilities of the human psyche as evolution formed it. The rest is collateral damage which nobody seems to feel responsible for. Not a sustainable situation.
A good legal question is whether the selection/rejection of content by an algorithm tuned to provide financial benefit to the platform would be considered "editorial control".
If there were a dead simple filter where the user could pick friend groups, tags, and sorting criteria to tune their feed, this would not be an issue. Reddit, for instance, seems relatively simple in that respect: the presentation is a function of the subreddit and the votes.
But once the algorithm is driven by sponsorships, monetization opportunities, and opaque surveillance data, control of the presentation shifts from user to platform. One could argue that this should create some liability on the part of the platform.
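To make the distinction concrete, here is a minimal sketch (entirely hypothetical, not any platform's real ranking code) contrasting a feed that is a transparent function of the user's own choices with one reordered by opaque, platform-side weights:

    # Hypothetical sketch: user-controlled vs. platform-controlled feed ranking.
    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        tag: str                      # a subreddit-like community name
        votes: int
        sponsored_boost: float = 0.0  # hypothetical platform-side weight

    def user_controlled_feed(posts, subscribed_tags):
        """Presentation is a pure function of the user's picks and the votes."""
        visible = [p for p in posts if p.tag in subscribed_tags]
        return sorted(visible, key=lambda p: p.votes, reverse=True)

    def platform_controlled_feed(posts, engagement_model):
        """Ranking driven by scoring the user neither sees nor chose."""
        return sorted(posts,
                      key=lambda p: engagement_model(p) + p.sponsored_boost,
                      reverse=True)

    if __name__ == "__main__":
        posts = [
            Post("Quiet, accurate post", "news", votes=120),
            Post("Outrage bait", "news", votes=40, sponsored_boost=5.0),
            Post("Hobby discussion", "gardening", votes=80),
        ]
        print([p.title for p in user_controlled_feed(posts, {"news", "gardening"})])
        # A toy "engagement" score; the sponsored boost lets the bait win anyway.
        print([p.title for p in platform_controlled_feed(posts, lambda p: p.votes / 100)])

In the first case the user can point to exactly why a post appeared; in the second, the ordering depends on weights only the platform controls, which is the "editorial control" question above.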
If the editor of a publication were monetarily compensated for publishing lucrative slander, in a manner designed to maximize its credibility with certain audiences, and it resulted in harm to people, then they arguably could be held responsible. If the editor claimed that an algorithm decided to publish it, and the publication developed that algorithm, I don't think it would make them any less responsible.