I don't like Facebook, but its purpose is clear: keep you engaged with Facebook. Facebook was not (at least until the last decade or so) making deliberate choices about how that algorithm worked. Popular content was sticky; everything else wasn't.
In that context, I'd argue it was amplifying whatever the social network around you pushed up. And for the social network described in the article, a lot of dark thoughts were part of that. There was no deliberate choice about what to promote; people chose what to push with their own clicks.
In fact, I'd say the algorithm was (is?) a black box even to most FB employees!
Facebook was faced with a trolley problem where they didn't pull the lever. As a result, tens of thousands of people were killed and many hundreds of thousands more were victimized and displaced.
Would there still have been violence had Facebook acted? Yes, but it very clearly would have been much less. Surely anyone working at Facebook at the time who had heard these warnings now deeply regrets not acting. (By the way, President Clinton said that the failure to intervene in Rwanda was one of his biggest regrets. Imagine if you could simply flip a switch to disable radio broadcasts during that event: that's the situation Facebook was in.) It doesn't matter whether they were actively promoting content or not.
The idea that a tech company is innocent because its technology merely amplifies what already exists in humanity is completely ridiculous. We all understand human nature, so it is easy to predict what will happen. And in this case there were many warnings ahead of time that were simply ignored.
>>>We all understand human nature so it is easy to predict what will happen. And in this case there were many warnings ahead of time that were simply ignored.
We can't even predict tomorrow's weather with precision. Isn't it arrogant to think we can predict humans, particularly when they behave differently when making group decisions?
Don't we have literal disasters that happened precisely because some people thought they could predict things (the SE Asia domino theory [1], China's sparrow destruction campaign [2], and many others)? Worse, none of the "easy to predict" scenarios can be A/B tested, so there's zero guarantee that any given intervention, or doing nothing, is the solution.