
This is artfully researched. A great read and a valuable take on events that we'll need as a society to learn from, as the algorithm-powered platforms are not going anywhere.

Very strange to hear any "free speech" arguments in this thread. I can only assume those commenters haven't read the article, which enumerates multiple examples of speech that are equivalent to shouting fire in a crowded theater.




I not only read the article (which was great), but I was in the same space at the time, part of one of the civil society groups constantly trying to raise issues with Facebook.

It all rings very true to me. One thing to note is that not only did Facebook lack Burmese speakers internally, but so did the human rights groups with the best access to Facebook. As Kissane notes, Myanmar came online very quickly after the reforms, and -- unlike many countries in the Middle East or even China -- the story there was not one of authoritarian oppression but of a /release/ from authoritarian pressure, so it did not fit easily into the template that tech companies were slowly learning to respond to. A few years earlier, Facebook and Twitter had been shaken into some kind of recognition of their responsibilities by the Arab Spring and the Iranian protests, but the result had been a very, shall we say, US-centric view of how repression plays out globally. Bluntly, the violent genocide of Muslims by Buddhists, in a country led by Nobel Peace Prize winner and defender of democracy Aung San Suu Kyi, was a story that had to cut across many political and cultural presumptions of the US and Europe before it would be listened to.

I guess that brings me to your comment about "free speech". It seems a little petty to point you to the usual put-downs about how the phrase "shouting fire in a crowded theater" has historically been deployed to /silence/ people trying to /stop/ mass violence, because they are seen as troublemakers. (The phrase comes from Schenck v. United States, the 1919 case that used it to uphold the conviction of anti-war leafleters.)

I'll note, though, that the failures of Facebook and others at this time came about because people claimed to be able to moderate -- to provide a forum where only civil discussion and "the truth" would appear -- and claimed that Facebook could be swayed to stop the harm if you could just get through to the right people. Many of us, both in "free speech" organizations and in on-the-ground humanitarian groups, argued that this was a role Facebook was not playing and could not play: and the more it claimed it could take on this responsibility, the more terrible the consequences would be.


Thank you very much for your thoughtful reply. I appreciate your note about the history of "shouting fire in a crowded theater" as well.

As someone who was trying to open Facebook's eyes to their shortcomings at the time, what do you see as the larger lesson we can glean? It seems likely that blindness (for any reason) to the specific cultural dangers of new tech will become harder and not easier to spot as time goes on and these large organizations become further convinced of the completeness of their own understanding. I'd be curious to hear what you've learned generally that we can apply next time.


I believed, and still believe, that speech and content moderation simply isn't possible at the scale and staffing levels that Facebook and the other tech companies of the 2010s want to operate at. Civil society organizations struggled at the time to keep up with Facebook and warn it, and I don't think they can scale up either. I was at a human rights-related event the other day and somebody talked about leaving behind the "trashfire" of the Facebook Oversight Board. I can believe it -- it's the sort of solution you end up having to knit together at those scales. If we seriously believed we could create some sort of global speech government, why haven't we?

Conversations are intimate, contextual, and should be far more directly under the local control of the speakers. This feels counter-intuitive to many when we look at modern genocides like Myanmar and Facebook (and, before it, Rwanda and radio), where mass media played its part in fanning the flames. But censorship and control are temporary fixes to those deeper problems -- and they are solutions that feel more comfortable the further you get from the details of each disaster, or the closer and more familiar you are with those who hold the power to censor.

My instincts (and my work) assume that a lot of the problems come, as you say, from the centralization of these large tech organizations, but of course there are also plenty of challenges at the more human level. It's significant to me that Western traditions of free speech emerged from decades of vicious religious wars, and appear to be more stable than the cycles of repression and counter-repression that preceded them.


> Conversations are intimate, contextual, and should be far more directly under the local control of the speakers.

Fair enough… but what about people who use Facebook as a way of broadcasting information? (Such as Ashin Wirathu in the linked article.) To me, that case feels very different to ‘intimate, contextual conversations’. And it is fundamental to Facebook’s design that it blurs the distinction between these two cases.

Is the answer then to restrict such things more severely than ordinary conversations? I really have no idea. But our current way of doing things doesn’t seem to be working very well at all.


> I guess that brings me to your comment about "free speech". It seems a little petty to point you to the usual put-downs about how the phrase "shouting fire in a crowded theater" has historically been deployed to /silence/ people trying to /stop/ mass violence, because they are seen as troublemakers.

The speech regulation problem feels fairly different depending on whether the team doing the regulation lives in the society they're trying to regulate.

If you live in the society you're trying to regulate the speech of:

* People in your society will attack or praise your speech regulation actions, as moves in the local political chess game.

* As a member of society, you are likely to have a "dog in the fight" in any heated discussion where someone calls for speech regulation.

* As a member of society, your thinking (including your thinking about what to censor) is affected by what you read, which is itself affected by what speech gets censored. There can be a feedback loop.

In the US, "freedom of speech" used to be a left-wing talking point, back when the right had more cultural power. Nowadays it is a right-wing talking point, at a time when the left has more cultural power.

We generally can't expect censorship mechanisms to be used in a principled way. Censorship mechanisms are powerful political tools that powerful people will fight to obtain, and they will be disproportionately wielded by the powerful. See, for example, Elon Musk's purchase of Twitter.

Contrast all that with the Facebook/Myanmar situation, which I suspect is more a case of criminal apathy and/or greed.


I'll admit I don't know anything about the situation, but the phrase "violent genocide of muslims by buddhists" was NOT something I was expecting to hear anywhere.


Indeed, that's part of the point here.


The correct phrase would be: genocide of the Rohingya by a terrorist military junta.

Now they are killing the whole country, regardless of age, sex, religion, and skin color.


I read most of the article.

I don't buy the author's premise. Social media simply amplifies what the social network thinks is interesting or valuable. If society thinks some nasty, hateful writer is interesting, it boosts that person's content. It's no surprise that after decades of junta rule, and decades of fighting, the groups wouldn't exactly be singing the praises of their rivals on social media.

The author never once discusses the obvious: this shouldn't be about restricting speech (which is exactly what is being advocated) or about blaming FB; it was always about the recommendation algorithm.

Destroy the algorithm or disable it, and you literally have no basis for the article.


Why are you phrasing it like the algorithm is somehow separate from Facebook and its choices?


I don't like Facebook, but the purpose of Facebook is clear: keep you engaged with Facebook. Facebook was not (at least until the last decade or so) making deliberate choices about how that algorithm worked: popular content was sticky, unpopular content was not.

In that context, I'd argue that it was amplifying whatever the social network around you pushed up. And for the social network at the center of the article, a lot of dark thoughts were part of that. There wasn't a deliberate choice about what to push up; people chose what to push with their own clicks.

In fact, I'd say the algorithm was (is?) a black box even to most FB employees!
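
To make "popular content was sticky" concrete, here's a minimal sketch of engagement-weighted feed ranking -- the field names and weights are invented for illustration, not Facebook's actual system:

    # Hypothetical engagement-weighted ranking (illustrative only;
    # the field names and weights are made up, not Facebook's).
    def score(post):
        # Sum weighted interaction counts; note that nothing here
        # looks at what the post actually says.
        return (1.0 * post["clicks"]
                + 1.5 * post["shares"]
                + 2.0 * post["comments"])

    def rank_feed(posts):
        # Whatever people engage with most floats to the top.
        return sorted(posts, key=score, reverse=True)

A ranker like this is content-blind: a post that draws comments because it's inflammatory scores exactly the same as one that draws comments because it's useful.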


Facebook was faced with a trolley problem where they didn't pull the lever. As a result, tens of thousands of people were killed and many hundreds of thousands more were victimized and displaced.

Would there still have been violence had Facebook acted? Yes, but there would very clearly have been much less. Surely anyone working at Facebook at the time who had heard these warnings now deeply regrets that they didn't act. (By the way, President Clinton said that the failure to intervene in Rwanda was one of his biggest regrets. Imagine if you could simply have flipped a switch to disable radio broadcasts during that event -- that's the situation Facebook was in.) It doesn't matter whether they were actively promoting content or not.

The idea that the tech company is innocent because technology merely amplifies what already exists in humanity... is completely ridiculous. We all understand human nature, so it is easy to predict what will happen. And in this case there were many warnings ahead of time that were simply ignored.


> We all understand human nature, so it is easy to predict what will happen. And in this case there were many warnings ahead of time that were simply ignored.

We can't even predict tomorrow's weather with precision. Isn't it arrogant to think we can predict humans, particularly when they behave differently when making group decisions?

Don't we have literal disasters that happened because some people thought they could predict things (the SE Asia domino theory [1], China's sparrow destruction [2], and many others)? Worse, none of the "easy to predict" scenarios can be A/B tested, so there's zero guarantee that doing nothing is the solution.

[1] https://en.wikipedia.org/wiki/Quagmire_theory

[2] https://gizmodo.com/china-s-worst-self-inflicted-environment...



