I have friends who live in Myanmar and can confirm the ground situation was a lot worse. When all you see are hate speech posts (about beheading or, at best, driving away these low lives), it normalizes the hate and dehumanizes the victims. History is full of atrocities like this. Free speech is great to have. But violence propaganda is a real problem. You see some of that in America too.
Edit: to give people who don’t bother to read the article some idea, many villages were burned down, and before that happened, women were raped and killed, entire villages were looted, their inhabitants killed, and then the villages burned down. The goal was to create fear so that the victims would have no choice but to flee their own country. It worked.
You have to ask yourself why are we obsessed with free speech but you never hear anyone preaching free act? Why the arbitrary restriction to speech/expression? Ah, acts can cause harm to others; people can't just be allowed to do whatever they want, though they should be allowed to say whatever they want, because you can't harm other people by what you say, according to ... the "sticks and stones" principle.
The entire idea of free speech really rests on something as shaky as the sticks and stones principle!
Free speech lets us as a society determine which acts should be restricted.
My objection to speech restrictionists is that they rarely give a robust mechanism for deciding which speech should be restricted, a mechanism that's hardened against people abusing it to further their narrow self-interest.
Speech restrictionists also tend to ignore the "circular dependency problem": without anyone to defend a position, how do we know if that position is defensible? For example, suppose you live in a theocracy. You're an atheist. You start making your case for atheism. Just as you're about to make your case, the theocrats interrupt: "This speech is killing people. It's preventing them from reaching the blissful afterlife by converting them to atheism. This person is attempting eternal murder." And then throw you in jail.
I'm in favor of well-designed terms and conditions for a platform like Facebook. But I also think it's too easy for people to say: "I don't like what you said! You shouldn't be allowed to say it!" The de facto impact of that rhetoric essentially amounts to mob rule.
A lot of the problem is definitions. I'm not a free speech 'absolutist', so I guess I'm a 'restrictionist'. But what defines 'speech'? It can cover the spoken word, the written word, communication in art, photographs, video.
If a King said "off with his head", and someone carried out that order in a country where this was illegal, who did wrong? Is the King simply exercising his rights to free speech? What about paedophiles sharing images?
I suspect that it comes down to power imbalances. And that is something difficult to measure. A dictator and an influencer both have power to cause a lot of harm in what they say, as does an adult who shouldn't be near a child.
I agree with your example of the atheist, and in fact that has been the case for hundreds of years. The 'harm' in that case is something that people disagree on. An atheist and most scientists would disagree about the harm caused in that situation.
I'm a bit skeptical of the "power imbalances" approach. I don't think it becomes fine to say "off with the politician's head" just because the speaker is powerless. Additionally, if "power imbalances" are a core factor in your decision-making, that incentivizes people to argue that they are powerless, and "oppression olympics" discussions of this kind are hard to adjudicate. (For example, arguing for your own powerlessness works better if you already have some power. Impoverished people in developing countries have very little power, and often lack the time, money, and language skills necessary to communicate with us.)
If I were writing terms & conditions for a site like Facebook, I would try to distinguish between speech that aims to inform and speech that aims to intimidate. In general, I think the bar for censoring the former should be a lot higher. But it's not a hard rule; e.g. I wouldn't censor intimidating speech like "if you don't stop I'm gonna call the police".
I'm a lot more willing to censor speech directed at children, relative to speech directed at adults. Censoring speech directed at children doesn't run into the problems I discussed in my comment. (E.g. the circular dependency problem doesn't apply, because adults can still argue about what's OK to say to a child.)
> Speech restrictionists also tend to ignore the "circular dependency problem": without anyone to defend a position, how do we know if that position is defensible? For example, suppose you live in a theocracy. You're an atheist. You start making your case for atheism. Just as you're about to make your case, the theocrats interrupt: "This speech is killing people. It's preventing them from reaching the blissful afterlife by converting them to atheism. This person is attempting eternal murder." And then throw you in jail.
How is that supposed to be specific to speech? This argument generalizes to any prohibition: just because a dictator could use a law to put you in jail doesn't mean laws (or even jails) are inherently bad!
As an extreme example: should we allow rape or pedophilia because religious leaders in power are known to put people in jail for their sexual behavior? Obviously being put in jail for what you do in your sexual life is terrible, right, but that doesn't mean that the law has no say in any human sexual behavior either…
No right is absolute, it's always subject to the law. And all you can wish for is having a law for your country that sets the right balance between freedom and protection, and that's why democracy is important: so that the people can actually have a say on the balance, hopefully moving it to match the moral values of the society (though within the society there will always be debates on the balance, this is inevitable).
stop mixing together actions, meaning done (or doable) deeds, and statements, such as speech
an extreme example would be to ban all fictional writing because of the actions some reader may possibly take due to reading about some terrible, awful action.
> stop mixing together actions, meaning done (or doable) deeds, and statements, such as speech
This is a completely arbitrary distinction.
In fact there are many crimes for which speech is enough to get you jailed: if you give instructions for a murder, “all” you did was speech. Racketeering? Speech. Scams? Speech. Tax fraud? You just lied (= speech) to the IRS. Death threats? Speech. Harassment? Speech. Disclosing state secrets? Speech.
Should all of the above be legal because speech is supposed to be special?
Doch[0], never limit yourself to a single example.
I grew up in the UK, where teachers at the time were banned by law from saying it was OK to be gay:
> "shall not intentionally promote homosexuality or publish material with the intention of promoting homosexuality" or "promote the teaching in any maintained school of the acceptability of homosexuality as a pretended family relationship"
And there are socially taboo topics today. I suspect even mentioning some of them here might create a flame war and annoy dang, so I won't.
Instead I'll point to drug references in films and TV, where weed was for a long time as taboo as LSD and heroin; and that in Anglosphere media, sex is more taboo than violence ("Straight up murder? Put that in a kid's film. Nipples, on mammary glands, the defining characteristic of mammals and a thing that infants have a biological imperative to stick into their mouths in order to not starve to death before the invention of fake ones on milk bottles? Banned for being too sexual.")
Despite these examples of mistakes when restricting speech, I am not a free speech absolutist. This is because I'm not an anything absolutist: there are limits to all things, and finding the true boundaries isn't as trivial as pointing out the first two examples that come to mind; if there's one on either side, declaring the standard halfway between them, and if they're both on the same side, rejecting the possibility of the other.
[0] a German word that should exist in English: to be used to deny a negative, where "yes" or "no" might be ambiguous.
I don't think "doch" is particularly fitting, because the GP did not make a negative statement (rather a positive one about not looking further than some limit). Its usage felt weird to me as a response to the GP.
I guess for German speakers it fits in to more places. As a non-German speaker, at home I use it to express agreement with a statement where saying "yes" or "no" would be ambiguous.
I lean towards the absolutist position, but I think the big problem here is algorithms. If algorithms are promoting something, then it is no longer a simple case of free speech.
I hate it that these megacorps are hiding behind free speech when they are essentially acting like editors and sponsors. They will destroy free speech in order to increase their engagement metrics.
Because we live in a post-Enlightenment society, we believe that we should discuss everything, especially before doing something. We should be free to argue and discuss whatever, whenever, but not do anything on a whim.
> The entire idea of free speech really rests on something as shaky as the sticks and stones principle!
I think that, rather, it rests on the observation that restricting speech basically jams communication in society, which creates ossification and prevents development.
Indeed, you do have to ask yourself, and you do have to learn something about this rather than making up your own headcanon. A good place to start is John Stuart Mill's
Of the Liberty of Thought and Discussion: https://en.wikisource.org/wiki/On_Liberty/Chapter_2
For that to be true, you would have to deny the well-adjusted-ness of significant fractions of the population of Myanmar in recent years; of the persecutors of the Uyghurs and the Yazidis; of the perpetrators of the Darfur genocide, the Effacer le tableau, the Rwandan genocide, the Bosnian genocide, the Isaaq genocide, the Anfal genocide, …
…, the forces responsible for, and senior to, the My Lai massacre, …
…, the general civilian population voting for the explicitly racist Nazi party, …
Why? Because these things only happened as a result of the fact that speech is convincing.
Best you can do here is say "those people are not well-adjusted", which is fine except for where the mal-adjustments come from: speech.
Think about it in reverse: if speech had no power to change us, it would not even matter if it was free or not.
Speech policing isn't some magical panacea warding off conflict and violence. I can easily cite you historical exterminations at the hands of speech enforcers.
I'm not claiming it is a panacea. I'm saying the absence isn't one either.
The benefits and dangers of free speech are two sides of the same coin: the capacity to convince others to act in accordance with your beliefs.
"Well adjusted adults" as a phrase can only point to those who are within the Overton Window of their society, regardless of whether that society calls for things I condemn, or not.
This means that well-adjusted adults can, will, and have, formed lynch mobs upon hearing rumours that someone has eaten beef.
Nearly all (or all) of these regimes practiced far more speech censorship than others in order to suppress contradicting views, so the causality in these examples points in the wrong direction.
To be more clear, these regimes arose because of the lack of freedom of speech, not because of the excess of it.
What do you mean by not seeing the "free act" preaching? I believe that the American concept of freedom is precisely that, with free speech being just one, albeit particular, subset of that general freedom. Take the first amendment: all about free exercise of religion, the freedom of speech, the freedom of the press, the freedom of assembly. And then, the fifth amendment is about being "Innocent Until Proven Guilty". So, free to act, until it's proven to be against the specifically forbidden things.
I know that reality has much more nuance, but still, that's not the point. Freedom of speech is not alone.
I don’t see that distinction as arbitrary at all. It seems like, among no other good or obvious options, exactly the place to put the line between free speech and restricted action.
There is much “bad” free speech, but “bad” action is intolerable. The distinction between the results of bad speech and bad action are clear - even if it is unfortunately true that speech can inspire action.
It seems to me that there's an obvious alternative option. After all, if the reason we find the idea of free speech intuitively appealing is because we subscribe to some version of "speech can do no harm", why not just base everything upon the more fundamental harm principle ("one is free to V as long as V-ing causes no harm to others") which covers both the cases of speech and of act?
It's impossible to act in a world of limited resources without "harming" other people. The division between speech and action is practical under the circumstances.
Because language is software, the mind is hardware, and acts are behavior, and we don't have the tooling necessary to manipulate the software; it's dangerous to do things without proper tooling. Anyone focused specifically on freedom of speech, and not on general freedom, is not a real anarchist or anything of that sort if ... that is what you are after. But I'm not sure what your stance even is in the first place...
> The entire idea of free speech really rests on something as shaky as the sticks and stones principle!
The "Free Speech means I can say anything I want on any platform" stance is actually very new in the United States.
This only started after the counterculture of the late 1960s-70s, when libertarian individualism (yes, the hippies had a very socially libertarian stance) in the New Left and the New Right led to the revocation of rules such as the Fairness Doctrine, the Comstock Act, etc.
Before the 1970s-80s, American free speech doctrine was much closer to what you see in Canada, the UK, or Germany today: you are free to criticize and protest against the US Govt, but that doesn't mean you are free from moderation or censorship on a private or public platform.
Jurisprudence in the 70s-80s changed that by essentially removing the need for moderation, and that's how we have the irresponsible form of free speech that we have in the US today.
It's essentially the paradox of tolerance. The solution to that paradox is to ignore it. To quote Chidi Anagonye, this is why people hate moral philosophy professors.
It was happening since before the coup, and has been happening since independence. Kachin, Sagaing, Rakhine, Shan, and Chin States have all faced violent insurgencies since independence.
It’s ethnic cleansing, and we know that it works and that you can get away with it. That’s what the early founders of Israel did: hundreds of thousands of Palestinians fled and remain refugees to this day. Some of those who implemented these strategies went on to become leaders in Israeli society.
Oh please, offer a rebuttal. The HN community is one of the most well-informed and logical communities. If you are right and I am just spewing propaganda on behalf of Palestinian refugees, then they will support you and you will have helped advance the truth! J.S. Mill will be proud of you.
Why not? Is there a significant difference between (speculating about) those (alleged) genocides? I don't know anything about the events GP linked, but if someone made a (well-researched) post about them I would read it just as I've read this one.
What does the words of an idiot like Abbas have to do with systematic ethnic cleansing by Jewish paramilitary/terrorist groups like Haganah/Lehi/Irgun in the 1930s and 1940s? Was Abbas the reason they murdered villagers to scare away the population? Was Abbas the reason they assassinated prominent Palestinian Jews who opposed the foundation of Israel on Palestinian lands? Was Abbas the reason they bombed hotels and engaged in terrorism?
All this is just pure propaganda. I live here ... 40% of the population is Arab here in Jerusalem ... We walk in the same streets, we shop in the same malls, we study in the same universities, we work together.
90% of "refugees" were born in Egypt or Jordan or other Arab countries.
So your propaganda about ethnic cleansing and apartheid makes zero sense when Arabs are doctors in our hospitals and members of parliament in our government...
Maybe you should just stop repeating Palestinian propaganda and come see by yourself?
That the two peoples are walking and living together is honestly amazing to read; there is hope for the acceptance of Palestinians. Especially when you read about other areas of Palestine where settlers are harassing natives, or areas of Israel where the government is trying to move nomads out.
At the same time I will say that your comment on the refugees being born in a foreign country shows that you have written them off. These are humans, remember? Their parents were stripped of property and driven to neighboring countries, condemned to live as foreigners only hundreds of kilometers away from home. Isn’t it a tragedy when second-generation refugees live next door to where their parents lived? All so some European settlers could move in and take their lands and homes? We can’t just write them off as “not even born in Palestine”.
What part of the Myanmar justice system failed?
How do people get away with raping and killing people on camera with their identities known by the wider public without going to jail?
Are you asking a serious question? Myanmar is a failed state which lacks the rule of law. There is an ongoing civil war and the central government (which is widely considered to be illegitimate) doesn't even have effective control over large parts of the country.
https://en.wikipedia.org/wiki/Habeas_Corpus_Suspension_Act_(...
When the US entered its civil war, the justice system failed in some ways. And we can trace that partially to the suspension of habeas corpus. This is the kind of serious answer I would give if someone asked when the US justice system failed.
The start of the Rohingya genocide precedes the civil war. The military controlled the troops and the civil government controlled the police in genocidal operations. A lack of rule of law, yes, but not in the sense that some benevolent government simply lacked the power to act.
This is artfully researched. A great read and a valuable take on events that we'll need as a society to learn from, as the algorithm-powered platforms are not going anywhere.
Very strange to hear any "free speech" arguments in this thread. I can only assume those commenters haven't read the article, which enumerates multiple examples of speech that are equivalent to shouting fire in a crowded theater.
I not only read the article (which was great), but I was in the same space at this time: one of the civil society groups constantly trying to raise issues with Facebook.
It all rings very true to me: I think one thing to note is that not only did Facebook lack Burmese speakers internally, but so did the human rights groups with the best access to Facebook. As Kissane notes, Myanmar came online very quickly after the reforms, and -- unlike many countries in the Middle East or even China, because the story there was not one of authoritarian oppression, but a /release/ from authoritarian pressure -- it did not fit easily into the template that tech companies were slowly learning to respond to. A few years before this, Facebook and Twitter had been shaken into some kind of recognition of their responsibilities in the Arab Spring and Iranian protests, but the result of that had been a very, shall we say, US-centric view of how repression plays out globally. Bluntly, the violent genocide of Muslims by Buddhists, in a country led by Nobel Peace Prize winner and defender of democracy Aung San Suu Kyi, was a story that had to cut across many political and cultural presumptions of the US and Europe before it would be listened to.
I guess that brings me to your comment about "free speech". It seems a little petty to point you to the usual put-downs about how the phrase "shouting fire in a crowded theater" has historically been deployed to /silence/ people trying to /stop/ mass violence (because they are seen to be trouble-makers).
I'll note though that the failures of Facebook and others at this time came about because people claimed to be able to moderate and provide a forum where only civil discussion and "the truth" would be discussed -- and could be swayed to stop it, if you could just get through to the right people. Many of us, both in "free speech" organizations and on-the-ground humanitarian groups, argued that this was a role that Facebook was not, and could not play: and the more it claimed it could take on this responsibility, the more terrible the consequences would be.
Thank you very much for your thoughtful reply. I appreciate your note about the history of "shouting fire in a crowded theater" as well.
As someone who was trying to open Facebook's eyes to their shortcomings at the time, what do you see as the larger lesson we can glean? It seems likely that blindness (for any reason) to the specific cultural dangers of new tech will become harder and not easier to spot as time goes on and these large organizations become further convinced of the completeness of their own understanding. I'd be curious to hear what you've learned generally that we can apply next time.
I believed, and still believe, that speech and content moderation simply isn't possible at the scales and staffing that Facebook and other tech companies of the 2010s want to operate. Civil society organizations struggled at the time to match and warn Facebook, but I don't think they can scale up either. I was at a human rights-related event the other day and somebody talked about leaving behind the "trashfire" of the Facebook Oversight Board. I can believe it -- it's the sort of solution that you end up having to knit together at those scales. If we seriously believe we can create some sort of global speech government, why haven't we?
Conversations are intimate, contextual, and should be far more directly under the local control of the speakers. This feels counter-intuitive to many when we see modern genocides like Myanmar and Facebook (and before it, Rwanda and short-wave radio) where mass media played its part in fanning the flames. But censorship and control are temporary fixes to those deeper problems -- and it's a solution that feels more comfortable the further you get from the details of each disaster, or the closer and more familiar you are with those with the power to censor.
My instincts (and my work) assume a lot of the problems come, as you say, from the centralization of these large tech organizations, but of course there are also plenty of challenges at the more human level. It's significant for me that Western traditions of free speech emerged from decades of vicious religious wars, and appear to be more stable than the cycles of repression and counter-repression that preceded them.
> Conversations are intimate, contextual, and should be far more directly under the local control of the speakers.
Fair enough… but what about people who use Facebook as a way of broadcasting information? (Such as Ashin Wirathu in the linked article.) To me, that case feels very different to ‘intimate, contextual conversations’. And it is fundamental to Facebook’s design that it blurs the distinction between these two cases.
Is the answer then to restrict such things more severely than ordinary conversations? I really have no idea. But our current way of doing things doesn’t seem to be working very well at all.
>I guess that brings me to your comment about "free speech". It seems a little petty to point you to the usual put-downs about how the phrase "shouting fire in a crowded theater" has historically been deployed to /silence/ people trying to /stop/ mass violence (because they are seen to be trouble-makers).
The speech regulation problem feels fairly different depending on whether the team doing the regulation lives in the society they're trying to regulate.
If you live in the society you're trying to regulate the speech of:
* People in your society will attack or praise your speech regulation actions, as moves in the local political chess game.
* As a member of society, you are likely to have a "dog in the fight" for a heated discussion where someone calls for speech regulation.
* As a member of society, your thinking (including your thinking about what to censor) is affected by what you read, which is itself affected by what speech gets censored. There can be a feedback loop.
In the US, "freedom of speech" used to be a left-wing talking point, back when the right had more cultural power. Nowadays it is a right-wing talking point, at a time when the left has more cultural power.
We generally can't expect censorship mechanisms to be used in a principled way. Censorship mechanisms are powerful political tools that powerful people will fight to obtain, and they will be disproportionately wielded by the powerful. See, for example, Elon Musk's purchase of Twitter.
Contrast all that with the Facebook/Myanmar situation, which I suspect is more a case of criminal apathy and/or greed.
I'll admit I don't know anything about the situation, but the phrase "violent genocide of muslims by buddhists" was NOT something I was expecting to hear anywhere.
I don't buy the author's premise. Social media is simply amplifying what the social network thinks is interesting/valuable. If society thinks some nasty, hateful writer is interesting, it boosts that person's content.
It's not a surprise that if you had decades of junta rule, you also had decades of fighting; the groups wouldn't exactly be singing praises of their rivals on social media.
The author never once discusses the obvious: this shouldn't be about restricting speech (which is exactly what is being advocated), or blaming FB; it was always about the recommendation algorithm.
Destroy the algorithm or disable it, and you literally have no basis for the article.
I don't like facebook, but the purpose of facebook is clear: Engage you in using facebook. Facebook was not (at least until the last decade or so) making choices on how that algorithm works. Popular content was sticky, otherwise not.
In that context, I'd argue that it is enhancing whatever the social network around you pushed up. And for the social network that is part of the article, a lot of dark thoughts were part of that. There wasn't a deliberate choice about what to push up. The people chose what to push with their own clicks.
In fact, I'd say the algorithm was (is?) a black box to most FB employees itself!
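The "popular content is sticky" dynamic described above can be illustrated with a deliberately tiny, entirely hypothetical simulation. The post names and click-through rates below are invented, and this is a sketch of a generic rich-get-richer feedback loop, not Facebook's actual ranking system:

```python
# Toy sketch of an engagement-ranked feed. Posts that get more clicks
# get ranked higher, which earns them more impressions: a simple
# rich-get-richer feedback loop with no editorial decision anywhere.
import random

random.seed(42)  # deterministic for illustration

# Hypothetical true click-through rates for two posts.
posts = {"calm_news": 0.05, "outrage_bait": 0.15}
impressions = {name: 1 for name in posts}
clicks = {name: 1 for name in posts}

for _ in range(10_000):
    # Rank purely by observed engagement rate (clicks per impression)
    # and show only the top-ranked post.
    top = max(posts, key=lambda p: clicks[p] / impressions[p])
    impressions[top] += 1
    if random.random() < posts[top]:
        clicks[top] += 1

print(impressions)  # the higher-engagement post ends up with nearly all exposure
```

Even though nobody "chose" to promote the second post, greedy ranking on observed engagement starves the lower-engagement post of exposure. That is the sense in which such an algorithm can amplify inflammatory content without any deliberate editorial intent.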
Facebook was faced with a trolley problem where they didn't pull the lever. As a result, tens of thousands of people were killed and many hundreds of thousands more were victimized and displaced.
Would there still have been violence had Facebook acted? Yes, but it very clearly would have been much less. Surely anyone working at Facebook at the time who had heard these warnings now deeply regrets that they didn't act. (By the way, President Clinton said that failure to intervene in Rwanda was one of his biggest regrets. Imagine if you could simply flip a switch to disable radio broadcasts during that event-- that's the situation Facebook was in.) It doesn't matter whether they were actively promoting content or not.
The idea that the tech company is innocent because technology merely amplifies what already exists in humanity... is completely ridiculous. We all understand human nature so it is easy to predict what will happen. And in this case there were many warnings ahead of time that were simply ignored.
>>>We all understand human nature so it is easy to predict what will happen. And in this case there were many warnings ahead of time that were simply ignored.
We can't even predict tomorrow's weather with precision. Isn't it arrogant to think we can predict humans, particularly when they behave differently while making group decisions?
Don't we have literal disasters that happened because some people thought they could predict things (SE Asia domino theory [1], China's sparrow destruction [2], and many others)? Worse, none of the "easy to predict" scenarios can be A/B tested, so there's zero guarantee that doing nothing is the solution.
The attack on Meta here isn't really justified for the reasons Kissane seems to be providing. Take this paragraph, for example:
> By that point in 2018, Myanmar’s military had murdered thousands of Rohingya people, including babies and children, and beaten, raped, tortured, starved, and imprisoned thousands more. About three-quarters of a million Rohingya had fled Myanmar to live in huge, disease-infested refugee camps in Bangladesh.
That is the situation on the ground. The country is a powder keg, and there is evil afoot. Maybe there is more context here to come in part II, but it seems unreasonable to lay this at Facebook's feet from an external perspective. Facebook does not call for beatings, rape or killings.
I can certainly see Facebook's leadership caring about the PR and doing something. But that raises alternative problems - what exactly is the standard Facebook is meant to enforce? Should it ban everyone who is responsible for pointless killing? That includes the US leadership apparatus - should it ban everyone who voted Aye for the Iraq war? Afghanistan? Wars like Vietnam? (yes, the people who voted for it knew better). They won't do that. We'll have a large, powerful company applying uneven and subjective standards.
There is an issue there, but I think the specifics of why she is calling out Facebook are actually dangerously missing the point. The problem is the same no matter what influence Facebook tries to have - the problem is that the will of the people and the will of Facebook's leadership are radically different. In this case, the will of the people is unusually evil - but the problem is the clash and Facebook's position of influence, not which side is wrong on any particular day. Eventually, Facebook will be on the evil side.
Facebook didn't remove posts promoting ethnic violence in a timely manner. In fact, it did the opposite: its algorithms promoted these types of inflammatory posts to increase user engagement. It's not hard to understand why Meta is responsible unless you're being disingenuous; you didn't touch on that aspect. Facebook is not some harmless message transceiver. And not to mention, Facebook didn't have enough native speakers to moderate the content. So at best, they're negligent.
There are responsibilities to allowing people on your platform and having them speak to the internet; free speech is not absolute.
The article mentions a murderous Myanmarese monk [0]. I think he is responsible. And people like him.
There is a real question here about whether Facebook's algorithms created the zeitgeist or just reflected it. Since most countries have Facebook and most countries are avoiding Facebook-fueled mass murder sprees, I suspect it is more that their algorithms accurately reflected what a large number of people were thinking.
It is an interesting question, and maybe there will be compelling arguments made in Part II. But that hasn't happened in Part I.
[0] The gap that can emerge between the priests of a religion and the religious ideals would stun the naive. If this article is accurate then the man might be an anti-Buddha.
> As the violence in Mandalay worsens, the head of Deloitte in Myanmar gets an urgent call. Zaw Htay, the official spokesman of Myanmar’s president, needs help reaching Meta. The Deloitte executive works into the night trying to get someone to respond, but never gets through. On the third day of rioting, the government gets tired of waiting and blocks access to Facebook in Mandalay. The block works — the riots died down. And the Deloitte guy’s inbox suddenly fills with emails from Meta staffers wanting to know why Facebook has been blocked
The article suggested otherwise: the zeitgeist was one of peaceful indifference between the communities. It was only through Facebook's mass-communication reach that hateful political and religious leaders were able to leverage their authority to radicalize ordinary people into believing a hateful man on the Internet over their local customs and mores.
Leaders calling for racial genocide isn't really at that "fine end". Meta should be able to distinguish between that and "corporations should pay less taxes".
> There is a real question here about whether Facebook's algorithms created the zeitgeist or just reflected it.
That is a false dichotomy. It is possible for Facebook's role to have been to exacerbate existing religious tensions for profit. They are an actor in this story. Not either a "creator" of zeitgeist or an object with no agency.
>They’d been shown example after example of dehumanizing posts and comments calling for mass murder, even explicitly calling for genocide. And David Madden had told Meta staff to their faces that Facebook might well play the role in Myanmar that radio played in Rwanda. Nothing was subtle.
>After all that, Meta decided not to dramatically scale up moderation capacity, to permanently ban the known worst actors, or to make fundamental product-design changes to reliably deviralize posts inciting hatred and violence.
>Instead, in 2016, it was time to get way more people in Myanmar onto Facebook.
>That’s next, in Part II: The Crisis.
These would have been reasonable actions that Meta had a responsibility to take.
I have seen discussion of this problem by anti-tech writers become "Facebook caused these atrocities". I agree with you that it's overstating Facebook's role.
However, I don't think that places Facebook's conduct beyond criticism. I'm very interested to see Part II, as it sounds like there were more active steps by Facebook in that period, rather than just the passive inaction that characterises Part I.
But, "refusing to enforce your own policies" is action, and it sounds like Facebook is very guilty of this.
I stress that I'm waiting for Part II, because I don't think she's really gotten to the meat of it. But so far we've got
Case 1: Facebook doesn't enforce their policies. There is a genocide.
Plausible Case 2: Facebook enforces their policies. There is a genocide.
Plausible Case 3: Facebook enforces their policies, shapes the future of a country and learns they are bigger than the people. They start actively influencing elections worldwide to shape a pro-Facebook political consensus.
Case 2 is better than Case 1, but given Case 3 there is a strong argument that the conversation is misframed by focusing on Facebook's policies. The case where they act and succeed is just as worrying as the one where they do nothing, possibly more so, given that there is no evidence at all that Facebook pushing political opinions will result in good outcomes. The mass media have traditionally been cheerleaders for any number of atrocities and have helped shield corrupt people.
Why even consider these counterfactuals? Is asking Facebook to not amplify bald-faced calls for ethnic cleansing too much here? Can we agree that Facebook should make a conscious effort to NOT do that, and failing that withdraw from the market?
> Is asking Facebook to not amplify bald-faced calls for ethnic cleansing too much here? Can we agree that Facebook should make a conscious effort to NOT do that, and failing that withdraw from the market?
Sounds good to me. If you meet anyone who disagrees with any of that, let me know. But I don't think Kissane is aiming for that with this article. We don't have the full story yet, but it looks like it might be an attack on Facebook, using the crimes of the Myanmarese as emotional fuel.
> If you meet anyone who disagrees with any of that, let me know
Facebook appears to have disagreed with that; they amplified calls for ethnic cleansing and did not respond to concerns about it, so they must have believed that asking them not to was too much. That's the point.
That point hasn't been made. That is why I'm putting some time into this comment chain - we've got the journalism thing happening where people did terrible things, and people used Facebook, and then the journalist is painting Facebook as guilty by association without saying much specific that provides a link. And then letting low-empathy readers join the dots without considering what the people involved were likely thinking.
Bad people use Facebook. We don't need evidence to know that. This article is strong evidence that very bad people use Facebook, but it isn't at all clear that Facebook should be considered morally involved based on what has been presented so far.
Maybe the killing blow is yet to come. But I'm pretty sure any objective standard that gets Facebook in trouble here will get them in just as much trouble for letting Victoria Nuland or US four-star generals post publicly. There are a lot of brutes in public office.
Furthermore, getting involved in matters of war and peace is not a role Facebook will get praise for; it'll do some really terrible things if it goes down that path. They should be biased towards inaction, even and especially if they care.
Yes, this may have happened anyway. Yes, Facebook is not fully responsible. But I disagree with you. The lines are clear.
Facebook de facto became the internet in a country of ~50 million people by subsidising access to their platform through free data.
Their platform was developed in order to further their own goals - through maximising engagement and monetisation.
The second-order effect of this ambition was enabling people like Wirathu to reach hundreds of thousands of people with hate speech and calls for genocide.
Facebook were informed of this multiple times and, allegedly, did nothing about it. During this time they had one Burmese-speaking moderator.
Stating that they have no moral responsibility for the consequences of their actions is in my opinion horseshit. But it does align with certain aspects of the current American zeitgeist of entrepreneurship, free speech and platform "safe harbour" regulations.
This is not a view shared everywhere and should not be assumed when American tech companies scale out of the US. Thankfully this dogmatic approach is being regulated by the likes of the EU and other countries so these platforms are more aligned with their own moral frameworks.
Personally, I find Facebook absolutely morally responsible for parts of this, through the simple fact that they provided a platform for tens of millions of people, with severely lacking moderation, all in the chase of growth and profits.
This isn't exporting "freedom and democracy" to the world like the good old days. This is abhorrent profit maximisation with no regards for the consequences of their actions, hidden behind a thin veneer of moral rationalization.
I upvoted several posts of yours in this thread but man, are you implying FB was not actively promoting one side? I don't even believe it was "the algorithm".
Stuff like this does not happen by accident, nor in a vacuum. The only reason we don't hear more about this is that important people don't want us to.
Don't play the naivety card in topics like this.
Edit: downvote all you want, you horrible apologists. FB is a weapon and you know it.
> what exactly is the standard Facebook is meant to enforce? Should it ban everyone who is responsible for pointless killing? That includes the US leadership apparatus - should it ban everyone who voted Aye for the Iraq war? Afghanistan? Wars like Vietnam? (yes, the people who voted for it knew better). They won't do that. We'll have a large, powerful company applying uneven and subjective standards.
I completely agree that Facebook isn't well placed to make these decisions, however the architects of the Iraq War faced almost no consequences. I wouldn't be upset if Facebook banned Bush. I agree with you in principle but the example you chose is exactly one that I would be comfortable with. Incidentally Facebook did ban a different former president, so I guess your point about Facebook's position of influence is true.
Basically, Facebook were the equivalent of the radio stations in Rwanda. The difference being that if Facebook had assigned adequate resources, they could have removed egregious content.
Would that have stopped the killings? It might have slowed down the spread of rage.
Or it could have furthered censorship and propaganda. It's easy to sit on your hands and think the status quo is fine in Rwanda if every bit of footage revealing how bad it was is suppressed.
Radio is a tool, used to terrible effect by several groups to further their aims. Rwanda was and is _horrific_: more than three-quarters of a million people died. So were Bosnia, Timor, Lebanon.
You can't go from 0-100% genocide without a concerted effort.
Should Facebook have done more? You fucking betcha. Are they guilty of fanning the flames? Totally.
Are they responsible for the genocide? No. That is very much down to the military junta, aided by Aung San Suu Kyi.
Several NGOs tried at the time to raise awareness that a possible genocide was being coordinated on Facebook. Facebook did nothing to interfere, even though their platform is not just a passive message board: it actively surfaces high-"engagement" posts to as many users as possible.
We have to be able to value free speech and a free exchange of ideas, and accept that this means we will have shady web sites inciting war and violence, while at the same time holding mass media responsible for their actions.
Whoever runs Facebook should be just as responsible as said shady web site. We should be talking personal responsibility in a literal legal manner here, not having philosophical arguments.
Keep in mind that Facebook at the time was still trying to win users over from web forums, had the Arab Spring at its back, and did not miss a chance to talk about how it could change the world and had the political clout to do it.
I'm intrigued by this language. What do you mean by it? Do you believe in evil, are you religious? Do you believe people can be evil, or are you talking more about evil acts? I'm just curious because it stood out for me.
It is fair to say I'm irreligious. And, arguably, that I do not believe in evil. But we have a word, 'evil', and I feel it is the most appropriate word to describe people who coordinate genocides and/or mass violence.
I don't know anything about Myanmar, but I would like to add that we should not confuse the roles of the medium, the message, and the participants. In India, I see the most toxic messages flowing through WhatsApp groups. These are personal groups with no ranking or recommendation algorithms, just people forwarding messages, which is primarily a symptom of the increasing radicalization of the underlying society.
The problem is the scale. WhatsApp groups of colleagues or friends have a very limited reach and exposure.
People don’t suddenly wake up radicalized. They are introduced and brainwashed into it by other people via some communication medium.
Facebook just happens to not only be the largest but to also boost and promote inflammatory content via its recommendation algorithm.
It's a sad story. As much as I like "free speech" to be the answer, this is a pretty chilling downstream result of a platform not being able to do heavy human moderation.
I don't get this though: even in the US, how can you perform moderation at such a large scale? Should we moderate the whole internet as well? The internet was always moderated within each community. If you have a forum, or a page, or a group chat, or whatever, then it's on you to moderate your community. The same is true for a church, or a volleyball club, or a school, etc.
> As much as I like "free speech" to be the answer, this is a pretty chilling downstream result of a platform not being able to do heavy human moderation.
A net income of $23 billion in 2022. Meta is willingly not doing heavy human moderation; profit matters more than preventing genocide.
You're suggesting a lack of heavy moderation results in genocide? I can point you to a million examples where that hasn't happened.
If the warehouse full of moderators doesn't show up to work one day, genocide doesn't therefore happen. The answer isn't slash-and-burn moderation at scale. A far better idea is good old transparency: raw information, no censorship, only improved contextual information up and down screens, with accurate, helpful facts and related info. It's proven to be useful and actually counters false information right when it happens.
That's what you do to respect people, even those in need of educating and correcting from violent ways. They have internet phones, if it's not Meta/FB, it's something else.
>They have internet phones, if it's not Meta/FB, it's something else.
The article discusses the fact that Facebook access was widely subsidized for users by exempting it from data charges, which allowed it to out-compete other news apps. So Facebook actively marketed itself into the dominant position, which is then defended through network effects. This was not a free market, and if a company is going to actively seek a monopoly on information access, then it does have a responsibility over that information.
Putting all philosophical arguments aside, the very simple fact is that Facebook could have helped prevent genocide.
If this topic were "simple facts", the main article wouldn't be more than 15,000 words over 2 parts. Would you conclude the "postman could have helped prevent crime" if only he had not delivered the mail?
> Putting all philosophical arguments aside
Why should we do that? Maybe Facebook is a victim. Their services were used for serious crime, in violation of their terms. The mobile internet is powerful. If any given community can't be trusted with mobile internet, that's not the fault of mobile internet. All the other concerns such as market dominance and cornering the market, are for sure dubious and sleazy in all the usual ways. But not "genocide level" unethical.
Read this. Once I started, I couldn't stop. It's a far quicker read than it looks (though this is just part 1) and utterly essential.
Kissane has put together a concise, readable, and horrifying account of both the background of the Rohingya genocide and how Meta actively fanned the flames.
Her argument for Meta's culpability is simple and effective. Meta faced two problems in Myanmar: (1) not enough Facebook users, and (2) increasing incitement and hate speech that was clearly and repeatedly stoking anti-Rohingya violence. The company repeatedly showed that it was only interested in growing the user base, even though they were clearly aware of how that growth fed a genocide.
Is there another side to the story? Seems like revenue on (at the time) a couple million users in Myanmar would not be worth the PR hassle. What’s Facebook’s incentive here? What content are they supposed to ban, and why is that content so popular with Burmese people?
Corporate inertia. There wasn't clear ownership within FB to manage these kinds of political minefields back then. You have to remember that this was the 2010-2015 timeframe and social media was riding high due to the impact it had during the Arab Spring only a couple years prior.
> why is that content so popular with Burmese people
Because society in Myanmar is extremely communalized, with both ethnic strife among the 70 different ethnic groups and also religious strife between the 4 main religions.
Myanmar has been in a state of civil war since independence, and this is further exacerbated by regional powers fueling the flames by supporting militias along with the army.
I would say corporate inertia, but also a mix of corporate immaturity and single-minded incentives.
Meta’s corporate culture is single mindedly focused on engagement. They don’t care about side effects or other ambitions. They don’t have a corporate structure that encourages anything else that may be beneficial to their consumer base.
It means that anything that detracts from that engagement is counter to any personal goals an employee might have. And if you can’t personally relate to the problem, you’re far less likely to put your career on the line to push for something that runs contrary to the singular goal of engagement.
> Meta’s corporate culture is single mindedly focused on engagement. They don’t care about side effects or other ambitions. They don’t have a corporate structure that encourages anything else that may be beneficial to their consumer base.
I don't really see how that is unique to Meta, though. The mandate of any corporation is essentially sociopathic. Unless the leaders really go out of their way to make the company behave in ways that are good for people (customers or not) and bad for the company, the company is just going to be as selfish as it can be.
What is unique is combining that with Meta's scale and ubiquity in social media, especially in developing countries where they subverted net neutrality to become the de facto source of information.
Meta could incentivize other metrics like consumer mental health, accessibility or subject education as well. Those are harder to track, so I’m not saying it’s easy, but other companies have done it.
Meta just does not care or want to care. They’ve been told point blank for over a decade now that they need to do something and haven’t unless regulatory bodies start threatening them.
Facebook isn’t a dumb pipe. It is an algorithm which preferentially selects content which will increase engagement, in this case apparently sparking a genocide. Your analogy falls flat.
And the algorithm that maximizes engagement is the algorithm that gives the user exactly the content he most wants to engage with. In other words, with iteration, it will converge on an algorithm whose output is entirely driven by the user, not the developer. In this way it is basically a dumb pipe just like a phone that lets the user say exactly what he wants when he wants.
You are assuming users visit Facebook to engage with content, and that Facebook's definition of engagement matches the users' understanding of it. (One example: merely viewing my brother's photo is not considered engagement, but sharing or liking it is.)
I think many people would agree that a dumb pipe in this context would mean FB not ranking and filtering content. I am not saying all users want unfiltered content. I am just arguing against calling FB a dumb pipe. If anything, it is a smart pipe, or a pipe bomb.
> the algorithm that maximizes engagement is the algorithm that gives the user exactly the content he most wants to engage with.
Nothing about a system that opaquely ranks and filters content resembles a dumb pipe. Moreover, maximizing the attention content captures from viewers IS NOT the same as maximizing the value users derive from that content.
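That distinction can be made concrete with a toy sketch (all posts and scores below are invented for illustration; this is not Facebook's actual ranking system): a chronological feed really is a dumb pipe, while an engagement-ranked feed makes an opaque editorial choice on every refresh.

```python
# Toy sketch, NOT Facebook's actual system: contrast a chronological
# "dumb pipe" feed with a feed ranked by a predicted-engagement score.
# All posts and scores below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    timestamp: int               # larger = newer
    predicted_engagement: float  # model's guess at clicks/shares/comments

posts = [
    Post("Photos from my cousin's wedding", timestamp=3, predicted_engagement=0.02),
    Post("Inflammatory rumor about a minority group", timestamp=1, predicted_engagement=0.40),
    Post("Local volleyball club results", timestamp=2, predicted_engagement=0.01),
]

def chronological_feed(items):
    # Dumb pipe: newest first, no editorial judgment by the platform.
    return sorted(items, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(items):
    # Opaque ranker: whatever the model predicts people will react to
    # rises to the top, regardless of its social cost.
    return sorted(items, key=lambda p: p.predicted_engagement, reverse=True)

print(chronological_feed(posts)[0].text)  # the wedding photos (newest)
print(engagement_feed(posts)[0].text)     # the inflammatory rumor
```

The user supplies the reactions, but the platform chooses which objective those reactions feed: attention captured is not the same as value delivered.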
This is an important overlooked point. Several commenters are saying "social media just reflects human behavior." And we can get into lots of philosophical arguments about free speech, recommendations algorithms, UX patterns, etc. But a key piece here is that there was not a free market for social media or news apps; Facebook was subsidized by the major telecom company so it had a privileged position. A platform with privileged monopoly position should therefore be expected to bear greater responsibility. You can't have your cake and eat it too.
The whole situation is awful. I do wonder what the internal discussions at Facebook at the time were like. How did all the human-rights groups find themselves talking to the department of fobbing people off instead of one that would do something? Maybe the department of fobbing people off was trying to do something but had insufficient political power within Facebook. Though I guess those discussions/emails would be very unlikely to come out except in a court case, and I'm not sure what case that would be.
If you look at the things Facebook works on, it does seem that they care a lot about keeping various kinds of bad behaviour off their platform. I don't know if the scope was materially narrower in the past, or if this was a problem of being too disconnected from the users in Myanmar (i.e. language barriers, or applying policies designed for the US).
I have a hard time thinking about the counterfactuals. Could this have happened with Twitter, for example? It feels to me like the biggest advantage Facebook had was being subsidised, and if it hadn't had that advantage (and maybe if prices had come down a bit over a few years) the same thing could have happened over Twitter. It's not clear to me how much the government wanted the genocide to happen. Presumably the big advantage to them of it happening over Facebook is some kind of deniability; if they didn't care about that and were sufficiently competent, they could have put the hate preachers on the radio instead of letting them find their audience on the internet.
FB was much more destructive simply because it was part of FB's Internet.org initiative [1] which drove FB adoption.
> It’s not clear to me how much the government wanted the genocide to happen.
The Arakan National Party, the primary party in Rakhine/Arakan, is anti-Rohingya and its leadership participated in the 2012 riots along with the 969 movement. They entered a coalition with the National League for Democracy (Aung Sang Syu Ki's party) and even got an Ethnic Affairs Minister [2] - that they held on to during the Genocide.
That said, the issues in Rakhine were further exacerbated by the India-China rivalry. Both countries turned a blind eye to the Tatmadaw and armed ethnic militias within Rakhine and across Myanmar in general: Myanmar is a buffer between the two countries, ethnic issues in Myanmar blow back across Southwest China (e.g. the Kokang Chinese *) and Northeast India (e.g. the Manipur ethnic violence **), and both India and China have competing defense and infrastructure projects within Myanmar [3].
A lot of this could have been avoided if Jinnah had annexed the Rohingya-majority regions of Rakhine into Pakistan, as the Rohingya asked in 1946 [4]. It's sadly another forgotten chapter of the Partition and the decolonization of the British Raj.
* If you live in San Francisco, most ethnic Chinese in Chinatown are now Kokang. They own most of those trendy Burmese restaurants like Burma Love and Mandalay.
** The Myanmar subgroup of the Kukis (Chin/Zo) have a significant presence in SF and Daly City as well.
The article claims that "Arturo Bejar" was "head of engineering at Facebook", which is simply false. He appears to have been a Director, which is a manager title overseeing (typically) less than 100 people. That isn't remotely close to "head of engineering".
I point this out because I think it calls into question some of the accuracy of how clearly the problem was communicated to relevant people at Facebook.
It isn't enough for someone to tell random engineers or Communications VPs about a complex social problem. Those people are not trained in identifying or responding to a genocide, nor do they have the organizational power or professional experience to initiate a serious response.
It is saddening but not surprising to me that there was a communication breakdown.
If people can understand this article, why can't they understand China banning Western social media? In this case Facebook's intention was to drive engagement; who's to say it won't be used by adversaries to selectively drive narratives?
How about we blame genocides on, I don’t know, the people committing the genocide? Meta didn’t burn down villages or shoot anyone. People did that. Are we supposed to believe those people have no agency and aren’t responsible for their actions? Like if you see a Facebook post you just have to mindlessly do whatever it says?
>Like if you see a Facebook post you just have to mindlessly do whatever it says?
A big part of this story is that it was many Burmese people's literal first time on the internet. They had built no mental defenses. They literally did not realize that things said on the internet could be fake.
Facebook dropped this tool out of the sky then neglected it. It'd be like if a weapons manufacturer dropped a massive load of guns with the rebel groups. Sure they didn't pull the trigger, but they certainly played an enabling role.
Even in the US the 2016 presidential election was a major wake up call for how social media can be used...
> How about we blame genocides on, I don’t know, the people committing the genocide
We are, yet FB's failure to moderate, as well as their failure to provide a legitimate way to escalate to HQ during an active genocide, shows a severe form of negligence.
There are unmoderated forums all over the Internet. You can download phpbb and create one in a matter of minutes. How unbelievably arrogant to assume that you can dictate the fate of nations based on moderating an Internet forum.
Nobody decides to go kill their neighbor because they read a post online. They were already going to do that or not long before they read anything.
You have completely missed a core part of the article here: that Facebook subsidized free access to their platform, which provided the only really accessible internet given the infrastructure and cost of the alternatives. Labeling someone's opinion as "arrogance" while glossing over this fact is incredibly poor form to boot.
Facebook actively leaned into engaging with the country here and went hands off at the slightest sign of trouble, and did not take it seriously enough when people legitimately tried to escalate to them.
Facebook gave people a place to talk to each other. What they say and what they do as a result is their own responsibility. Most people just share pictures of cats.
> Nobody decides to go kill their neighbor because they read a post online.
Yes, they absolutely do. WhatsApp had to restrict forwarding in India because entire villages were being forwarded rhetoric and rising up en masse to massacre their neighbors.
You need to understand that there is a not insignificant proportion of the population that will literally do whatever the television (or a convincing written socmed post) tells them.
Also, not all cultures and societies are at the same level of education about these things. Letting something like FB loose in a world where people don't understand algorithms and targeted information campaigns should have had obvious consequences to those building these platforms.
People have free agency but are also susceptible to outside influence and propaganda. Nobody is a truly rational agent -- otherwise they wouldn't be committing genocide.
I live in Myanmar. I started my software company in 2010, as soon as we gained democracy. Here is my take as a citizen and a founder who has gone through the country's various stages.
The article is more about bashing Facebook and its algorithms, but the ground situation was not really due to Facebook at all. The main problem is that political players use racism, nationalism, brainwashing, and mass media as tools to induce instability, so they can drag the country back to non-democratic rule and take back our freedom.
The violence wasn't really caused by hate speech on Facebook. Sure, it increased racism by a lot, but it also let people find out the truth fast. The actual genocide was carried out by the junta military and junta-assigned thugs infiltrated into the riots: a group known as Ma Ba Tha, formerly known as Swan Arr Shin (meaning "Super Heroes"), which ironically used to kill peacefully protesting monks back in 2007 (the Saffron Revolution). They are now known as Pyu Saw Htee and are killing innocent people regardless of race or ethnicity, on suspicion of supporting the Spring Revolution.
They are ex-junta members, prisoners, and low-ranking members of the rival hardline military party led by extremist nationalists. They are the ones who razed villages and displaced millions of Rohingya.
The start of the Rohingya crisis:
- At first, when the news broke that an innocent girl had been raped and killed, uncensored images of the underage girl's mutilated body, along with the captured perpetrators (identified as Rohingya), were posted online. That was the first time many people had seen violence on Facebook; it spread like wildfire, and there was a lot of hate online.
- Then the Ma Ba Tha movement (the group I mentioned above) got started. They went on the ground, spreading the images from that post, organizing and recruiting other nationalists. Then they started using monks, most of them military spies robed and planted in the monasteries after the Saffron Revolution, as a tool for political play.
- Soon after they started using monks, many real Buddhists shunned them and stopped following, because using Buddhism as a tool for violence is totally against the Buddha's ways, and it became apparent that this was a political play.
- But the riots were organized by the military. In the Meiktila case, police guarded the people who burned the whole town to ashes, people later identified as Ma Ba Tha.
So here are the takeaways:
- The crisis was entirely fueled by the junta and organized by the junta's Swan Arr Shin group (Ma Ba Tha, and now Pyu Saw Htee), an organized criminal group whose members have been around since 1988 and who have been used again and again: in 1988, in the Depayin massacre, in the Saffron Revolution.
- If there were no Facebook, the junta would have used state-owned media and journals to the same effect. But because of Facebook, we have a chance to speak out, to find out the truth, to report massacres.
- Free speech is very important for us, for a country that had lost its freedom for 70+ years.
- Free speech and Facebook had very little to do with the Rohingya crisis, since it was mostly carried out by organized criminals on the ground.
- Facebook's algorithms boost controversial topics; that is undeniable.
- The Rohingya genocide was military-sponsored terrorism, fueled by the military junta, hardliners, and the organized criminal group the junta founded.
- When the democratic government brought the crisis under control, the military started to lose power, so they staged a coup in 2021.
Now the situation is a lot worse.
The junta who organized the Rohingya crisis staged a coup and killed over 5,000 innocent civilians of all races and genders, and that has led people to arm up and fight against the junta.
I can't say much more in detail here, for safety reasons. Please contact me if you want to know more.
My Telegram is @Pyr0X. I can already be tracked easily, so I am not going to post an email or social media account that can be easily tracked.
I already feel existential dread just saying this.
Unfortunately, this sort of glibness misses the fact that Meta’s algorithms push topic engagement and in doing so, amplify high-emotion content.
A passive user of the general internet is not as likely to encounter the same concentration of singular topics as they would on Facebook. Your comment would largely apply only to active seekers of said content.
I don't think it's that simple. "The algorithm" historically favoured engaging posts, which basically means "what people care about".
This same algorithm pushes all the good things that someone like Wirathu was doing (for the Buddhist population) as well as the horrific things (promoting genocide).
So let's talk about the obvious -- disabling "the algorithm". With no algorithm the problem might be even worse! If someone like Wirathu runs 20 accounts which each post 10+ times per day, his genocidal speech might be even more over-represented to the general populace. Timeline order rewards spammers, which rewards people with resources to simply scale output to more of the same.
Well ok, why not improve the algorithm then? It's easy to say "just promote the good stuff", but it's very difficult to do in practice at scale. Let's examine why.
Since it's all happening in Burmese, any ML models for sentiment or radicalization or hate speech will hugely lag behind English in effectiveness. A poorly trained auto-moderator is often worse than none at all, because it acts at random.
It is also a cat-and-mouse game -- simple models are stymied by techniques that are basic (for humans), like misspelling, swapping letters for symbols, inventing new slang, dogwhistles, appropriating benign words, etc.
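To make the cat-and-mouse point concrete, here is a minimal sketch of why naive keyword filtering is so easy to evade. Everything here is hypothetical -- "badword" stands in for a slur, and none of this reflects Facebook's actual moderation systems:

```python
# Hypothetical sketch: naive keyword filters vs. trivial evasion tricks.
# "badword" is a stand-in for a banned term; this is not Facebook's system.
import unicodedata

BANNED = {"badword"}

def naive_filter(text: str) -> bool:
    """Flag text only if a banned token appears verbatim."""
    return any(token in BANNED for token in text.lower().split())

def normalized_filter(text: str) -> bool:
    """Slightly hardened: strip accents and undo common symbol swaps first."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(c for c in text if not unicodedata.combining(c))
    swaps = str.maketrans({"0": "o", "1": "l", "3": "e", "@": "a", "$": "s"})
    return naive_filter(text.translate(swaps))

print(naive_filter("badword"))        # True: exact match is caught
print(naive_filter("b@dw0rd"))        # False: a symbol swap slips through
print(normalized_filter("b@dw0rd"))   # True: undoing the swap catches it
print(normalized_filter("badw ord"))  # False: one inserted space defeats both
```

Each hardening step only catches the evasions someone already thought of; new slang, dogwhistles, and appropriated benign words require retraining or new rules every time, which is the cat-and-mouse dynamic.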
Ok, so why not scale up moderation? Well, they need to find sufficient Burmese speakers who can be in person in their offices. (In person because of data protection and resources like mental health support.)
It's not a popular language, so finding any candidates is already difficult, let alone people who are willing to wade through the horrors of humanity day after day for a paycheck.
Why not stop providing services if you can't scale the safety mechanisms that go with them? If it's hard to scale moderation in Burma, don't do deals with Burmese mobile operators to put your service on everyone's phone for free.
I'm not sure, but let me try to provide some reasons.
Firstly, Facebook (and the Internet in general) is used for hugely positive things in most people's lives. Stopping service would have a negative impact on most people. That might even include the Rohingya, who could use social media to organize for their own safety.
Secondly, it was not known beforehand that there was going to be a genocide, so the services were there at the same baseline as other countries and languages.
Once it's there and the people love it, I can imagine that taking it away would only push everyone to another platform. That might solve the problem for Facebook, but it wouldn't solve the problem for Myanmar. The article even mentions an alternate popular news site in Myanmar that also incites violence.
As for why provide services for free, are you suggesting that the people of Myanmar would be better off without free access to a subset of the Internet? I think their actions, and the actions of the developing world in general, speak differently.
If I remember correctly, Internet.org doesn't just provide Facebook access, they provide a handful of sites that include Facebook but also include Wikipedia and others.
>Secondly, it was not known beforehand that there was going to be a genocide, so the services were there at the same baseline as other countries and languages.
Except that Facebook employees were told directly that the country was on a path to genocide and Facebook was helping to fuel it. They had the warnings.
Between "do nothing" and "withdraw the service" there is a third option where Facebook does a good job moderating the platform so good effects are promoted and negative ones are limited. As the monopolistic operator subsidizing internet access for a majority of the population, they have a greater responsibility to avoid doing harm, and yet failed despite repeated warnings.
If they can’t responsibly provide a facility then they shouldn’t provide it.
I don’t even mean just accidental issues. In this case they were warned over many years and were negligent.
They do not care to provide a responsible service, because a responsible service reduces engagement.
Their older algorithm scaled much better in the absence of moderation. It used to favor recent shares among the friends you interacted with most.
While that doesn’t prevent group think, it does reduce amplification of emotionally charged posts. Today, a post that has a lot of engagement, even across a wider social network, is higher on your feed. That means that if I have some crazy acquaintance that I never really talk to, the second they start posting anything high engagement, it’ll come to me. Previously it would understand that I don’t care about them.
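The difference between the two ranking regimes described above can be sketched as a toy model. The scoring functions, the affinity numbers, and the like counts are all invented for illustration; this is not Facebook's actual algorithm:

```python
# Toy feed-ranking sketch (hypothetical scoring, not Facebook's real code).
# Affinity ranking keeps a viral post from a stranger low in your feed;
# engagement ranking jumps it to the top.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int  # engagement across the whole network

# How often *you* interact with each author (your personal affinity).
my_affinity = {"close_friend": 50, "crazy_acquaintance": 0}

posts = [
    Post("close_friend", likes=12),
    Post("crazy_acquaintance", likes=9000),  # viral, emotionally charged
]

def rank_by_affinity(posts):
    """Older style: prioritize people you actively engage with."""
    return sorted(posts, key=lambda p: my_affinity.get(p.author, 0), reverse=True)

def rank_by_engagement(posts):
    """Newer style: prioritize whatever the whole network engages with."""
    return sorted(posts, key=lambda p: p.likes, reverse=True)

print([p.author for p in rank_by_affinity(posts)])
# ['close_friend', 'crazy_acquaintance']
print([p.author for p in rank_by_engagement(posts)])
# ['crazy_acquaintance', 'close_friend']
```

Under the first ranking, the acquaintance's viral post falls off your feed because your affinity with them is zero; under the second, global engagement alone decides, which is the amplification effect being criticized.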
Meta chased the tail of engagement and doesn’t care what the second-order effects are.
I don't think it's correct to assert that a different algorithm would be less harmful, and it could even be the opposite.
In the article, it says that Wirathu used multiple personal accounts to "friend" at least thousands of people. I don't know to what extent his reach would be diminished by favoring "friends" -- and it might even increase his reach.
That’s why I specified their older algorithm where it prioritized people you actively engaged with.
Even if Wirathu sent me a friend request that was accepted, unless I actively engaged with him, his content would fall off my feed.
So it wouldn’t eliminate his reach but it would have reduced amplification. Today, because he’s popular, Facebook would prioritize him on my feed.
In essence, Facebook went from saying I should care about my close friends at school to caring about the most popular cliques. (As an analogy, I’m not in school of course)
I do agree with you that it’s not as trivial as any of us are writing here, but it’s also clear, IMHO, that Facebook’s current algorithms have harmful incentives in the face of inadequate moderation.
Why is your assumption that the alternative is no curation? Why can’t it be that their curation is bad or uses the wrong metrics?
The problem with Meta’s algorithms is that they over-index on single topics with high emotional value. That means you end up in unhealthy echo chambers. And unlike other social media (Twitter, TikTok, etc.), this ends up localizing to your immediate social circle.
The end result is that Meta has time and time again been shown to amplify the most charged positions.
Their curation algorithms are bad. They don’t account for second order effects, only engagement.
Sure, you can replace Facebook with whatever you want and come up with some scenario that would lead to a genocide. I bet this genocide would have happened at some point regardless of Facebook's involvement; there have been plenty before it. But the fact of the matter is that in this case it _was_ Facebook.
There was a pre-existing prejudice towards Muslims/Rohingya in Burma/Myanmar among the dominant Buddhist population. As technology developed, specifically mobile phones, it became easier for anti-Muslim voices in Myanmar to spread their message. Specific violent acts against Buddhists, such as rape and murder, were attributed, without proper evidence, to the Muslim minority, further stoking prejudice and hate toward the Muslim population. Facebook was warned on many occasions that users attributing this violence to Muslims was leading to calls for violence against the Muslim minority, with multiple meetings/discussions by Myanmar and Western sources at Facebook HQ itself, only for Facebook to take no action and instead focus on increasing engagement in Myanmar regardless of the situation on the ground. A slippery slope ensued, contributing to the death/genocide of potentially hundreds of thousands of Rohingya/Muslims in Myanmar. More evidence is in part 2, which it seems hasn't been posted yet.
- There was a pre-existing prejudice towards Muslims/Rohingya in Burma/Myanmar among the dominant Buddhist population.
No. I am 40 years old, and before the Rohingya crisis we had no hate towards Muslims, though I have to admit we didn't even know the Rohingya ethnic group existed at all. The general populace is very peaceful towards Muslims.
- it became easier for anti-Muslim voices in Myanmar to spread their message.
The anti-Muslim voices are not from civilians; they are from the junta and the hardliners who lost power in the democratic transition, and the hardline party called the USDP. So it is political, as is evident now from their failure and the coup that followed.
- Facebook was warned on many occasions that users attributing the violence against Buddhists to Muslims was leading to calls for violence against the Muslim minority, with multiple meetings/discussions by Myanmar and Western sources at Facebook HQ itself, only for Facebook to not take any action but instead focus on increasing engagement.
The deaths and killings were carried out by the military junta, with and without uniforms (to look like civilians), and by their sponsored group called MaBaTha.
My apologies. I was just trying to summarize the OP’s contention about what transpired with Facebook there. I have no reason to doubt that what you say is mostly true, but do you think that some of the general Buddhist population, beyond the junta/leadership/dictatorship, had anti-Muslim/Rohingya sentiment before Facebook was widely available, so that it was those voices that were amplified when Facebook’s growth accelerated in the country? The primary person the article cites was a Buddhist monk, from what I understand.
It has been long-standing government policy to deny the existence of Rohingya as an ethnic group, claiming that they're immigrants from Bangladesh, in order to deny them citizenship; this being a manifestation of pre-existing prejudice towards Muslims/Rohingya by the dominant political groups, both military junta and democratically elected civilians. (Though I think attributing this to Buddhists in general is just as much a mistake as attributing a crime to Muslims in general.)
So it's unsurprising if you hadn't heard anything negative about Rohingya, but maybe you heard something negative about ကုလား or some other moniker meant to deny them recognition.
Aung San Suu Kyi herself famously refused to even utter the word "Rohingya" while defending her dear generals against the genocide accusations, so I don't think you can put the blame only on the junta/USDP and absolve the NLD of all responsibility.
Recently, the NUG has started making noise about finally reforming the citizenship law, so maybe the situation will improve in the future, but I could also see them reneging on their promises if they win the civil war and are no longer as reliant on international support.
> Aung San Suu Kyi herself famously refused to even utter the word "Rohingya" while defending her dear generals against the genocide accusations, so I don't think you can put the blame only on the junta/USDP and absolve the NLD of all responsibility.
Well, the whole of internal affairs was under the junta even though the NLD was elected. Guns were pointed at her, and an example was made of U Ko Ni. She didn't have much choice.
I might be the most naive person on Earth for thinking this, but I can’t imagine any circumstance where this was anything but neglect on FB’s part. Or maybe neglect isn’t the right word. Maybe FB, like so many tech companies, optimized for the wrong metrics. So it’s more proactive than neglect? If your audience is a vampire and you want to get engagement you show a lot of bulging veins. I don’t mean to suggest that the broader Buddhist population (ie most of Myanmar) is bloodthirsty, but if a tiny (tiny!) slice of it had hate on the mind it’s not hard to understand the argument that FB amplified the bloodlust (wow, look at the spike in engagement on the internal dashboard!). Chart goes up and to the right and some PM gets paid. And it’s super bad when it’s the ruling government (the Junta in this case) that’s the vampire, as a sibling comment contends. All so brutal.
Free speech is important. People don't get mad at air because it allows people to transmit their thoughts. People don't get mad at pencils because they don't censor what people write. Internet companies transmitting people's ideas are no different, and wanting to censor people is against the principles of a free society.
Facebook is not free speech, nor does it merely "transmit" speech. It is a machine for algorithmically amplifying speech.
Facebook is not a pencil. Notice the article does not criticise mobile phone vendors for including keyboards in their phones.
Who do you think was pushing to subsidise data usage for Facebook? And to what end? Air doesn't do that.
I completely agree that free speech is important. I'm sure the people of Myanmar agree with that too after decades of regime rule. But it's irrelevant to this discussion.
I am on the ground, so let me say this.
I am Burmese, from Myanmar. I was born there.
The genocide was carried out by an organized criminal group sponsored by the military junta. I totally disagree that free speech caused the crisis. The junta is the one responsible for the mass killings and genocide. Free speech is helping us fight the junta.
I am already risking a lot just by saying this.
Free speech is what they fear most.
Even with the algorithm, Facebook is a platform that delivers our words. I wish there were a non-algorithmic, un-manipulable, non-moderated platform that could reach out to the world.
I've seen all your replies in this thread and appreciate you sharing your viewpoints. I understand your points and there is one thing that needs to be said in response:
All of the stories and citations in this article also come from Burmese people and international monitors on the ground in the country.
So parallel to your claims that Facebook is irrelevant, there are other people working in this space with equally valid claims that Facebook does bear some responsibility. People who saw the situation developing and tried to warn Facebook, only to have their pleas fall on deaf ears.
My heart goes out to you. I can't imagine what you and your country are going through. I pray that your words will win out over those of hate and ignorance, over force and violence.
While air is a passive requirement, I would suggest that writing, the printing press, speakers, radio, television, and the web are all more similar. They are intentionally used to increase engagement of content intentionally created.
I respectfully agree with your belief that freedom of speech is important, but I disagree with your conclusion that "Internet companies transmitting people's ideas" is no different. The key idea here is that freedom of speech is not the same as freedom of reach: Facebook might give everyone the same text box to write posts or upload content, but they need to answer for HOW and WHY they recommend certain content over other types of content.
The comparison to pencils or air isn't fair because they aren't being boosted by a recommender system powering the application layer of the content-delivery platform for 90% of a country's people. If I write a message on paper containing a call to action that incites violence (e.g. "We should round up all people with six toes on each foot and shoot them") and hand it to one person, the message has reached one person. I would say the digital equivalent of "pencil and paper" or "talking in a public square" is more like writing a message in an HTML file and putting that HTML on a web server you own. I think it would be much more difficult for your average user in Myanmar to reach millions, even if they were given a web server and the means to write HTML.
Comparing it to "getting mad at air because it allows people to transmit their thoughts" equates Facebook with the Internet, when one is an organization of people who provide technology and the other is technology. There are data scientists and engineers who created the models that learned to optimize for engagement, there are product managers who helped set the KPIs for those engineers (i.e. encouraged them to solve for engagement), and there are execs who benefit from all of the above. While no single person is solely to blame for what happened in Myanmar, it's wrong to claim that we can't at least reason about and discuss the products created, maintained, and owned by a company (made of people).
My opinion on this is heavily informed by the training I received when I wanted to DJ for my university's radio station. Even in the US, where freedom of speech is protected in the Amendments to the Constitution, there were very specific things you could and could not say on the air (while being classified as a non-profit radio station). You could receive massive fines for profanity, obscenity, inciting violence, or hate speech, with a large general emphasis on calls to action. Generally, I think this training was good, and it encouraged everyone who learned to DJ to be more intentional with their language.
I really think that if Facebook is going to enable users to reach millions, there absolutely should be some caveats to that. People should be able to say whatever they want if they are capable of providing the means by which their own speech is amplified, and even then there are clear cases where the megaphone needs to be taken away.
Facebook is giving megaphones to people, and some people use them to dispense hate speech. While I might have been willing to give Facebook a pass in their early years, it is clear between Cambridge Analytica and this that governments around the world need to do better when it comes to working with Facebook to identify content on their platform that incites violence, or calls to action that present a clear and present danger to other people if acted upon. Facebook isn't some amorphous inanimate entity that we can't negotiate with, regulate, or punish. If someone uploads a hateful message that has a 0.00001% chance of encouraging a hate crime for each of, say, 50 million viewers, Burma can expect around 5 hate crimes as a result of that one message. With great power comes great responsibility.
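The megaphone arithmetic above is a simple expected-value calculation. The 0.00001% per-viewer probability is the commenter's illustrative number; the ~50 million viewers is my assumption, roughly Facebook's reach in Myanmar, not a figure from the article:

```python
# Expected-value arithmetic behind the "around 5 hate crimes" figure.
# Both numbers are illustrative assumptions, not measured data.
per_viewer_probability = 0.00001 / 100  # 0.00001% expressed as a fraction (1e-7)
viewers = 50_000_000                    # assumed reach of one viral message

expected_incidents = per_viewer_probability * viewers
print(f"{expected_incidents:.1f}")  # prints "5.0"
```

The point is that scale converts a vanishingly small per-viewer risk into a near-certainty of some harm, which is why reach carries responsibility that a pencil does not.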
>Facebook might give everyone the same text box to write posts or upload content, but they need to answer for HOW and WHY they recommend certain content over other types of content.
It seems that Facebook's "crime" is not censoring people enough, not that their recommendation algorithm was configured to boost an opinion the article's author did not agree with. Since the problem is with the content, I see it as relating to free speech.
>Even in the US, where freedom of speech is protected in the Amendments to the Constitution
While the constitution makes it hard to write laws that limit speech, the US does not have full freedom of speech.
FB could set the baseline visibility of each post and comment to 0, and selectively boost specific ones to be seen by people other than the poster. Not boosting is not censorship, after all.
Liberty is not an absolute; positive liberty is one thing, but negative liberty can't be accepted. By positive liberty we mean the ability to act within the boundaries of the liberty possessed by the rest of society: the ability of libertarians to be idiots affects only themselves, right up until their idiocy becomes negative liberty. By negative liberty we mean the set of liberties that encroach on the liberties of the rest of society. Saying that we should put all libertarians in front of a firing squad is negative liberty, because it spreads hate and puts someone else in society in danger, and so my liberty directly limits the liberty of someone else.
So the safest thing we can do is let some violent "ideas"/"ideals" spread, and only act once someone acts on them? We've seen how that went with the Nazis and Africans, gays, and Jews (alphabetically ordered); it ended up so well.
I'd say freedom of speech can't cover concepts that put third parties in danger, or that would limit the liberties of others, because bad ideas have always turned into suffering.
...and I guess those in the minority who would instead like to live in safety and health can just go somewhere else, right? Because how free is a society that hasn't moved past A Clockwork Orange?
I don't think we can get anywhere with our conversation, partly because you've ignored the part about "putting someone else in danger".