
> If you meet anyone who disagrees with any of that, let me know

Facebook appears to have disagreed with that; they amplified calls for ethnic cleansing and did not respond to concerns about it, so they must have believed that asking them not to was too much. That's the point.




That point hasn't been made. That is why I'm putting some time into this comment chain - we've got the journalism thing happening where people did terrible things, and people used Facebook, and then the journalist is painting Facebook as guilty by association without saying much specific that provides a link. And then letting low-empathy readers join the dots without considering what the people involved were likely thinking.

Bad people use Facebook. We don't need evidence to know that. This article is strong evidence that very bad people use Facebook, but it isn't at all clear that Facebook should be considered morally involved based on what has been presented so far.

Maybe the killing blow is yet to come. But I'm pretty sure any objective standard that gets Facebook in trouble here will get them in just as much trouble for letting Victoria Nuland or US 4-star generals post publicly. There are a lot of brutes in public office.

Furthermore, getting involved in matters of war and peace is not a role Facebook will get praise for; it will do some really terrible things if it goes down that path. They should be biased towards inaction, even and especially if they care.


Yes, this may have happened anyway. Yes, Facebook is not fully responsible. But I disagree with you. The lines are clear.

Facebook de facto became the internet in a country of ~50 million people by subsidising their platform with free data access.

Their platform was developed in order to further their own goals - through maximising engagement and monetisation.

The second-order effect of that ambition was enabling people like Wirathu to reach hundreds of thousands of people with hate speech and calls for genocide.

Facebook were informed of this multiple times and allegedly did nothing about it. During this time they had one Burmese-speaking moderator.

Stating that they have no moral responsibility for the consequences of their actions is in my opinion horseshit. But it does align with certain aspects of the current American zeitgeist of entrepreneurship, free speech and platform "safe harbour" regulations.

This is not a view shared everywhere and should not be assumed when American tech companies scale out of the US. Thankfully this dogmatic approach is being regulated by the likes of the EU and other countries so these platforms are more aligned with their own moral frameworks.

Personally, I find Facebook absolutely morally responsible for parts of this, through the simple fact that they provided a platform for tens of millions of people - with severely lacking moderation - all in the chase of growth and profits.

This isn't exporting "freedom and democracy" to the world like the good old days. This is abhorrent profit maximisation with no regard for the consequences of their actions, hidden behind a thin veneer of moral rationalisation.


I upvoted several posts of yours in this thread but man, are you implying FB was not actively promoting one side? I don't even believe it was "the algorithm".

Stuff like this does not happen by accident, nor in a vacuum. The only reason we don't hear more about this is that important people don't want us to.

Don't play the naivety card in topics like this.

Edit: downvote all you want, you horrible apologists. FB is a weapon and you know it.



