
Seems like you more than anyone would see that the types of problems FB is trying to solve (e.g. freedom of speech vs. user safety / harm reduction) are not super simple, no? I wouldn't call Reddit evil either, even though many powermods are amazing contributors doing free labor and curating great communities while simultaneously abusing their power every day: silencing people they disagree with, shaping narratives in human culture, automating blanket unappealable bans on users for participating in unrelated subreddits (even users who were participating there in order to combat its views), making snap judgments on content moderation that can ruin someone's day when they make a bad call on a ban or delete, or unilaterally appointing themselves mouthpieces for their broader communities via subreddit blackouts or preachy pinned posts.

It's unfortunate, but when you build a product so close to the ground of human communication and human nature, you're never going to get everything right: you're no longer solving technology problems alone but basically trying to combat human moral failing itself. We don't ask that of the telephone company.

^ That being said, the above line of thinking only excuses some of their failures. Others we can blame on greed or recklessness, or on ignoring the social costs of something like ML recommenders optimizing for engagement. I'm not sure those things deserve to be called evil, but personally I'd still hold back. Misguided, overcome by greed, or reckless, perhaps.



Point of order: the issue with Facebook is the various engagement algorithms that it has been, and still is, perfecting. This is unlike anything humans have ever seen before. We are no longer anywhere near "the ground".


Yeah, there is a big difference between Reddit and Facebook in the above comparison. All the examples of issues with Reddit can more or less be attributed to specific people, and fall more in line with "bad" human behavior. Facebook's algorithm is something entirely different in its design: its primary objective is to manipulate the behavior of the user on the other side, and what it chooses to show or not show doesn't follow any human line of reasoning, outside of some loose built-in "safeguards" and the unenviable content moderators meant to serve as guardrails.



