
I don't want my browser, its vendor, or even my search engine to decide what is or isn't "disinformation", or "white supremacy", or whatever today's distasteful views are. I'm an adult human, and I'll do that for myself. If someone wants to make a Fisher Price browser that only shows approved things, go for it, but I'll start looking elsewhere.



The article doesn't suggest at all that Firefox or Mozilla are looking to limit what you're shown by your browser.

3 of the 4 points involve making the reasons you're being shown something more transparent.

The remaining point suggests that other platforms make changes to their algorithmic content feeds. These platforms are already deciding what you see in those feeds and what's "approved".


> I don't want my browser, its vendor, or even my search engine to decide what is or isn't "disinformation", or "white supremacy"

Good! Mozilla isn't saying that browsers should do that at all. I don't know how you got that from the article.

> I'm an adult human, and I'll do that for myself.

The entire problem is that you will never be able to do that, in the context of social media. Simply by using YouTube, or Facebook, or whatever other social media feed, you've already trusted that website to decide what you see. The alternative is sifting through hundreds of thousands of social media posts yourself, which no human can realistically do.

So (unless you have a better solution) the billions of people who use Facebook are at the mercy of its ranking system. Shouldn't we thus demand that Facebook consider the societal consequences when they decide how to rank things? Or should Facebook do whatever they want when they decide what billions of people get to see?

That seems to be the thrust of the "disinformation" line you're referencing:

> Turn on by default the tools to amplify factual voices over disinformation.



