
> basic 'hot' algorithm that's not designed by squadrons of PhDs running machine learning on user behavior analysis isn't really the same thing, culpability-wise, imho

A "hot" algorithm can blast high-engagement libel in front of people just as well as a fancy algorithm can.

If the "hot" algorithm and the fancier algorithm are both content neutral, on what basis can you distinguish the two as a matter of law?

Does the hot algorithm become illegal if a PhD implements it? I'm at a loss about what distinction you're actually trying to draw.
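For concreteness, a content-neutral "hot" ranking can be as simple as HN's well-known gravity formula. A minimal sketch (the exponent and offsets here are illustrative, not Facebook's or HN's exact production values):

```python
def hot_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
    # Engagement divided by an age penalty. Content-neutral in the sense
    # at issue: it looks only at votes and time, never at what the post says.
    return (points - 1) / (age_hours + 2) ** gravity

def rank(posts: list[tuple[int, float]]) -> list[tuple[int, float]]:
    # posts: (points, age_hours) pairs, hottest first.
    return sorted(posts, key=lambda p: hot_score(*p), reverse=True)
```

Libel with 500 upvotes outranks truth with 50 under this scoring just as surely as under an ML-trained ranker, which is the point: nothing in the formula distinguishes the two.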

Your post, like many others in this thread, isn't articulating exactly what about FB's conduct you find objectionable.



You keep putting words in my mouth. I'm not talking about outlawing anything, just about assigning liability.

> Your post, like many others in this thread, isn't articulating exactly what about FB's conduct you find objectionable.

The objectionable conduct is pretty thoroughly covered in the article, but to review:

> "Facebook over and over again has shown it chooses profit over safety"


Illegal, liable, doesn't matter: you want to use the state to drive certain kinds of ranking off the internet.

Fine. What, precisely, is the line between algorithms acceptable to you and algorithms not?

What is the precise conduct that would make liability attach to a ranking algorithm? You can emote, but you can't describe what exactly it is you would turn into a law.


> You can emote, but you can't describe what exactly it is you would turn into a law.

I thought I made it pretty clear from the outset I was talking about removing CDA Sec 230 protections for sites using bespoke (i.e. proprietary) curation algorithms for their feeds.


So any filter algorithm would be okay if open source?



