
I think this depends on how you consider a "conspiracy" to be organized. If a conspiracy is a bunch of humans making explicit decisions about each of these incidents, it's laughable.

However, if the "conspiracy" is in fact a bunch of automated tools created by a bunch of humans who all have similar biases in the same ideological direction (biases which happen to be ascendant on elite college campuses and/or in the Bay Area), then this is a fairly reasonable belief.

When an incident like this happens, the outcry is great enough that the humans, who are themselves still human and capable of compassion, can reinstate accounts and override the automated system. But how many "failures" such as this occur where the affected parties simply have no recourse because they don't have the privilege of 60k simultaneous viewers watching?



Intent matters. It's not like the secret Bay Area college elite cabal all sat down one day at one of their secret meetings and decided to write

    if streamer.is_black() or streamer.is_woman():
        streamer.suspend()
into the algorithm. That's an irrational belief. But because of our racist past, present, and future, when mysterious ML models come into play, it's entirely possible for that to happen, and frequently. But unless you believe the above code exists somewhere and was written by a human, it's a bit much to call that a conspiracy. Which is probably why it's written in quotes.
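To make that concrete, here's a minimal, hypothetical sketch (every name and number is invented for illustration) of how a model trained on historically skewed moderation decisions can reproduce that skew without anyone ever writing an explicit check on race or gender:

    import random

    random.seed(0)

    def historical_label(channel):
        # Past human moderation: content with "proxy_feature" set (a stand-in
        # for anything correlated with a protected group) was flagged more often.
        base_rate = 0.05
        if channel["proxy_feature"]:
            base_rate += 0.20  # historical over-enforcement baked into the data
        return random.random() < base_rate

    # Simulated training data: no protected attribute is stored or used anywhere.
    channels = [{"proxy_feature": random.random() < 0.5} for _ in range(10000)]
    labels = [historical_label(c) for c in channels]

    # "Training" here is just estimating suspension rates per feature value,
    # which is roughly what a real model would approximate in a fancier way.
    def suspension_rate(value):
        outcomes = [l for c, l in zip(channels, labels) if c["proxy_feature"] == value]
        return sum(outcomes) / len(outcomes)

    print("suspension rate, proxy_feature=True: ", round(suspension_rate(True), 3))
    print("suspension rate, proxy_feature=False:", round(suspension_rate(False), 3))
    # The learned policy punishes one group far more often, yet no line of code
    # ever says streamer.is_black() or streamer.is_woman().

The point is just that the bias lives in the training data and in the proxy features, not in an intentional branch anyone wrote.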

Thus, the problem to raise isn't that parties with no recourse might be silenced. That's absolutely an issue, but clearly that argument alone hasn't gotten YouTube to sufficiently change its ways, because the conspiracy doesn't exist. The argument to make is that this degrades YouTube's content as a whole. There are a number of other Internet video hosting platforms, most recently TikTok, and it behooves YouTube to spend more time adjusting its algorithms, because content creators who make something hoping it goes viral, only to have it taken down, will simply move to another platform rather than say anything about it to more than a couple of real-life friends. TikTok's ascendancy (and Vine's before it) proves that YouTube's dominance is not assured. The right winds of change could result in YouTube being displaced, and YouTube's "algorithm" being wrong too frequently with takedowns is sure to play a role.


>However, if the "conspiracy" is in fact a bunch of automated tools created by a bunch of humans who all have similar biases in the same ideological direction (biases which happen to be ascendant on elite college campuses and/or in the Bay Area), then this is a fairly reasonable belief.

The conspiracy theorists just don't know about bad automated decisions made in the other direction, because those are suppressed in their echo chamber.


Basically. The vocal right and vocal left are both convinced that big tech is biased against them and in favor of the other, when in reality, the platforms are mostly biased against controversy that would hurt them with advertisers.



