
I'm actually glad to see the conclusion. Everybody wants to censor what they don't like. There are people who believe free speech is not the ultimate value, but rather that protecting people from 'harm' is. In this case, 'harm' being ideas they don't agree with.

The best stance is not to take a side, but to make sure both sides are civil in the expression of their beliefs.




> "Everybody wants to censor what they don't like."

This should be the top comment on all HN discussions involving content moderation, so people can read it before they respond and think about whether what they are advocating makes sense. People these days are so effortlessly able to make that huge leap from "I don't personally like this thing" to "This thing must be suppressed/illegal for everyone."


It's Kant, essentially:

> Act only according to that maxim whereby you can, at the same time, will that it should become a universal law.


You can say that, but from the beginning, governments have put limits on speech based on "harm" from speech. Slander, libel, and threats are all illegal on the contention that they cause harm. Taking no side is similar: staying neutral in a conflict as long as both sides are civil still requires one to decide what is "civil" in the first place. Are comments about race considered "threats" if one person is of the race targeted? How about coarse comments like telling someone to "kill themselves"? It isn't so simple, and even being neutral is a stance of a kind.

BTW, I don't think being neutral in a conflict like whether or not to censor is the wrong stance, but it IS a stance with its own value judgements.


The difference between censorship and all of those things is that they aren't prior restraints. Nobody filters what you say to make sure you aren't saying something defamatory. You say it first, then someone has to take you to court, where you have the opportunity to defend yourself.

And somebody has to care enough about shutting you up to go through the effort of taking you to court, and then they have to win.

Some kind of private sector prior restraint apparatus making automated censorship decisions with no due process... is not that.


I feel like you potentially have a point, but the moderation the slides talk about is also posterior to the speech. For example, tell someone to kill themselves, get reported, get shadowbanned. Deciding what is "wrong" or "uncivil" or "out of bounds" is the value judgement, just like deciding what counts as defamation or a threat.


> I feel like you potentially have a point, but the moderation the slides talk about is also posterior to the speech. For example, tell someone to kill themselves, get reported, get shadowbanned.

When the penalty is censorship of future speech, it's still a prior restraint. And shadowbanning is obviously not compatible with any kind of due process or even an opportunity to know that you've been accused.

> Deciding what is "wrong" or "uncivil" or "out of bounds" is the value judgement, just like deciding what counts as defamation or a threat.

Which is why the traditional categories have been narrowly drawn and limited to things that are as apolitical and non-partisan as possible.


I mean, how is making libel unlawful not censorship of future speech under your definition, or how is making threats unlawful not censorship of future threats? How is this different?


The difference is that the censorship happens before due process rather than imposing punishment only after the plaintiff proves their case.


>>You say it first, then someone has to take you to court, where you have the opportunity to defend yourself.

What if you say it and someone is seriously harmed or dies as a result?

The fact is that we have prior restraints for all kinds of things because we realize they are dangerous and have the potential to cause harm. I don’t see why speech should be different. Words can kill.


> What if you say it and someone is seriously harmed or dies as a result?

Frankly, that's the price of freedom, and it's the reason that fighting to maintain our freedoms is a never-ending battle.

Allowing the public to own and operate cars likely results in more deaths from crashes, but it also allows people to travel to and from arbitrary places on their own schedule.

Allowing people to own general purpose computing devices allows people to develop harmful software tools, to route around safeguards, to communicate clandestinely about illegal activities, and to design CAD files for 3D printed guns. Of course, it also brings us the ability to build software that works the way we want (or need) it to, to build successful businesses, to preserve ephemeral cultural history, to expose official corruption, and to organize political dissent through encrypted back-channels. You cannot have the benefits without the risks.

Allowing people to speak freely may result in someone feeling emotionally attacked, it may result in convincing someone that a false idea is true, or it may give people a flimsy excuse to engage in physical violence. But the benefits are innumerable. Unrestricted speech allows us to voice our concerns as citizens and participate in the political process without being silenced. It prevents the powerful from using concerns of "safety" to suppress dissenting opinions. It serves as a safety valve that helps potentially dangerous ideas rise into the sphere of public debate, where they can be taken apart (or even just "rounded off") before they result in something like genocide.

You can't stop speech, period. You can prevent it from happening publicly, on a temporary basis, but in the long term that kind of repression leads to violent revolutions.

That said, there must be some limits; I cannot run you over with my car or use my computer to hack the Pentagon, and I cannot threaten your life. But these limits must be explicit and narrowly constructed or they will be abused by the powerful. That means that there are necessarily "edge cases" that have to be decided by a fair and impartial process! This is unavoidable, because otherwise some people will continually step just over the line and then claim innocence. There is no way to "fix" this; it is always a messy process because of the enormous complexity involved in the world.

It's always tempting to trade freedom for safety, especially for victims who may not have suffered as much in a less free world. But we have to push back against that sentiment, or the future will not be worth living in.


> It serves as a safety valve that helps potentially dangerous ideas rise into the sphere of public debate, where they can be taken apart (or even just "rounded off") before they result in something like genocide.

How's that working out lately? I think we ought to look at the way social media has already become a vector for genocide, as in Myanmar. Here in the US we're currently allowing some kids in immigration detention who have been separated from their parents to be permanently adopted by American families, which (by the very international standards we helped to establish) is a massive human rights violation.


> How's that working out lately? I think we ought to look at the way social media has already become a vector for genocide, as in Myanmar.

One of the only ways the public (in the west) was able to find out about this was through the use of social tech. It just wasn't on anyone's radar before that. Once they are aware, people can bring pressure to bear to stop the genocide, which has been happening.

I also believe that the role of social media as a "cause" of genocide was overstated. That said, I do think that modern social media is flawed and won't last too much longer in its present form.

> Here in the US we're currently allowing some kids in immigration detention who have been separated from their parents to be permanently adopted by American families, which (by the very international standards we helped to establish) is a massive human rights violation.

How is this related to the discussion?


>How about coarse comments like telling someone to "kill themselves"?

I think a few conservatives were directly told to do just that due to the SCOTUS appointment furore --have there been significant calls to censor those opinions?

It's like everyone is for Censorship unless they are the targets of Censorship. And in this case they are not calling out their own lot very vehemently.


Stuff like this is a huge indicator that whatever everyone's intent going into censorship, it will invariably end with powerful, entrenched stakeholders using the censorship tools to dig in and start purging outsiders and members of opposing factions. In politics, business, culture, you name it, this will be the end result.


Yep. That was exactly my argument a month ago here:

https://news.ycombinator.com/item?id=17926316


People universally believe they are better than their historical counterparts, that they will succeed where history has failed, that this time it will be different. When really all they're doing is carving another notch on fascism's belt.


[flagged]


It's too bad we're not in a bar. I could talk about this all night, and I've been thinking about it a lot.


Cheers. Come up to Alaska and I'll buy you a drink and take ya fishing while we're at it.

The recent heavy push for greater ideological censorship will end up driving centrists to places that cater to extreme content. This is one aspect of the problem that has led to further increased polarization in modern American politics.

Facebook, YouTube, and Reddit are too censorious and do not fill the role of the modern free speech platform that must exist. The biggest problem is gaining traction while a large portion of early adopters are extremists banned from other platforms.

My only hope is that the censors will piss so many people off that real free speech platforms like https://d.tube/ can reach critical mass.


Who is this 'they' you're referring to, exactly?

And what opinions do you hold that would warrant this 'they' calling you fascist?


I've been called fascist for suggesting that:

Hiring decisions should not be based on skin color.

Hiring decisions should not be based on gender.

The irony is that these were all celebrated by the left previously. Now they are seen as fascist if you aren't discriminating against white men.


Supporting free speech over hate speech legislation for example.

Or the legions of people who claim Trump or anyone supporting him is a fascist. (I did prefer Rand Paul btw)

I also had people in /r/linux calling me a Russian bot and a racist because I am against the new code of conduct.


Trump very well may be fascist. He seems to want America to be more like North Korea anyway. And what was the context of your comments? Maybe copy/paste? Genuinely curious.


The left. "They" refers to the left, if you are that clueless.


I believe I saw a headline today or yesterday regarding someone being fired due to remarks along those lines. Perhaps not "kill yourself" per se. I think the remark was characterized as a call for assassination.


I think what you're referencing is probably that somebody was recently put on administrative leave pending investigation for tweeting "whose gonna take one for the team and kill Kavanaugh?"

https://www.dailymail.co.uk/news/article-6255241/Special-edu...


The Google head of design told Repubs to all go F* themselves to Hell (he didn't even qualify it like "oh, just the bad ones I don't like"; nope, ALL). I guess that's not so bad (for an EXEC at the very company considering Censorship). But anyway, calling for assassination is different from asking someone to off themselves.

[See it here]: https://yournewswire.com/senior-google-employee-wants-republ...


I don't know, you might want to talk with the woman who was arrested for laughing at Jeff Sessions a short while ago.


That person has a long list of prior arrests for disturbances [that shows a pattern]. The case was also dropped.

That said, though I don’t like Dick Cheney, when the Katrina protester told him to eff off, he took it on the chin like a proper guy.


Ah, so you're saying as long as we deem someone socially unfit that therefore it's OK for the government to censor them.


No, I don’t agree with her arrest, but I can see why they might.


Slander and libel are not illegal, as far as I am aware.


They are torts and, if provable, warrant civil lawsuits.


Perhaps the right word was "unlawful." Alas, IANAL.


I'm speaking for myself here, but I think the Russian bots as well as astroturfing bots need to be censored. And not censored by government regulation but by the terms of service. There are many "active measures" Russian bots active on Twitter. Twitter doesn't seem concerned at all, mostly because those Russian bots are padding their numbers; the same goes for FB and Reddit.


That sounds all fine and well but half the time I utter a conservative opinion I'm accused of being a "russian bot".


If Twitter/FB/Reddit had done a better job of banning spam bots originating from the GRU offices, then you wouldn't be getting called a Russian bot. The effect of Russian bots is multi-pronged because it dilutes the discussion into figuring out who is a person and who is a bot.

If anybody wants a good article to read on their commute: https://www.nytimes.com/2015/06/07/magazine/the-agency.html

This is a 2015 investigative report digging deeper into Russian bots and attempting to find some answers.


This happens to me too. I think people do this so they can dismiss ideas without having to refute them.


We actually have a good word for this: McCarthyism. If I don't like your opinion, you are a Soviet agent!


That's exactly why they do it.


> half the time I utter a conservative opinion I'm accused of being a "russian bot"

It's not a "conservative" opinion per se; it happens with the left (as opposed to (neo)liberal) too. Think of the Sanders crowd being called sexist/bros by the Clinton crowd, etc. Basically, if it's not a centrist opinion you're out. It's called the Overton window.


Now the cat is out of the bag: freedom of speech is just a lofty ideal that only worked when all media was controlled by corporations that chose what people listened to most of the time. Now they are trying to put the cat back in the bag. Good for reasserting corporate control over thought and speech.


I agree, but I think there is also another effect at play here, which is the end of private circles where opinions can be exchanged "present company excluded". If you watch old TV, for example, you'll be appalled at the sort of things (harsh, elitist, dismissive of the general opinion) that could be said even in a broadcast event: the audience was self-selected, recordings were rare and generally unavailable, few were reading any sort of newspaper. People were entitled to unpopular opinions because they had a limited circulation. But today, when everything is recorded, cut and pasted, infinitely available and replayable, every statement made in public must be crafted to be acceptable to absolutely any audience in any context and at any time. Those who contravene the rule are at terrible risk of having their public, work and private life destroyed by public uproar, effectively resulting in a strict censorship of opinions.


Corporate and state control but I agree completely.


Which is why we need to remove their noise from the system. A real debate of ideas is impossible when you have bad-faith actors masquerading as participants.


Not on HN, looking at your comments/submission stream.


It's mostly on twitter or reddit. As a side-note another thing that bugs the shit out of me is when someone looks at my history to find a reason to disregard my comment. (Not accusing you of this by the way)

"Oh I thought I was going to have to respond to you until I discovered you've posted somewhere I don't like."

followed up by a flood of down-votes accusations of being a bot and sometimes a ban when I was posting legit content and previously with a very positive score.

Happened to me twice last month. Once for bringing up the Code of Conduct on /r/linux.

edit: here's a perfect example that made it into print and halfway around the world, while the debunking of its flawed reasoning has yet to gain traction.

https://www.theverge.com/2018/10/2/17927696/star-wars-the-la...

followed up by:

https://www.cnet.com/news/actually-half-of-the-last-jedi-hat...

http://monsterhunternation.com/2018/10/02/my-russian-bot-rev...


Spam filtering isn't censorship. It's not censorship when the recipient doesn't want the message. And botspam is spam.

The difference is when you have a message that some of the actual human recipients actually want. Then it isn't spam, it's just information some people disagree with.

If you don't like what someone has to say, don't listen to them. But you have no right to tell someone else they can't.


Are those bots not "behaving civil in the expression of their (masters') beliefs"?

Or do you think they should be censored because they are Russian?

Or because they are bots?


How are we defining what a bot is? From what I can tell a bot seems to be astroturfing done by Russians working for a quasi-propaganda office. But, there are many state sponsored "bots" from all kinds of places. Promoting all kinds of things and political views, from allies to foes.

Or is a bot strictly an "AI" bot?


I think sockpuppets should be moderated, and bots should be too, for the same reason.


> Or because they are bots?

That one.


Yes! I am OK with only humans having a right to free speech.


Humans express their right to free speech through bots. It's no different from expressing your right to free speech through paid advertisements.


I think I am okay with content from bots if it is marked like advertisements ("sponsored content", etc.). For example, "post from ABC" could instead say "post from ABC, a bot account of XYZ".


Are you OK with dishonest speech?

If so, then not marking bot speech as such is mere dishonesty.

Not to mention that, in the general case of speech, I do not have to disclose if I am being paid to express my viewpoint.


I am okay with dishonest speech (or paid speech, for that matter), because it affects your reputation. Sockpuppets and bots harm reputation tracking.


Nobody in the real world - outside the ivory tower - actually cares about tracking the reputation of a PAC that was spun up for an election in order to run political ads, and will disappear in two months.

What makes a bot any different?


Not really. There are disclosure rules for paid advertisements. People can express whatever they want through a bot, as long as they also make it clear that this expression is being delivered by a bot.


But if the bots are intelligent enough to be a problem maybe they should have free speech.


Even if their sole purpose isn't any 1 side or argument, but just to create as much chaos, anger, and confusion among the masses as robotically possible?


You’d be surprised at the kind of biases you can stuff into a “terms of service”.


I've investigated claims of Russian-controlled political bots on Twitter several times. The stories always collapse when inspected carefully. Indeed it's common sense it would be this way because most of these stories appear to blithely assume Russia has built bots that can pass the Turing test, which is implausible.

I don't believe there is any need to censor such "bots" because they don't exist. Instead they're the creation of:

• Bad, agenda driven journalism.

• Bad, agenda driven academics who are looking for topics to write papers about that might have social impact.

• A desire to de-humanise and silence people with conservative opinions.

Clear examples of these three are described in my essay here:

https://blog.plan99.net/did-russian-bots-impact-brexit-ad66f...

The third is also quite clearly seen in the recorded conversations with Twitter employees, one of whom said:

https://twitter.com/project_veritas/status/99675939690561945...

"Just go to a random [Trump] tweet, and just look at the followers. They'll all be like, guns, God, 'Merica, like, and with the American flags... like who says that? Who talks like that? It's for sure a bot."

Obviously this Twitter engineer knows full well they aren't bots. What he means is, we ban accounts we think are bots without worrying about it, so if I can create a confusion between "bots" and "conservative Trump supporters" then maybe they'll get kicked off Twitter more easily.


Censored for what?

Being bots? Bots are programmed by humans. Humans have a right to free speech.

Being Russian? Russians are also humans. Humans have a right to free speech.


Bots won't often have a viewpoint. Russia isn't pushing 1 side or the other with these things, but creating mass hatred by taking as extreme a view (left or right) as possible. They're built to pit people against each other, and they're doing a fantastic job. I'll leave it to you to decide if that's valuable or not.


This, 100000 times over. Otherwise they are becoming the new kingmakers, manipulating people often without them even knowing. Incidentally there is a good documentary now on Amazon Prime called "The Creepy Line" that is worth watching.


>The best stance is not to take a side, but to make sure both sides are civil in the expression of their beliefs.

I agree with that. It would help if these companies stopped seeking "virality" and "eyeballs" or whatever metaphor you want to use for user engagement and "notoriety/reward".

Otherwise, this is exactly the same argument China's (or Russia's) censors would make. Absolutely not different in any way.


What "both sides", though? The alt-right isn’t in opposition to anything mainstream; they are anti-democratic and should be treated the same way antifa, FARC, ISIS and others are.

Every time democracy fails to shut down movements like the alt-right, things end badly for democracy itself. The most-used trope is stuff like Nazism and fascism, but history actually has a lot of better examples.

Because both Napoleon and Caesar effectively ended democracy with applause, unlike Hitler.

I think it’s extremely dangerous to treat anti-democratic forces as equal to democratic ones. I’m a conservative by the way, so it’s not like I’m not concerned about the liberal bias in the tech sector. But the debate we have these days, about forcing platforms to include outright anti-democratic values, is crazy.


The problem is that then you have to decide what's 'anti-democratic'. You'll find 50 opinions that differ from your own. On top of that, it can then be used as justification to censor almost anything you don't like.

Sure, right now you'll think there is a clear criterion. But in 25, 50, or 100 years from now, people could have easily twisted it to mean whatever they disagree with.


This. In some ways, we sort of see that with the PC movement itself. Like when people attacked Scott Kelly for quoting Winston Churchill, the OG anti-fascist, who opposed real life fascists, for God's sake: https://www.bbc.com/news/world-us-canada-45789819


Why aren't those criticisms valid? Scott Kelly seems to think they were.

Edit: quoting him to make this clear: "I did not mean to offend by quoting Churchill. My apologies I will go and educate myself on his atrocities, racist views which I do not support...." It's a tweet embedded in the same article.

People's view of leaders change over time. In Australia Churchill was seen as an evil colonialist for his role in the Gallipoli debacle until he was rehabilitated during WW2.


I don’t decide what’s anti-democratic though; we have guidelines for that provided by the United Nations. I mean, it’s been seen and done before throughout history, so it’s not exactly hard to spot.

Attacking the free press is one criterion.

Labeling everyone who disagrees with you in the slightest as an enemy is another.

I mean, assuming this leak is real, then Google's stance on censoring is extremely centrist, letting them appease both sides of the political spectrum. But that’s not what Breitbart or the alt-right really wants; they want Google to only appease their radical views.

Ironically, it’s a lot easier to get banned from alt-right forums than it is from YouTube. All you have to do, is disagree with whatever dogma they spin, because they aren’t even remotely interested in a democratic discourse.

Of course, looking at history, no one has ever really stopped movements like the alt-right early enough to save their democracy. So America is probably rather doomed.


> Attacking the free press is one criterion.

> Labeling everyone who disagrees with you in the slightest as an enemy is another.

You do realise that you're

- attacking the free press by asking for censorship?

- Sort of labelling other people that disagree with you (in this instance: disagree with democracy) as enemies (of democracy)?

That's of course very inconsistent.

But it also assumes that "the current form of democracy is the best we can ever have".

Democracy needs criticism, especially since what we have is a 19th century system that assumes information dispersal and real-time voting are practically impossible.


Well, yes. But I don’t think censoring the alt-right is problematic if they can’t stay within whatever policies companies set. You’re not denied access to a supermarket either, but if you start intentionally pissing on their floor, then you’d get thrown out.

Of course I come from a region of the world, where people like the alt-right won, and eventually started putting centrists in prison camps.


You seem to be very confident that you can so accurately judge whether someone is "alt-right"(a nebulous label, at best) that you can censor them preemptively. Where does this confidence come from?

Further, does espousing an anti-democratic idea make someone alt-right? How sticky is the label? What if you did it 10 years ago? Especially with how much of our lives we record nowadays, an accusation like this becomes an easy-to-wield cudgel to shut down political opponents. This rapidly leads to a race of gotchas, where we look for anything that lets us cram someone into one of the "bad" labeled boxes (racist, sexist, alt-right).

Lastly, how effective is the censorship you are prescribing? Can you achieve total censorship within the scope of a democracy? Does it actually inhibit the spread of the ideas you loathe, or simply put them out of your sight? What about the radicalization you are causing by censoring these people? You haven't convinced them to stop, you've just muzzled them publicly, but they can still create private clubs and gatherings. What problem have we solved after implementing this censorship?


The free press in Germany seems to be doing ok, even though they aren't allowed to print hate speech.


Just because a belief is civilly expressed does not make it less harmful. For example: people casually believing that the Sandy Hook shooting was a conspiracy drummed up by the government is actively harmful to the families affected by that tragedy. What do you say to those people? How do you solve that problem? And how do you define 'civil'?

Because different platforms have different ranges of civility as well.


There are thousands of conspiracies floating around. Should we consider them all harmful and ban them?

Who decides when a conspiracy becomes harmful? Should we ban flat-earthers from posting their views? After all, if everyone believed it, we would set science back countless centuries.

The way to combat false ideas is with truth; a culture that is always searching for the truth. That can only be facilitated by unhindered speech.


>>The way to combat false ideas is with truth

This is extremely naive. As the saying goes, a lie can go around the world before the truth can finish putting on its shoes.

edit: maybe rather than downvoting (which is a type of censorship) y’all should provide a response.


I don’t think bad feelings count for “actively harmful”. Also, you can always just not read the internet to avoid bad feelings.

NSA surveillance was a conspiracy theory before Snowden, so I don’t think the “conspiracy theory” accusation warrants censorship.


I don't think citing a random saying is a great way to rebut an argument, let alone support a claim of naivety.

Here's another saying for you: you can fool some people all the time, and all the people some of the time, but you can't fool all the people all the time. In other words, eventually the truth wins out.

I guess our sayings cancel out.


Do you consider American culture to be one of the more free ones? If so, then why do these conspiracies exist? Why have flat-earth conspiracies and anti-scientific dogma thrived? Why do we have a president that outright denies scientific fact?

If combating false ideas with truth worked, then surely these problems shouldn't exist in America, the land of the free. In the arena of free speech, only the values with the most truth should thrive. Yet that's clearly not the case.

This is the paradox of tolerance. We can say that unhindered speech is the way of countering harmful speech, but then I ask you: What led to the rise of such harmful speech today?


> If so, then why do these conspiracies exist?

Because that's how the process works. Liars lie and evidence proves them wrong. It doesn't make them disappear from existence, it just makes it so you can discover the truth.

Without free speech there are no fewer lies, all you do is suppress the truth. The Party's version is the only version even when it's fiction.


Depends; these conspiracies can and do actively cause harm in the interim, though (climate change, anti-vaxxing, pizzagate, and arguably others).

Is liars being able to lie worth a body count?


> Is liars being able to lie worth a body count?

The body count of truth tellers not being able to tell the truth is dramatically higher.

Imagine the result if Richard Nixon had put climate scientists in the same box as you would put anti-vaxxers.


I don't really buy "unrestricted free speech is worth letting people die over". Sure, censorship can be harmful, but if the best argument you can put forward is a slippery slope, that's not particularly compelling.

Hypothetical harms are generally less harmful than real ones.


> I don't really buy "unrestricted free speech is worth letting people die over".

I don't really buy "restricted speech is worth letting people die over".

> Sure, censorship can be harmful, but if the best argument you can put forward is a slippery slope, that's not particularly compelling.

What slippery slope? Censoring climate scientists is the first thing politicians in the pocket of oil companies would do with censorship powers.

> Hypothetical harms are generally less harmful than real ones.

The actual harms of censorship are widespread and well-documented. If you want to go straight for the serious examples, look at The Great Leap Forward, Stalin's purges or the holocaust. In each case there were not only direct large scale executions of political dissenters, the censorship allowed other atrocities to be kept quiet. Much of the scale and inhumanity of the holocaust wasn't discovered until the end of the war as a result of Nazi censorship, and keeping it quiet allowed it to continue for longer with less opposition both domestically and internationally. For example, the US could have entered the war earlier.

But it also goes all the way down to pedestrian squabbles where people disagree over matters of life and death at smaller scale (e.g. the safety of a building or a bridge). Censoring true facts has literally been fatal countless times.


>I don't really buy "restricted speech is worth letting people die over".

Only one of our positions is hypothetical.

>What slippery slope? Censoring climate scientists is the first thing politicians in the pocket of oil companies would do with censorship powers.

Says you. You're running under the (faulty) assumption that such censorship would be unregulated. I also wouldn't support such a system, but thankfully that's not what I'd propose either.

Yes, the actual harms of unrestricted censorship are widespread and well documented. But most of the EU censors things that are free to say in the US (antisemitism, as an example), and yet somehow manage to rate higher on independent ratings of press freedom[0].

Your argument reduces to "unregulated power is bad". Yes. Agreed, completely. Unregulated power is bad.

>Much of the scale and inhumanity of the holocaust wasn't discovered until the end of the war as a result of Nazi censorship, and keeping it quiet allowed it to continue for longer with less opposition both domestically and internationally. For example, the US could have entered the war earlier.

This is some strong historical revisionism. No, the US just took a strong turn toward isolationism post WWI and the great depression. The atrocities of the holocaust were not some perfectly kept secret. I'd encourage you to read up on US foreign policy in the 30s, as well as general sentiment among the population [1]. Among other things, antisemitism, worry about another economic downturn, and a strong isolationist ideal underlined by the opinion that "We shouldn't send American boys to die solving a European problem"[2] were the main reasons the US didn't enter the war.

Even when they did, it wasn't out of a sense of civic duty to save the Jews, it was because "oh crap, Germany could actually threaten the US and the world".

Allied governments absolutely knew about the crimes the Nazis were committing, they just didn't care. The population didn't really care either.

Immediately jumping to "the Nazis and Stalin censored people, so censorship is bad" isn't an argument, it's fearmongering, especially when the claimed impacts of censorship aren't really true.

[0]: https://rsf.org/en/ranking# [1]: https://encyclopedia.ushmm.org/content/en/article/the-united... [2]: And honestly, given the horrors of WWI and trench warfare, I can understand this attitude.


> Only one of our positions is hypothetical.

Then it's yours, because there are many documented cases of fatalities occurring as a result of the suppression of inconvenient facts.

> Says you.

Says the Union of Concerned Scientists.

https://www.ucsusa.org/our-work/center-science-and-democracy...

> You're running under the (faulty) assumption that such censorship would be unregulated. I also wouldn't support such a system, but thankfully that's not what I'd propose either.

It's impossible to actually regulate censorship because for regulations to be sound they have to be vigorously debated, but the public can't, by definition, debate whether something should be censored if nobody can talk about it because it's being censored.

> Yes, the actual harms of unrestricted censorship are widespread and well documented. But most of the EU censors things that are free to say in the US (antisemitism, as an example), and yet somehow manage to rate higher on independent ratings of press freedom[0].

In-spite-of, not because-of. And the US scores poorly largely because of this new radicalized censorship where crazy people are now committing acts of violence against journalists that publish stories they disagree with, and because of all the abuse of power (arresting journalists on charges that won't stick as retaliation for undesired coverage/investigating). Succumbing to populist censorial sentiment or creating new opportunities for more of that abuse obviously wouldn't help matters.

> No, the US just took a strong turn toward isolationism post WWI and the great depression.

Those were the reasons they didn't enter the war when it just seemed to be a war. Knowing what was actually happening could have overcome that sooner, or at a minimum spurred people to do more to facilitate the escape of Jews from the affected countries.

> The atrocities of the holocaust were not some perfectly kept secret.

Their full scope was not publicly known until near the end. We recently learned that the government knew earlier:

https://www.independent.co.uk/news/world/world-history/holoc...

But even that was the year after the US entered the war.

> Immediately jumping to "the Nazis and Stalin censored people, so censorship is bad" isn't an argument

"The Nazis and Stalin censored people and they were bad so censorship is bad" is not an argument because it applies equally to building roads or using radios.

"The Nazis and Stalin censored people and as a direct result of the censorship more people died" is a cautionary tale and a strong indictment of censorship.


I lost my long and well sourced post responding to this, so I'll be brief(er).

To begin, the atrocities of the "Holocaust" as it's commonly referred to, as the large-scale extermination of the Jews, didn't really begin until after the US entered the war. Before then, knowledge of the general plight of Jews in Germany and Europe was pretty much common knowledge. So your claim that broader knowledge of the Holocaust might have caused the US to enter the war earlier has a flaw: there was, at the time, not yet a Holocaust for people to be aware of.

As for everything else, I'm actually perplexed by portions of your argument, because you seem to be arguing that the government should not censor people, and that populist censorship via speech is bad. It seems pretty clear to me that if the US outlawed threatening journalists, it would lead to an improvement in press freedom, and from your comments, it sounds like you would agree with this, since its a form of "radicalized censorship". But I think you would also argue that such a law was itself censorship and unethical. This is especially true since you conflate censorship and suppression.

Censorship is but one form of suppression. I can suppress an idea without censoring it by generating so much nonsense that an observer can't readily discern between fact and fiction. You clearly have an objection to the suppression of facts, and I would agree that that's not a good thing.

But where I disagree with you is that censorship necessitates the suppression of facts. In fact I think often, well "aimed" censorship can improve discourse and prevent the suppression of ideas.

>It's impossible to actually regulate censorship because for regulations to be sound they have to be vigorously debated, but the public can't, by definition, debate whether something should be censored if nobody can talk about it because it's being censored.

I do also want to call this out specifically. We are, right now, albeit by proxy, vigorously debating the right to support Naziism. Yet, I don't see any reason for an observer to believe that either you or I ourselves supports Naziism. Censoring "support of Nazis" does not necessitate censoring "support of the right to express support of Nazis".


If by 'harmful' speech you mean conspiracies, I would argue they've always been around. It's not a new problem. People have always filled the gaps in their knowledge with unjustified beliefs.

However, your answer would be to deplatform those who would post such ideas. Do you honestly think that would make it better? Did deplatforming Alex Jones help people? It just galvanized those who believed there was a conspiracy and made him all the more popular.

How about we take on the hard task of trying to actually reach those who we believe have false ideas. For example, how do people respond to flat-earthers? Constant ridicule and laughter. The reason they believe in these theories is a lot more complex and nuanced than merely that they are 'stupid'.


Deplatforming Alex Jones gave him a slight bump in popularity after it occurred, followed by it falling off a cliff shortly after.

Also, in your example of responding to flat-earthers, what other alternative is there when they refuse to listen to reason? When you can't engage with them on any level?

Why is it my responsibility for example to be the one to engage with people who might view me as being a lesser human being due to my skin color or sexual preferences?


Could that have been because the platform people would use to see his new content was no longer able to be used for this purpose, and people have been cordoned into walled gardens so long the general public is no longer aware of alternatives?


Deplatforming Alex Jones has helped enormously.

One of the Sandy Hook parents wrote how she is able to check her mail now without getting multiple threats in the mail every single day.


> This is the paradox of tolerance. We can say that unhindered speech is the way of countering harmful speech, but then I ask you: What led to the rise of such harmful speech today?

You point at anti-science conspiracy theories as a failing of free speech, and you're probably right, but what would you have us do otherwise? A world in which some central authority dictates that certain opinions are never allowed to be expressed, to me, seems like an absolutely horrible place to live, even if it meant that antivaxxers couldn't spread their crap anymore.

Furthermore, there is no such thing as a perfect political system. It's not enough to point at what free speech fails at, you also need to suggest what we should be doing instead, and how the failings of that system wouldn't be worse.

It's a pretty good comparison to democracy, in my mind. It's the worst system ever invented, except for all of the other ones we've tried.


You live in that world right now. We have a central authority that says 'you're not allowed to call black people certain words'.

I always ask this but do you believe that was a mistake? Do you think people have that specific right to exercise their freedom of speech?

Obviously there are degrees of harm to speech. We've already collectively determined this as a society. And to me, anti-vaxxers are a great example. We arrest anti-vaxxers if their children suffer due to child abuse. Do you believe that to be a mistake, that they don't have the right to exercise their own beliefs?


What central authority would that be? Because I'm pretty sure I can find speech along those lines on the top five sites on the internet with little effort.

> I always ask this but do you believe that was a mistake? Do you think people have that specific right to exercise their freedom of speech?

I think I'm unclear on what you're asking, but as an American who sees a great deal of value in that approach, and seeing the fruits of the opposite approach (my go-to example being the UK, where libel laws are significantly more open for abuse), then, yes, I do think people have that specific right - on the level of human rights, even.

> Do you believe that to be a mistake, that they don't have the right to exercise their own beliefs?

That is a different question - speech and action are two different things.


> you're not allowed to call black people certain words'

It’s actually much worse: the restriction itself is highly racist; only blacks are allowed to use the “n-word”, while whites (and other races, I presume) aren’t.


The true reality is that the most persuasive ideas survive, not the most truthful ones. The challenge for the scientific community is to make sure to remember that persuasion matters as much as logic and facts do, if not more.


Politicians always play with the truth. E.g. Obama repeated the “pay gap” lie because it served his political agenda.


This is actually an old problem. The value of free speech as a utilitarian goal, as opposed to an absolute right (which even the American system doesn't quite believe in), is in having an unfettered marketplace of ideas. But if all ideas are equally valuable then there is no apparent value to truth and no consensus. You certainly wouldn't design a network protocol like that, where nothing is allowed to converge. This sort of thing is glossed over in the regime of approximate consensus which holds most of the time in most communities, but when things get really divisive and intractable, or when there are deliberate adversarial attacks, I don't think there is a consistent philosophical resolution.


> Yet that's clearly not the case.

Ugh...yes, it is. Our people are just too stupid to find the dissenting opinions.


It's not the belief that is harmful per se. It would be the harassment of those affected that is harmful. To give an alternative conspiracy that was given a lot of oxygen, many believed that Bush caused 9/11. Conspiracies themselves are often wrong, but occasionally correct. I don't believe that one was correct, but for example, Edward Snowden's NSA revelations were initially derided as conspiracy. It's important that even those types of ideas be allowed to compete in the marketplace of ideas.

The problem I see isn't so much that people harbor conspiracies, but that they do so without interacting with the world at large. They do so in a bubble, and in a bubble, a conspiracy can be amplified. This is how extremism develops.

The core problem is that people aren't interacting with enough different types of people in order to moderate their own behavior.


Also, the term you're looking for is "conspiracy theories", not "conspiracies". Common misconception for some reason.


...would you believe Bush didn't cause, but closer to "accidentally allowed", 9/11?


> What do you say to those people? How do you solve that problem?

That's not a problem that needs to be "solved." It's not necessary to force everyone to think alike, or believe the same things. It's not necessary - or desirable - to try to force everyone to never think/believe mean things or unhappy thoughts.

It's similar to saying: how do you solve people that believe in ghosts or god or unicorns (particularly if you're an atheist and believe that's all bunk). Consider hardcore religious people that believe most people are going to burn in hell, and or they believe that most dead people are in hell. Now ask the same question: isn't that offensive to those people that are going to supposedly burn? Isn't it offensive to the memory of the dead? Solve what, that people are allowed to believe what they want to (including offensive things)? The censorship it would require to constrain all of it would be horrific, authoritarian to the nth degree.

The world didn't end because of all the 9/11 conspiracy theories. Thousands of people died in those attacks. Yet the conspiracy theorists proclaimed that a plane never hit the Pentagon, that there were no people on the plane that crashed in PA, that all the Jews stayed home that day, and so on. That doesn't need solving, people will never stop believing and saying crazy things, you can't force it to end.

As longer duration examples, JFK and the moon landing conspiracies persist five decades on, and will never cease to exist. If NASA couldn't solve that with 50 years of education and the extraordinary amount of evidence they've provided, what hope is there for the other items.


Okay, now let's apply your example to something a bit more violent. People that believe black people are sub-human and deserve fewer rights.

At what point do we no longer tolerate their opinions? For example, would you believe it's bad for someone to be censored on HN because they talked about how much they hated black coders? At a certain point we deemed as a society that those sort of views are considered bad. Do you think that was a mistake?


Who decides where the line is crossed though? People believe that children deserve fewer rights as well. We all accept that. What about fewer rights for prisoners? What about fewer rights for non-citizens?

These people literally have fewer rights and we don't consider banning discussion about giving them more/taking more away.

>At what point do we no longer tolerate their opinions? For example, would you believe it's bad for someone to be censored on HN because they talked about how much they hated black coders?

False dichotomy. Nobody has suggested that free speech means that every community has to tolerate these opinions. It just means that people can't be persecuted by the government for expressing them. It's fine to ban racist comments on whatever site you run.


It's not a false dichotomy considering this entire discussion revolves around Google, specifically, censoring people. Now there is an argument to be made that Google being potentially a monopoly should be held to more stringent standards but that is not in my opinion the crux of the argument here.

Also you're essentially advocating for the position that we should be allowed to discuss whether or not black people deserve fewer rights here.


Nobody has to tolerate such opinions.

Facebook, Twitter, etc. have block / unfriend options. It's extremely easy to avoid sites like Daily Stormer (or whatever the infamous white supremacist site is called?).

I intentionally don't hang out with racists, bigots, etc. It's the exact same manner of avoidance. You have to do it in real life, I don't see why the same premise wouldn't hold online. Don't spend time in such places, don't invest into such people on social media. You can't force racists to not be racist, you can choose not to associate with them.

> For example, would you believe it's bad for someone to be censored on HN because they talked about how much they hated black coders?

HN isn't a monopolist platform (such that it can heavily restrict information distribution across an entire nation or more), as is the case with Google and Facebook. HN strictly policing its own site in the manner it sees fit doesn't qualify as censorship, in my opinion. When a monopoly platform does it, it is censorship (because the condition of it being a monopoly means there are limited or no alternatives).

> At a certain point we deemed as a society that those sort of views are considered bad. Do you think that was a mistake?

Not at all. There is widely a societal punishment for such terrible views: you become an outcast in many regards, particularly among the vast majority of people. That society deems such views terrible, does not simultaneously require they be banished / made illegal / censored from all publication (whether books or social media).

That cultural battle should in fact occur out in the open. There is no better platform for it than that. You can't nearly so well combat terrible ideas if they're not expressed.

When it comes to how a monopoly like Facebook should deal with it (in a context where one believes all people have a right to be able to use it): they should delete illegal posts (underage pornography as one example), and they should perhaps restrict blatantly offensive content to adult readers only (18+ in the US), and require an acknowledgement to view it. There are a few directions that Facebook could go in dealing with Alex Jones types, for example, that don't involve heavy-handed censorship or a ban. Limit the mass distribution of their content, as we might in a public square. We don't generally block people from having one to one, or one to few, offensive conversations in a public square, assuming it's discreet. We do generally stop them from mass distribution to all by shouting offensive things through a large audio setup or similar in public squares. It's an effective means of not denying someone access to public squares (eg banning them from Facebook), while still not turning a platform into an open amplification vehicle for terrible, abusive, offensive ideas.

What we're all really talking about here is: should people be censored from being able to offend other people. That central concept is what connects all of these varied speech discussions covering race, religion, sex, gender, political ideology, et al. Should offensiveness be banned? It can't be done, the result of trying to do it is entirely predictable ahead of time. We're seeing countries like Britain attempt it, it's a grotesque absurdity in result. Since what's offensive is inherently subjective and will always vary from one person to the next, it becomes a system of who has power to dictate from moment to moment what's offensive. It becomes a competition of an ever tightening restriction, as each power group adds items to the list (culminating in authoritarianism as freedom of speech essentially entirely disappears). The quest to banish offensiveness ('the right to not be offended'), is a fool's errand at best, and an authoritarian's dream at worst.


If you want to be a member of many of these platforms, and you're a target of that offensive material, you do have to tolerate it. The racists use the platform's tools to find you and hurl it at you, and then the platform gives you little to no recourse to even ignore them, outside of not using the platform.

Your advice only works when you're not the target - when you just need to walk past and avert your eyes from what's being said to someone else.


HN is doing “filtering” more than “censorship”: there’s a setting that lets you see flagged comments (greyed out), so the filtering is almost completely transparent; it cannot really be called censorship.


This is a valid criticism. Let's say something truly unsavory was said about a valued relative, etc. It's hurtful. But I think it's less hurtful than not being able to have dissident ideas expressed. As examples, we have Russia and China. They will censor topics using "civility" and whatever else is in their arsenal to crush dissent. Personally I think that is worse.


Well, it usually starts with a genuine desire for civility, but as they say, power corrupts, and once you give power over speech to censors (and you see this even on moderated boards), they tend to abuse their power for other purposes. It's not a problem because of repressive societies; they are the symptoms. It's a problem with human nature.


And yet, with zero moderation boards also very quickly turn into fetid cesspools. One just has to take a look at Voat to see what I mean.

Moderation, like many things, has to be taken in moderation.


Gives new meaning to the old expression "Moderation in all things, including moderation."


[flagged]


Nobody has said anything about hurt feelings. They are speaking about the families of children killed at Sandy Hook, and the maleficent antagonists who harass them with lies.


So what’s the harm to the families, except bad feelings? Was anyone actually (physically) hurt?


Is being physically hurt the bar now? How about living in fear, and the extreme inconvenience of moving to different residences out of fear?

Also, for example:

>In December 2016, Lucy Richards, a 57-year-old woman from Tampa, was charged with four counts of transmitting threats in interstate commerce for sending death threats to Lenny Pozner, whose son Noah was the youngest of 20 children murdered

Imagine losing your child and then having to put up with people like that who are fueled by someone trying to get rich off low information people. We as a society can and should do better.


You know, it feels bad to get physically hurt. To imply that the only damage that can exist would be of a material nature (here: the meat of your body) gives me the shivers.


Yeah I agree. But the crucial difference is, physical/meat damage is objective, whereas emotional pain is only subjective (you can't prove it to anyone else, and you can also make it up). Basing policy on subjective whims is bad.


The "reasonable person" is a well-established principle in law, used to test whether someone's subjective claims of harm should be considered valid from an objective viewpoint.

Few would question that a "reasonable person" whose child was killed in a mass murder would be harmed by harassment, abuse, and claims the event didn't happen.

I'm not normally any kind of bleeding heart, but stuff like this isn't difficult.


> Basing policy on subjective whims is bad.

True, but not enough for me. Though, I do not have the solution.


You can and absolutely should take a side.


> There are people who believe free speech is not the ultimate value

A right to free speech is a recent, American-centric invention, rather than a natural cornerstone of democracy, a fact that often seems lost on Americans.


It seems many historians do not support your view. Notions of free speech date back to antiquity. [0]

[0] https://en.wikipedia.org/wiki/Freedom_of_speech


399BC Socrates speaks to jury at his trial: 'If you offered to let me off this time on condition I am not any longer to speak my mind... I should say to you, "Men of Athens, I shall obey the Gods rather than you."'


The enumeration of free speech as a natural right in a Constitution is an American invention, but the idea itself is much older than that. However, the freedom of speech and press is as important to free nations as an immune system is to a body... neither will last long without it.



