I'm not sure there is evidence for or against. The argument I would make is not about censorship but about preventing people from spreading lies. Even in the US, where people hold free speech as some amazingly virtuous thing, there are still curbs on speech, and what I'm saying is that these limitations can be enforced in WhatsApp. Maybe you think that is censorship, maybe you don't.
Also on that point, freedom of speech in the US is strongly correlated with conspiracy theories, if only because 1) you can freely spread malicious lies and 2) it can be in a lot of people's interests to do so. I think it has always been in people's interest to spread lies, but recently, in the very partisan climate in the US, the media (and I'm including social media here) have discovered that spreading misinformation is very, very good for business. It generates outrage in the opposite camp and keeps one side of the divide entertained.
> The argument I would make is not about censorship but preventing people from spreading lies.
Limiting speech is by definition censorship. There's no value judgement required as to the quality or veracity of that speech to determine whether or not restricting it counts as censorship.
> the freedom of speech in the US is strongly correlated with conspiracy theories
...and water is highly correlated with drowning. We don't say 'water bad, less water'. Just because one thing is a prerequisite for something else doesn't mean it is responsible for that other thing. We have to look, as you point out, at the political climate, as well as at the lack of trust in institutions, the education system, and many other factors. Taking a complex issue around how information is shared at an unprecedented speed and scale and saying "just ban it" is, I think, a shallow assessment.
But discouraging the spread of a message at each point of sharing, for example by letting the sender know the message has been flagged as misleading by a large number of people before they forward it, is not censorship.
That may limit the rate, intensity, and real-world effect of some kinds of information, without fundamentally limiting the right to free speech.
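To make that concrete, here's a minimal sketch of what such a "warn before forwarding" check could look like. Everything in it (the threshold, the function names, the flag tallies) is hypothetical and not any real WhatsApp or messaging API; it only illustrates the idea of surfacing information to the sender without blocking the message.

```python
# Hypothetical sketch only: these names are made up and do not belong to
# any real WhatsApp or messaging API. The idea is to warn the sender,
# not to block the message.

FLAG_THRESHOLD = 100  # assumed number of "misleading" flags before a warning appears


def forwarding_prompt(message_id: str, flag_counts: dict) -> str:
    """Return the prompt shown to a sender who is about to forward a message."""
    flags = flag_counts.get(message_id, 0)
    if flags >= FLAG_THRESHOLD:
        # Surface the information; the final choice stays with the sender.
        return (f"{flags} people have flagged this message as misleading. "
                "Forward anyway?")
    return "Forward this message?"


if __name__ == "__main__":
    counts = {"msg-42": 3500}                   # pretend flag tallies
    print(forwarding_prompt("msg-42", counts))  # shows the warning
    print(forwarding_prompt("msg-7", counts))   # no warning, normal forward
```

The point is that nothing is removed or blocked; the mechanism only adds information, and a moment of friction, at the point of forwarding.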
>censorship but preventing people from spreading lies. Even in the US where people hold free speech as some amazingly virtuous thing there are still curbs on speech
>I'm saying is these limitations can be enforced in WhatsApp.
1) Are you speaking of constitutional curbs on speech (because there aren't many)? Or are you talking about curbing speech that is believed to yield negative outcomes?
2) How, precisely, could these be enforced? I ask because automated moderation at scale is impossible to do responsibly and consistently. This will not change anytime soon.
3) Gates seems to be demonizing encryption. Few options can logically follow that position. Either Gates advocates replacing beneficial (to everyone) encryption with vulnerable (to everyone) encryption or he advocates banning encryption outright. It's difficult to see how either of these outcomes would benefit anyone (other than authoritarian governments and similarly repressive interests).
Even if Gates got his way on #3 (make everyone more vulnerable), #2 remains technically impossible.
Cancel culture would arguably fall under #2, though it's debatable whether it even scratches the surface of achieving it. It seems cancel culture is not interested in people being dishonest or spreading misinformation intentionally and maliciously; it's more concerned with enforcing new social mores against dredged-up old statements/actions. That could be a good thing in and of itself, but it doesn't help solve the spread of misinformation and conspiracy theories.
Well, for starters (given two-way communication) you can ask them for the sources of their claims, or at least demand that they defend the logic behind them. Then you can take it from there and unravel the fallacies for them. Or, if you want to be more polite, carefully point the fallacies out and ask what they themselves think when they look more closely at it. This would be the pedagogic approach, for those of you who have the patience.
And if it all falls apart (like it sometimes does when people are faced with their own failure), then you can at least ask them to give arguments instead of ad homs. I always try to be polite the first time around, but if they double down, I let them have it. But really, if it ever falls that low, it usually means that you already won, and so you don't really need to bother any further with the discussion.
My thinking is that most people who read such discussions can think for themselves, and so giving good arguments, unraveling faulty logic, and showing the truth, will always let truth and logic prevail in the end.
You will rarely get a person to admit that they're wrong anyway, at least not to your face, so don't even worry about it! But people do change their minds about things. It usually happens in private, especially if they have just facepalmed right into their own flawed logic. So if they do, never gloat; act like nothing happened, and commend them for telling the truth later on.
Perhaps I'm naïve, but I enjoy staying positive like that. :)
Mere faceless fake news and conspiracy theories, however, simply need highlighting and debunking. There are several sites that specialize in that already, with varying success. It's not perfect, but IMHO it's preferable to outright censorship. Because who can be the final arbiter of truth anyway...