Open source encryption has been available for decades. Yet WhatsApp got a billion users before it enabled end-to-end encryption.
The data shows you're wrong: lack of encryption does not prevent a significant number of people from using a product.
I'm mostly repeating what I said in https://news.ycombinator.com/item?id=10580829. Further down I had several links supporting the claim that even people who have a lot to gain from encryption will often not use it if it's not enabled by default.
+1. Most users don't care about encryption. They don't care about privacy. Hell, they don't even care about democracy, as long as it has no direct negative effect on their lives right now.
They start caring about privacy if a picture of them naked is exposed or if somebody "pirates" their social accounts; then they ask for someone to blame, and then go back to business as usual.
You can't expect the average population to be proactive on this.
It's not that they don't care; it's that they are asked to give up a lot for fairly vague risks. I have friends who don't use social platforms and go through the effort of using encryption and deploying their own systems. They lose more freedom by not expressing themselves than they gain, especially since intelligence agencies can still target you. Not only can agencies correlate things with all your friends who use social platforms, they can build a much more targeted profile from the data you inadvertently leak.
I don't think technologists who ask users to do complicated things, or who blame them when things go wrong, care about privacy. Especially when large tech companies make bank by not standardizing things that should be fundamental rights in communication. To me a lot of this (but not everything) is banker-level arrogance, similar to pension funds or housing markets where "people should have known we were selling them crap".
I agree the trade-off is not in favor of privacy, and tech-savvy users tend to live far away from regular users' needs.
But it's also that people value most things over privacy. To me, valuing the ability to express yourself on Facebook over freedom is not a good bet. Just like valuing the ability to smoke over your health is not a good one.
I do understand why people do it. I have my share of similar self-destructive behaviours.
Still, it's common that important things like ecology, freedom, and health are discarded for comfort, convenience, and quick rewards.
> They start caring about privacy if a picture of them naked is exposed or if somebody "pirates" their social accounts; then they ask for someone to blame, and then go back to business as usual.
This isn't just a consumer encryption problem, this is the infosec community's fight in a nutshell. To put it simply: risk is invisible.
This is why projects such as the EFF's scorecard ( https://www.eff.org/secure-messaging-scorecard ) are incredibly important, possibly more so than the actual code various applications implement. Because without public awareness there is no incentive for companies to implement non-trivial encryption features.
To look at it another way, we're fighting a public health battle here and we should borrow the lessons they've learned. You don't eradicate Dracunculiasis by assigning a doctor to look over every person's shoulder -- you educate people that {behavior} causes {invisible thing} causes {problem}. And once people know why, they're empowered to help themselves.
I agree. Although this generally causes another problem: people start reacting out of fear or habit, not because they have become conscious of the problem. Which means:
- if the problem takes another form, they won't act;
- if the solution needs to be adapted, they won't do it;
- when the problem doesn't exist anymore, or your understanding of the problem changes and you realize you fought the wrong battle, it will be impossible to reallocate the resources. Worse, you will get discriminated against if you do.
All in all, education is good, but as long as people don't generally care about how the world works, no matter the specific subject, it will be a never-ending battle.
Actually, we have an example of this happening with messaging apps. No one used to care about end-to-end encryption, until Telegram and Signal became popular and WhatsApp followed up.
The result is that today, consumers demand it in any messaging app.
No one cared about encryption even when WhatsApp rolled out its E2E; most users still don't understand or know what it is.
WhatsApp decided to roll it out, and good for them, but I personally think it was a combination of internal ideology and the ability to spin it and sell it as "hey, look at this new feature".
And of course it offers them protection from future leaks, and gives them leverage if the Intercept publishes another Snowden leak, this time aimed at WhatsApp specifically: "we knew about this, we fixed it, don't worry, keep using our app!"
I don't quite understand your argument. You're saying they didn't do it to satisfy tech savvy users, but they did it because they might need to satisfy those users when they're hit up about it in the future?
There are plenty of us that do care about it. Sure, they didn't have to do it, they still would have had plenty of users - but they chose to do it.
The argument is that knowledge and understanding of encryption is limited to a very small audience even amongst "tech savvy" users.
Even if you only take software developers, the vast majority of them does not understand or "care" about encryption; it's just the hard truth of the matter.
WhatsApp did it because they could do it in a way that was beneficial to them coupled quite likely with internal ideological reasons to roll it out.
It's a positive story that can be spun in the tech and non-tech media, and could be used to improve their reputation as long as it runs alongside the overall privacy/encryption debate.
It also future-proofs their platform: when they do come under direct fire, they won't be seen as merely reactive.
I don't think consumers demand it. WhatsApp became popular before it got encryption, and I don't see any evidence that people are switching in large numbers based on this feature.
I think WhatsApp is rolling out encryption to avoid a new scandal later that would create bad PR, not because it is a feature requested by users.
To get a good grasp of what matters to the public, remember that Apple once ran a whole ad campaign over the fact that the new iPhone could copy/paste. We mocked it. And the joke is on us, because it worked brilliantly. This is what people care about, and moreover, this is what people know about.
Which returns to the top commenter's point: Brennan is well aware that this won't end usage of newly-unencrypted US products. It will drive away foreign companies, and those with something to hide, but it will still expand CIA access to the bulk of everyday, non-criminal users.
My point is that "the terrorists/people with something to hide will switch" is incorrect, and so the claim that this isn't about catching them is unsupported. Data shows that even terrorists or other people committing high profile crimes won't always make sure to use encryption. It's very possible preventing encryption by default will help catch a couple of those.
> It's very possible preventing encryption by default will help catch a couple of those.
I doubt that. I agree with you that many use unencrypted products though, so you have a point. However, in almost all events I can remember, the people committing terrorist acts were already known to at least some agencies. The agencies already drown in false positives. There is no indication that this will change. This strengthens my point: it is not about terrorists.
Notably, the Orlando shooter was even more known to security agencies than usual. He had been investigated twice and 'cleared'. After that, he was turned away from a gun store before the shooting for suspicious requests, which led to a tip the FBI didn't follow up on! Somehow, James Comey spun that as "we did everything we could".
If even that level of prior knowledge is being ignored to push for yet more data (and more false positives) it becomes really hard to believe that this request is being made on the level.
Don't forget the time that the Russians(!) put the name of two brothers in the Boston area on a list of "people we think are up to no good" and sent that list to one of the three letter agencies.
Once you have an active lead, investigation is easier if they aren't using encryption. You're focusing on the successes, which necessarily are the cases where the agencies failed.
WhatsApp has made encryption easy: easy to use and easy to understand. Just the day before, my mother received a notification saying her WhatsApp messages were now encrypted. And she understood right away.
Sending a GnuPG encrypted email on the other hand is way too much to ask for the layperson.
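For a sense of scale, here is roughly what that asks of both parties, sketched with the real gpg CLI (the address alice@example.com is illustrative, and the throwaway keyring and trust override stand in for the fingerprint verification a careful user would do by hand):

```shell
# Hypothetical two-party exchange, run against a throwaway keyring.
export GNUPGHOME="$(mktemp -d)"

# 1. The recipient generates a key pair and exports her public key:
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key "alice@example.com" default default never
gpg --armor --export alice@example.com > alice.pub

# 2. The sender must obtain alice.pub out of band, import it, and
#    verify the fingerprint over a second channel (skipped here):
gpg --import alice.pub
gpg --fingerprint alice@example.com

# 3. Only then can the sender encrypt, and still has to paste or
#    attach message.txt.asc into a mail client themselves:
echo "meet at noon" > message.txt
gpg --batch --yes --armor --trust-model always \
    --encrypt --recipient alice@example.com message.txt
head -1 message.txt.asc   # -----BEGIN PGP MESSAGE-----
```

Every step is a place for a layperson to give up, and none of it is visible in WhatsApp, where the same key exchange happens automatically under the hood.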
Getting the message sent and successfully received is of more importance. It's better if it's also encrypted. It's best if the messaging and encryption work together, seamlessly.