
The "green padlock" was not considered enough because users can't reliably distinguish it from a big lock image drawn within the page itself. Hence mechanisms like HSTS, which enforce HTTPS without relying on the user noticing an indicator at all.
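
(For reference, HSTS is just a response header a site sends once over HTTPS; the browser then refuses plain-HTTP connections to that host for the stated period, so nothing hinges on the user spotting a padlock. A typical value looks like:)

    Strict-Transport-Security: max-age=31536000; includeSubDomains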

(There was a time when browsers would color the entire URL bar yellow to indicate https, but that went out of favor many years ago.)

Moxie deserves respect for the web vulnerabilities he discovered and raised awareness about years ago, and for his general competence at cryptography. But in recent years he's shown himself to be willing to make catastrophic sacrifices to make security applications popular and viable for the "lay person".

If the "lay person" ignores the unobtrusive key-change notices and the difference between single and double checkmarks (and their timing, i.e. whether a single check turned double after a key change), and just trusts that "I heard WhatsApp is secure, so I'm good to go", then so much is sacrificed that there wasn't any point in the exercise to begin with. Except that real solid systems, with direct user control over key continuity, and fully open-source, are undermined by the confusion with these "lay person" super-convenient closed-source systems.




> catastrophic sacrifices to make security applications popular and viable for the "lay person"

This isn't wrong, but it's unfair to bring it up without the most obvious counter-argument.

PGP provides absolutely zero security to the average person, because average people don't use it. HTTPS provides lots of security to the average person, whether or not they know what the green lock means, because lots of people use it. Adoption is a feature.

Of course both of these things are true. Security sacrifices for the sake of adoption suck. But let's not paint a picture of Signal as "desperate for popularity", as though that was a selfish and not security-minded goal. Be fair.


No security is better than false security imho.


The problem is that most security technologies only provide protection against specific attack vectors and attackers under specific conditions.

Unless you understand these technologies very deeply, they all create a false sense of security to some degree.

That doesn't make your statement false, just very difficult to apply. That's not to say it can never be applied. There are clearly cases in which people are deliberately misled.


Yes, security is very hard.

But WhatsApp markets itself as a secure system while the client, by default, just blindly accepts re-keying from the server without notifying the user.

It could easily ship with the notification on by default, and, when a user turns it off, actually explain that they are no longer secure.
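
As a rough sketch of what I mean, in Python and with entirely hypothetical names (this is not WhatsApp's actual code):

    from dataclasses import dataclass

    @dataclass
    class Session:
        contact: str
        identity_key: bytes
        verified: bool = False

    def notify_user(contact, text):
        print(f"[{contact}] {text}")

    # Warning on by default; turning it off is an explicit, informed choice.
    def on_identity_key_change(session, new_key, warn=True):
        session.identity_key = new_key
        session.verified = False  # any earlier verification no longer holds
        if warn:
            notify_user(session.contact,
                        "Security code changed. Verify it before sending.")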

The very best would of course be to require users to physically exchange keys whenever they get a new phone etc., but we all know that will never happen.
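
(In practice "physically exchange keys" mostly means comparing fingerprints in person. A simplified illustration of the idea behind Signal-style safety numbers; the real derivation is more involved:)

    import hashlib

    def fingerprint(my_key: bytes, their_key: bytes) -> str:
        # Sort so both parties compute the identical string.
        digest = hashlib.sha256(b"".join(sorted([my_key, their_key]))).hexdigest()
        # Short groups the two users can read aloud to each other.
        return " ".join(digest[i:i+5] for i in range(0, 30, 5))

    print(fingerprint(b"alice-public-key", b"bob-public-key"))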


I agree that there is much room for improvement. Instead of simply turning warnings on or off, they could let users enable warnings for some contacts but not others.

But my point is that the current approach is not simply "false security". It is incomplete or optional security against specific threats and not others. Depending on a particular user's expectations it may amount to false security. You're right about that. But it's not clear to me that having this sort of security is worse than nothing for the average user.

Also, you have to consider that this sort of optional and partial security, used by a very large number of people, allows those with real security needs to hide in the crowd. Taking a clear all-or-nothing approach, as you suggest, would put a bullseye on the backs of those who do need security.


This is a definition of the word "catastrophic" I was previously unfamiliar with.


I think this is good healthy criticism, but I think it's also difficult to strike the "right" balance.

I believe we only just barely got Signal's end-to-end encryption into all of these messengers, even as "compromised" as you may think it is. Google and Facebook (Messenger) didn't even enable it by default because they thought it was "too much" encryption.

So if it had been even more difficult to use, it might never have been adopted by these services.

At the same time, I don't think we should allow all sorts of modifications to the protocol and to how this encryption system works just to cover a few niche use cases that would slightly increase those users' convenience.

Sending undelivered messages when the recipient switches SIM cards, instead of just telling the sender that those messages can't be sent right now, is one of those niche use cases and compromises that shouldn't happen, especially if enabling such features could be turned into de facto "legal intercept".
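
To make the two policies concrete, a hedged sketch (hypothetical names again, not the actual client code):

    def notify_user(contact, text):
        print(f"[{contact}] {text}")

    def encrypt(msg, key):  # stand-in for the real ratchet encryption
        return (msg, key)

    # "queued" holds messages that were never delivered under the old key.
    def on_rekey(contact, new_key, queued, auto_resend=False):
        if auto_resend:
            # The criticised behavior: silently re-encrypt to the new key
            # and resend, without the sender ever approving the key change.
            return [encrypt(m, new_key) for m in queued]
        # The alternative: hold the queue until the sender confirms.
        notify_user(contact, "Key changed; confirm to resend pending messages.")
        return []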

I guess Moxie is saying here that this wouldn't be a de facto legal intercept, but I'm not so sure that's true, and the researcher who found the bug doesn't seem to agree either. Unlike others here, I think it's very likely that people wouldn't notice that the messages no longer have a double check mark.


> willing to make catastrophic sacrifices to make security applications popular and viable for the "lay person"

I call those "completely reasonable compromises".


>just trusts that "I heard WhatsApp is secure, so I'm good to go", then so much is sacrificed that there wasn't any point in the exercise to begin with. Except that real solid systems, with direct user control over key continuity, and fully open-source, are undermined by the confusion with these "lay person" super-convenient closed-source systems.

I totally disagree with this.

Security isn't some absolute thing, which you either have or don't have. It's a series of threats, and counters, and usability tradeoffs you have to make so people still use your service.

You can always criticise someone selling a front-door lock: "what if the bad guy smashes the window?" But that doesn't mean front-door locks are bad, or that we should all move into houses without windows.

I think tech folk treating security as a binary all-or-nothing thing, without thinking about the usability tradeoffs, are a big part of the problem, and why so much of what we have is so insecure. We have these "real solid systems, with direct user control over key continuity, and fully open-source" you mention which almost no one uses.

This makes them useless. Complaining that people should know better is also useless. Shipping software which dramatically increases security against a wide range of threats, on the other hand, because 1B people actually use it, is a positive contribution. Trying to blame that same software for the lack of adoption of "real solid systems" is lame. We've had the solid systems for years, but their lack of usability always sunk them.

Does WhatsApp encryption handle every threat? Of course not. There are always going to be unhandled threats. People could still come to your house and root your phone, or hit you with a wrench until you unlock it, for example.

But there's an apparently big threat (revealed in the Snowden leaks) of governments clandestinely, passively, mass-monitoring traffic on the wire, perhaps without the cooperation of the tech companies involved, which has compromised the privacy of vast numbers of entirely innocent people. WhatsApp's encryption seems to counter this.

Moxie et al's contribution thus deserves respect, in so far as it potentially protects a billion people from that class of threat.

It's possible that, due to the closed source nature of WhatsApp, they are actually snarfing everything. That would be big news, deserving of a mass outcry, or a leak. I'm hoping the potential commercial ramifications of them getting caught widely releasing a deliberately compromised client keeps them honest.

It's a risk, but security is always about risks and tradeoffs, not absolutes; and they have built a system that's actually usable enough that it has 1B users.


I agree. Perhaps a future version of the Signal protocol could implement a definitive solution that better addresses this kind of scenario, the way HSTS did for SSL certs. Combined with a friendly UI (like the new padlock | "Secure" string in Chrome), it would make it easier for the lay person to detect possible eavesdroppers.
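
One way to read "the way HSTS did" is trust-on-first-use pinning: remember the first key seen for a contact and hard-fail on any change until the user re-verifies. My own illustration, not part of the Signal protocol:

    pins = {}  # contact -> pinned identity key

    def check_key(contact, presented_key):
        pinned = pins.get(contact)
        if pinned is None:
            pins[contact] = presented_key  # first contact: pin it
            return True
        # Like an HSTS browser hard-failing on a bad cert: refuse to send
        # until the user explicitly re-verifies the new key.
        return presented_key == pinned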


This isn't a vulnerability in the Signal protocol. It's more a vulnerability in a particular UX decision that WhatsApp made (consciously, even).


It sounds like you are saying this project's true motivating objective is to advance the author's personal beliefs about "UX" and to see wide adoption of this philosophy as embodied in some particular software. That he chose to use a large, politically-connected, centralized social media^W^W ad sales company as a distribution channel ensuring wide adoption by default. And that the author would make any trade-off in order to see wide adoption.

Should we be surprised?

If I am not mistaken, the cryptography here is compliments of djb. (Best "UX" designer ever, IMHO.)

The Signal author's contribution is only a protocol and some "UX". His programming language of choice was Java.



