Hacker News
Global Encryption Day: Demand End-to-End Encryption in DMs (torproject.org)
168 points by pabs3 on Oct 22, 2022 | 74 comments



Remember when Zoom claimed that meetings were E2EE yet you could join the meeting by phone and no one batted an eye for at least one or two years? Noticed how no regular person cares when the "security code" of a chat partner changes in WhatsApp or Signal? Not to mention no regular person uses self-compiled apps for that, even if it were possible. E2EE is close to becoming a cargo cult, because done properly key management and identity verification are a usability nightmare. It will always remain a toy for a niche audience. Most people just don't care, and the non-technical people that do care are happy with a green lock appearing somewhere. I do hope to be proven wrong, though.


So what you're saying is, we can just put a green lock in the UI and call it a day.


We can put some thought into how to set up the third parties to be trusted with our keys. Currently it's very haphazard. And it's not avoidable. Even Phil Zimmermann, inventor of PGP, won't accept PGP-encrypted mail because he claims to have lost his private key.

The outraged "own your private keys or bust!" stance is much easier, though.


Do you happen to know which OS Phil was using when he lost his private key, and what the exact circumstances were? It's a nice story to tell, but without details it isn't worth much.


Why does it matter? Private keys are being compromised or lost under all kinds of circumstances and regardless of the OS.


Green lock it is.


IMHO, one company tackled that very problem head on and did a fantastic job, and it's by far my preferred chat (+ more) app today. Ironically, that company, https://keybase.io, got acquired by Zoom. Keybase still works today (thankfully), but there are AFAIK no guarantees for how long.

Keybase Chat is fully encrypted, persistent, and cross platform, and has the best solution to the identity problem.


This. Would love to see Matrix, Signal and others adopting/directly using keybase.


Cargo cult? No. It's not perfect but end-to-end encryption is demonstrably better than whatever we had before. WhatsApp encryption is the same as Signal's and it has already defeated police and courts here. Some judges got so pissed off they ordered WhatsApp to be blocked nation-wide.

We don't live in a perfect world. I'm glad that everyone I talk to is using something this secure by default.


What I find intriguing is that E2EE was significantly more common long ago than it is today. Multi-protocol chat clients would utilize the OTR ("off the record") libraries and even auto-negotiate with folks over AIM, MSN, ICQ, IRC and others to assist in showing fingerprints and sharing public keys, which could be done over the platform or out of band if one so wished.

I would have expected that by today not only would this be more common but that the technology would have significantly advanced. Instead the opposite appears to be true. People are instead using apps that pinky promise to E2EE things and depend on the chat service to handle the trust mechanism for them, which in my view entirely defeats the intent and purpose of E2EE. I assume the motivation is the desire to capture and monetize every conversation, especially the private conversations, and/or to comply with national security letters. I've heard people try to rationalize this by pointing at non-technical users, but non-technical people were utilizing OTR just fine.

What could realistically be done to reverse this in a way that puts the control back into the individual's hands? i.e. even if all the king's horses and all the king's men wanted your messages, they could only pound sand again and again.


I used OTR, but I would not say that it was _more_ common, as that was just a layer over a non E2EE platform like the ones you listed.

Using OTR meant using a non-standard client, installing the plugins for it, and configuring it. Worse, all parties you wanted to chat with had to do it as well. Which means it was almost always a novelty feature among nerdy friends and no one else.

Because most people did not have it installed, having it always auto-negotiate was a non-starter. It meant people who didn't have it would receive cryptic messages from you at the start of each session and then complain to you.

So most of the time it was a completely manual process of enabling it for _some_ chats, and this leaks a massive amount of data. It basically declares that this particular chat might contain something interesting. Often the chat log would include a plain-text message like "Let's switch to OTR".

Unlike you I could not get any non-technical friends to care about OTR enough to install it. I have difficulty even today getting people interested in Signal. I still hear arguments like "I'm not that interesting, they can look" and "I've got nothing to hide"


Because most people did not have it installed

In my circle of friends, everyone used a non standard client so they could customize the skin of the client. Most of these people were either not technical or at least just technical enough to follow simple instructions. I personally knew many people that used the multi-protocol chat clients. Their primary interest was for the customization capabilities of the client and having all their messages in one place rather than encryption plugins. The vendor clients did not have a dark mode and were nowhere near as customizable. The E2EE aspect was just a nice to have for some.

I suspect this will never be a thing again as vendors might decide to ban accounts that use a client other than their own and people would fear losing their social circles in the vendor locked in services. That is, outside of IRC. One can still use E2EE in IRC.


This may be due to policy activism on the side of governments.

I think asking service providers to use more E2EE is not the right approach. Governments' intelligence agencies do not shy away from infiltrating commercial vendors (from Yahoo! to the infamous Crypto AG - https://en.wikipedia.org/wiki/Crypto_AG).

1. People should diversify across many, in particular smaller, platforms, so that surveillance cannot be done as it becomes a scale and long-tail problem.

2. People should use P2P E2EE protocols and GPG-encrypted emails. The single biggest step forward would be easier ways to set up public-key encrypted email as part of e.g. Mozilla Thunderbird and other clients. But in parallel an attitude change is needed, because the receiving end needs to know how to open an encrypted email.


> What I find intriguing is that E2EE was significantly more common long ago than it is today.

This is absurd. Today a large fraction of the world's population is using E2EE via WhatsApp.


Today a large fraction of the world's population is using E2EE via WhatsApp.

That is a good example of the problem I am describing. People are using E2EE created, deployed and maintained by WhatsApp, in WhatsApp. That is a problem. The E2EE in WhatsApp is not truly E2EE if it is maintained by the very people providing the service, in my unwavering opinion. True E2EE is entirely outside of the service transport that messages are traversing, meaning that FB could not possibly intercept the messages even if their livelihood depended on it. Today people have to just trust that FB is not targeting people with custom intercept code or code that otherwise precludes E2EE for specific messages or recipients. That is what I call a pinky promise, sometimes also referred to as a Pinky Swear [1].

I follow the logic of, people will do what people can do. If the application can be monkeyed with, it will be. Message encryption must be entirely outside of the purview of the application. Even OTR was somewhat at risk of interception. That is why I would have expected that by today this would have been a solved problem and highly evolved.

[1] - https://en.wikipedia.org/wiki/Pinky_swear


There are a few reasons why I think it has to be at the app itself.

In order to be actually secure, all conversations must be encrypted, without exception.

OTR is one method of encrypting a text channel, but it isn't the only method. For example, using PGP over text messages is also a plugin for Pidgin. Competing standards mean your Venn diagram of people and chat protocols now gets an entirely new axis of encryption method.

Metadata is data. Without seeing the message content, it is still valuable to see who is talking to whom and when.

There are always tradeoffs. While OTR may be more verifiably secure, its difficulty hinders adoption. A balance has to be reached between ease of use and security. If it is easy to get it wrong, then people will have a false sense of security. That is strictly worse than no actual security.


There are a few reasons why I think it has to be at the app itself.

I agree it needs to be in an app, just not the app that is created by the service the person is using. Missing today is a universal chat app that can speak to all the services using standard chat protocols and standard authentication mechanisms. All the popular apps today appear to be highly proprietary and in some cases the vendor will even state that using an unapproved client is forbidden.


And yet WhatsApp does have end-to-end encryption and everyone in my country is using it. Not perfect but it's demonstrably an improvement over everything else that came before. Courts were unable to compel WhatsApp to reveal messages.


None of what you lament has been lost. OTR still exists and is used by some folks. There are still OTR supporting clients that also support the current proprietary messaging systems like Facebook. The only thing that was lost is that most people moved away from using Free Software for communication and most new people were born into and grew up within proprietary communication ecosystems.

Please note that OTRv3 uses significantly outdated encryption. There is OTRv4 but it isn't finished yet.

https://bugs.otr.im/otrv4/otrv4 https://github.com/otrv4/otrv4/


ICQ and the like threatened to close your account when you used OTR.


As I understand it, if we had true end-to-end encryption, I would have to make sure I kept a set of keys, copied them between every computer and phone I used for chatting, and if I lost those keys I'd lose all my messages?

Honestly, for most people I don't think that's functionality they would want, at least without us getting much better at interfaces and usability. Standard ways of storing keys, for example in a password manager, would be a good start (does this already exist and I've missed it?)


> if I lost those keys I'd lose all my messages?

Not necessarily, that just happens with bad implementations (i.e. most of them, sigh).

If you get a confidential letter in the physical world, you open it, read it, and then store it in a safe, or a locked drawer, correct?

The software world chose to "re-seal" the letter in its envelope instead. So if you lose your key, you lose access to the letter. The proper way to implement this would be to store the decrypted contents in the software equivalent of a lockable drawer, e.g. an encrypted disk, folder or whatever. Which could (and should) have sensible fallbacks to retrieve its content.


Most people don't have encrypted drives with fallbacks to retrieve its contents (whatever that means).


Not having encrypted drives is some kind of misfeature nowadays, especially for mobile devices.

Fallbacks can be key escrow (i.e. put a printout or a physical key into a sealed envelope and deposit it with a family member, friend, or notary), and backups, encrypted with a different key (or more than one).

https://en.wikipedia.org/wiki/Linux_Unified_Key_Setup for example allows the use of multiple keys, so a backup could use the same (primary) key as your drive and some secondary key(s) to access the backup if the primary key is lost somehow. As I mentioned in another comment here, keys should have been physical features for a long time, but hardware vendors would have had to standardize on a general implementation, and as we know, they all like to "standardize" on exactly their own way of doing things.


> and if I lost those keys I'd lose all my messages?

Only if the developers of the software insist on encryption at rest in addition to transit encryption.

For example, you don’t lose the ability to open files you have already downloaded via HTTPS just because the client or server certificate later expires.


Using a password manager in a sensible manner already goes a long way! Most synchronized E2EE services only ask for a master passphrase to encrypt your keys before storing them server side (e.g. Bitwarden, Keybase, ProtonMail). Then you only need your password manager to recall the passphrase when synchronizing a new device!

People need to learn again how the devices in their pockets work and the risks of not doing things properly!
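That model can be sketched in a few lines. This is a toy illustration, not any real service's scheme: the wrapping key is derived from the master passphrase with PBKDF2, and the XOR "wrap" stands in for the authenticated encryption (and much higher KDF cost) a real service would use.

```python
import hashlib
import os

def derive_wrapping_key(passphrase: str, salt: bytes) -> bytes:
    # Stretch the master passphrase into a 32-byte wrapping key.
    # Real services use far higher iteration counts or Argon2.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def wrap(secret_key: bytes, wrapping_key: bytes) -> bytes:
    # Toy XOR "encryption" for illustration only; use an AEAD in practice.
    assert len(secret_key) == len(wrapping_key)
    return bytes(a ^ b for a, b in zip(secret_key, wrapping_key))

salt = os.urandom(16)
device_key = os.urandom(32)  # the E2EE identity key to protect
wk = derive_wrapping_key("correct horse battery staple", salt)
blob = wrap(device_key, wk)  # this wrapped blob is what gets stored server side

# A new device that knows the passphrase (and fetches the salt) recovers the key:
recovered = wrap(blob, derive_wrapping_key("correct horse battery staple", salt))
assert recovered == device_key
```

The point is that the server only ever sees `salt` and `blob`; without the passphrase (which stays in your head or password manager) the stored keys are opaque.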


IMHO, for most people the best interface would be a physical key (with software keys on it). That is, something like a smartcard or YubiKey, and even a way of obtaining multiple copies, as with physical keys.

My employer rolled out a smartcard-based PKI about 15 years ago, where you may own more than one card (optionally in SIM-card size for USB tokens) so, for example, you can have one in the office and another in your home office. Works quite well (modulo mobile) for now about 30K employees. And, combined with RFID on the cards, you can call elevators, open doors, pay for food, etc.

Now if hardware manufacturers had standardized on appropriate hardware (chassis, keyboards, smart devices…) some 20 years ago, instead of trying (and failing) all kinds of software "solutions" … and thinner and thinner hardware, so no such cards or sticks would fit anymore …


The difference is: if you lose your smartcard, it's pretty straightforward for your employer to verify your identity and issue a new one.

How do I do that with WhatsApp? We all know how well these tech giants react if you get permanently locked out of an account, no matter whose fault it was.

And even if I could get a new key, I would have to explain to all my contacts why my key changed and they would have to re-verify it over a secure channel. It doesn't scale.

Now you might say: just keep multiple copies, but there will be cases where people lose all copies, and depending on how much of our digital lives will be tied to these keys, we will need a secure recovery plan for that.
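The re-verification burden exists because the "safety number" both parties compare is derived from the identity keys themselves, so a new key necessarily means a new number. A hash-based sketch of the idea (Signal's actual construction uses iterated hashing and a different encoding; this only illustrates the property):

```python
import hashlib

def safety_number(key_a: bytes, key_b: bytes) -> str:
    # Sort the keys so both parties compute the same number regardless of order.
    digest = hashlib.sha256(b"".join(sorted([key_a, key_b]))).hexdigest()
    # Render the digest as groups of digits, loosely like Signal's 60-digit display.
    digits = str(int(digest, 16))[:30]
    return " ".join(digits[i:i + 5] for i in range(0, 30, 5))

alice_old, alice_new, bob = b"A" * 32, b"a" * 32, b"B" * 32

# Both sides see the same number...
assert safety_number(alice_old, bob) == safety_number(bob, alice_old)
# ...and any key replacement changes it, which is exactly what triggers
# the "security code changed" warning and the need to re-verify.
assert safety_number(alice_old, bob) != safety_number(alice_new, bob)
```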


The secret bits can be protected by a strong passphrase (which you can print out) or kept in a hardware device that also does enough of the cryptography to allow the secrets to stay on the hardware device. After that the problem becomes one of backup. You can scatter your encrypted keys all over the internet if you want (or just on a reliable server somewhere). You can have a bunch of hardware devices stored in various safe places and protected in various ways.

You often hear advice that comes down to keeping, say, your PGP keys in one secure place, but that is a terrible idea... E2EE needs a good backup approach. If your system does not provide that in an easy-to-use way for the user then you have a bad system. E2EE is hard, but it doesn't help to leave all the hardness for the user.
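One way to "scatter" a key without any single location holding usable material is secret sharing. A toy 2-of-2 XOR split (a real deployment would likely use Shamir's scheme so that any k of n shares suffice):

```python
import os

def split_key(secret: bytes) -> tuple[bytes, bytes]:
    # Each share alone is indistinguishable from random noise;
    # both shares are needed to reconstruct the secret.
    pad = os.urandom(len(secret))
    share = bytes(a ^ b for a, b in zip(secret, pad))
    return pad, share

def recover_key(pad: bytes, share: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(pad, share))

key = os.urandom(32)
s1, s2 = split_key(key)  # e.g. one share to a friend, one in a bank safe
assert recover_key(s1, s2) == key
```

Losing one share still loses the key here; the k-of-n generalization is what makes this a practical recovery plan rather than just a different single point of failure.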


> As I understand it, if we had true end-to-end encryption, I would have to make sure I kept a set of keys, copied them between every computer and phone I used for chatting, and if I lost those keys I'd lose all my messages?

Unless you made an unencrypted backup, then yes, that's true and that is the reason why Telegram decided against E2EE by default. According to them their users prefer easy cloud access to their messages over security.


> According to them their users prefer easy cloud access to their messages over security.

And likewise with e-mail, PGP et al. break some features that rely on your mail server having access to the full message contents, like content-based message filtering (spam or otherwise) and server-side search.


How do you know if the Signal client running on your phone right now doesn't include a backdoor? Sure it's open source. But how do you know how it was compiled?

What if someone changed the open source before shipping it to the app store?


This is called "Reproducible Builds".

https://signal.org/blog/reproducible-android/


Reproducible builds are for developers. As a user I didn't build the app on my phone.

I have a phone with Signal on it. Tell me what I should do to verify it's running the open source Signal code.


You should check out Session. Their CTO apparently uses his PGP key to sign every release https://twitter.com/session_app/status/1514108746854985730


If you, as a user, are concerned about reproducibility, you are no longer an average user. Thus, if you want this extra security, you can be expected to check the APK on your phone.


Not a perfect chain of custody, but you could report to VirusTotal (virustotal.com) and compare in a sandbox:

https://play.google.com/store/apps/details?id=com.funnycat.v...


Maybe you could figure this out yourself, and share your findings, rather than demanding answers from others?


Reproducible builds benefit the user by allowing independent checks of the software.
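In practice the independent check boils down to comparing digests: anyone can rebuild from source and confirm the distributed binary hashes to the same value. A sketch of that comparison (the file names are hypothetical, and Signal's real process additionally strips signing metadata before diffing, which this omits):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Normally these would be read from files, e.g.:
#   my_build  = open("signal-built-from-source.apk", "rb").read()
#   published = open("signal-from-app-store.apk", "rb").read()
my_build = b"example artifact bytes"
published = b"example artifact bytes"

if sha256_of(my_build) == sha256_of(published):
    print("builds match: the distributed binary corresponds to this source")
else:
    print("MISMATCH: the distributed binary was not built from this source")
```

Even if most users never run this, it only takes a few independent parties publishing mismatches to make tampering detectable, which is the benefit to everyone else.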


That page says:

> the Signal Android codebase includes some native shared libraries that we employ for voice calls (WebRTC, etc). At the time this native code was added, there was no Gradle NDK support yet, so the shared libraries aren’t compiled with the project build.

Also, assuming you trust the client, how to tell if the Signal server is running the published code, especially given Signal's track record of (not) publishing its source code?

https://linuxreviews.org/Signal_Appears_To_Have_Abandoned_Th...


The Signal server is explicitly untrusted in the Signal threat model, as it must be, since it is based in a country (like any other country) with laws that can be used to compel actions by the server's owners. They publish legal orders they receive and their responses.


A related project that is also necessary for this, Bootstrappable Builds:

https://bootstrappable.org/

Otherwise somewhere in the chain you are relying on binaries of unknown provenance.


How do you know that the AES instruction set on your device's processor doesn't include a backdoor? Sure, the algorithm is public, but how do you know how it was implemented?


AES takes a 16-byte block and encrypts it to a 16-byte block, so there is no good place to hide extra data. Even a single bit that did not meet the AES spec for the key in use would produce complete garbage at decryption. So attempts to leak the key would at least leave a mark.
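AES itself isn't in Python's standard library, but the property described, that a single wrong key bit yields unrelated garbage rather than an almost-correct plaintext, can be illustrated with a toy hash-derived keystream (this is not AES and not secure; it only demonstrates key avalanche):

```python
import hashlib

def toy_keystream_crypt(key: bytes, data: bytes) -> bytes:
    # Derive a keystream from the key and a counter, then XOR it over the data.
    # Encryption and decryption are the same operation.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key = b"K" * 16
wrong_key = bytes([key[0] ^ 0x01]) + key[1:]  # same key with one bit flipped

ct = toy_keystream_crypt(key, b"attack at dawn")
assert toy_keystream_crypt(key, ct) == b"attack at dawn"   # correct key: plaintext
assert toy_keystream_crypt(wrong_key, ct) != b"attack at dawn"  # one bit off: garbage
```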


Unless it’s a timing based leak..


I would say it’s the usual.

If someone with skills X (where depending on your knowledge and precautions, X can range from script kiddie to nation state) is after you specifically, you can only make their job harder, but unless you are very serious about security, you’ll probably get pwned.

If you want general security, you can probably take it as given that someone checked the Signal build to be the one that the source is available for, and that no one intercepted just your download. But you still have to take some parts on faith, always, unless you build your own CPU and continue from there.



Kind of meaningless if you can't trust the software running on your device though, since it could be scanning locally or relaying to remote services.


This is an instance of the trope "if you can't solve everything, you shouldn't solve anything".

It is fallacious because you'll never get there if you're not allowed to make incremental advances.


Exactly. Demand open source for every encryption app, or at least those offered to the public en masse.

It's not enough that a FOSS alternative _exists_; it needs to be the case that closed-source encryption is not considered as an actual encryption "end".


So run free software?


Any free software will have dozens or even hundreds of transitive dependencies.

It literally isn't possible for an ordinary person to audit all code.

At some point you have to blindly trust.


There is a huge difference between you alone trusting a piece of software and the whole community verifying it at random.


Are you confident enough to audit the free software yourself - or pushing the trust back to someone else?


Our modern society couldn't exist without some trust, but there are huge differences in types of trust and the trustee's underlying motivations.

Trusting the community to audit is like trusting the scientific method. Anyone can find and point out a flaw, which can then be verified by everyone. That's an idyllic description, and the process is quite imperfect, but it's the best we've got.

Meanwhile, trusting a surveillance company to self police is like trusting a quack medicine healer.


It isn't enough to run free software, all your contacts need to do that as well. It is the old gmail problem highlighted by Mako Hill:

https://mako.cc/copyrighteous/google-has-most-of-my-email-be...


You are right: You also need to promote free software among your friends and help it to improve.


Like signal that still refuses to put their client on fdroid?


Like Element which does publish its client on F-Droid: https://f-droid.org/en/packages/im.vector.app/

(Yes, F-Droid availability is a very good cutoff, I agree.)


I do not consider Signal trustworthy for such reasons.


Briar Project appear to be a good E2EE messaging app.

https://briarproject.org/how-it-works/


Which is of course entirely incompatible with everything else. We have tons of good E2EE messaging apps, all doing their own thing...


How would you suggest solving that? One client that supports most of them?


It's mind-blowing that E2EE is not a standard. I guess the equivalent is looking back and realising cars and homes did not have locks at one stage.


Given how everyone uses TLS nowadays, a better analogy would be for the dealership to retain a copy of the key to your car after you buy it. Which would probably upset people but isn't as bad as having no lock at all.


The analogy breaks down a little here.

We don’t want the dealership to keep the key, but we do want them to give us a new one if we lose it.

So afaik we trust the manufacturer to keep the info, and to keep sufficient logs that anyone that abuses it to steal cars can be caught.

I don’t think the way this works can be extended to communications apps though.


Mail and GPG. For IMs, Tox. Or Jabber+OMEMO.


Why not jabber+gpg?


Lacks PFS and doesn't necessarily support groups.


Which is why many of us are firmly against RCS.

End to end encryption is critically important and no messaging standard should exist that doesn't include it.


RCS does include end-to-end encryption though.

https://support.google.com/messages/answer/10262381


No, it does not. Google’s proprietary implementation implements Google’s proprietary encryption system. RCS does not.


While this is currently only in Google's and Samsung's Messages apps, it's basically just the Signal protocol implemented as an RCS extension. It will most likely end up formalised in the next standard release of RCS.


> It will most likely end up formalised in the next standard release of RCS

Based on what evidence.

If it hasn't been added after all these years it doesn't seem likely it will be added soon.



