Hacker News

It tries to be, but in practice as I say you mostly just end up with the worst of both worlds, because of perverse incentives (the software provider wants to charge a monthly figure and doesn’t want their server to be fungible).

Anyway, it’s still much more commonly a limitation than an advantage. Perhaps the most obvious example: I want to be able to search through 10GB of emails without needing to download them all (or a not-much-lighter index), because downloading it takes a long time, costs a fair bit (traffic isn’t generally free), and requires that I have that much spare space on whatever device I’m using at the time.

The fact of the matter is that E2EE in all its forms is consistently inconvenient, and double ratcheting makes it even worse than earlier non-key-cycling techniques. In some situations it may be worth the inconvenience, but I wish there were more acknowledgement of the fact that it’s seriously not the ideal goal, and I wish the innovation would head in the direction of proper and deliberately thorough decentralisation instead, because if you run your own server, you don’t need E2EE: its purpose disappears completely, leaving only the inconvenience. (Reduce it instead to just encryption at rest and transport encryption, which are still useful and entirely sufficient.)




I know I'm biased because I'm working on an E2EE todo/planning app[1], but over the years I've become convinced that E2EE apps are the future.

You have to write all the syncing code anyway if you want your app to work offline. Despite all the buzz about big data, most individuals and most companies generate only a modest amount of data. Even when you have a long tail of archived data (e.g. emails that go back 10 years), the key benefit is in having all recent data with you, available offline, and end-to-end encrypted.

Self-hosting is a lot more difficult in practice than in theory. Somebody has to make backups, maintain a server architecture with redundancy, apply security patches, etc. For hobby purposes you can get away with just not doing any of that, but for businesses and people who don't want to tinker with their own servers outsourcing all this makes a ton of sense.

I don't agree with your claim that traffic is expensive. In any case, downloading the data once and syncing subsequent deltas is still far more data-efficient than roundtripping JSON for every user action.
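To illustrate the delta-sync point: a minimal sketch of the pattern, where the client keeps a sync cursor and the server returns only records written after it. The record store, field names, and version scheme here are hypothetical, not from any particular app.

```python
# Hypothetical server-side store: records keyed by id, each stamped with a
# monotonically increasing "version" assigned on every write.
SERVER_RECORDS = {
    "a": {"version": 1, "body": "buy milk"},
    "b": {"version": 3, "body": "ship release"},
    "c": {"version": 4, "body": "write tests"},
}

def pull_deltas(since_version):
    """Return only the records changed after the client's last sync cursor."""
    changed = {rid: rec for rid, rec in SERVER_RECORDS.items()
               if rec["version"] > since_version}
    new_cursor = max((r["version"] for r in SERVER_RECORDS.values()),
                     default=since_version)
    return changed, new_cursor

# A client synced up to version 2 only pulls "b" and "c" down the wire,
# instead of re-fetching (or re-querying) everything on each action.
deltas, cursor = pull_deltas(since_version=2)
print(sorted(deltas))  # ['b', 'c']
print(cursor)          # 4
```

After the initial full download, every subsequent sync is proportional to what changed, not to the total data size.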

[1] https://thymer.com


Ok, I'll byte. Let's take a hypothetical example that might just happen.

Imagine a company has a wiki hosted by a 3rd party. I interpret E2EE as: after I submit a post, it is encrypted, the hosting company can't look at the unencrypted data, and my coworkers can decrypt the posts.

It's easy to see how you would support linking to posts, but how would you support searching for them?

Would you maintain a key-value lookup list? Would you handle semantic search? Are there E2EE algorithms that facilitate enhanced searching?


So, searching through encrypted text is a really interesting problem!

It's actually possible under some ciphers / search algorithms - if the client generates encrypted inverted index entries when submitting a post, and encrypts their (vectorized) search query using the same key, the server can compare them and return a scored list of IDs.

You do need to weaken the encryption scheme somewhat - in particular, term-frequency analysis becomes possible.
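A minimal sketch of that idea: the client derives deterministic tokens from each term with a keyed PRF (HMAC here), so the server can match an encrypted query against encrypted inverted-index entries without ever seeing plaintext terms. The key, document ids, and tokenization are all illustrative assumptions; real searchable-encryption schemes are considerably more involved.

```python
import hmac
import hashlib
from collections import defaultdict

KEY = b"client-secret-key"  # hypothetical per-user key; never sent to the server

def token(term):
    """Deterministic encrypted token for a term (HMAC-SHA256 as a keyed PRF)."""
    return hmac.new(KEY, term.lower().encode(), hashlib.sha256).hexdigest()

# --- client side: emit encrypted inverted-index entries when submitting a post ---
index = defaultdict(set)  # what the server stores: token -> set of doc ids

def submit(doc_id, text):
    for term in text.split():
        index[token(term)].add(doc_id)

submit("post-1", "deploy notes for the new server")
submit("post-2", "server maintenance window")

# --- server side: match the client's encrypted query token, never the term ---
def search(query_token):
    return sorted(index.get(query_token, set()))

print(search(token("server")))  # ['post-1', 'post-2']
```

Because the tokens are deterministic, identical terms produce identical tokens, which is exactly the term-frequency leakage mentioned above: the server learns which (encrypted) terms are common even though it can't read them.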


> Ok, I'll byte.

Whether it was intentional or not, +1 Funny.


That looks pretty cool. Just signed up; looking forward to the invite.


> the software provider wants to charge a monthly figure and doesn’t want their server to be fungible

Sounds like yet another reason to use free software and pay services for their actual services, like storage.

> I want to be able to search through 10GB of emails without needing to download them all (or a not-much-lighter index), because downloading it takes a long time, costs a fair bit (traffic isn’t generally), and requires that I have that much spare space on whatever device I’m using at the time

10GB is pretty much nothing nowadays, and once one has downloaded it then one only has to download the updates, not the whole thing from scratch. And local search is so much faster than remote search that it is incredible. Using notmuch for email is such an improvement over the Gmail web UI; it really has to be seen to be believed.

> I wish there was more acknowledgement of the fact that it’s seriously not the ideal goal

Hard disagree here. The ideal goal is not trust in a third party; the ideal goal is freedom from having to trust third parties to do more than one can verify. I can verify that a storage provider is providing me storage (I can write data, and randomly send read requests; one might even develop bandwidth-efficient proof-of-storage protocols); there is no way for me to ever be sure that a messaging provider is not reading my messages.
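The spot-check idea can be sketched simply: the client remembers a digest of each block it uploads (a table far smaller than the data), then later challenges the provider for randomly chosen blocks and verifies the hashes. The block sizes, sample count, and in-memory "provider" are assumptions for illustration, not a real proof-of-storage protocol.

```python
import hashlib
import os
import random

def digest(block):
    return hashlib.sha256(block).hexdigest()

# Client splits its data into blocks and keeps only the digests locally.
blocks = [os.urandom(32) for _ in range(100)]
local_digests = [digest(b) for b in blocks]

# Hypothetical provider: an honest one returns exactly what was stored.
provider_store = list(blocks)

def challenge(provider, i):
    """Spot-check: fetch block i from the provider and verify its hash."""
    return digest(provider[i]) == local_digests[i]

# Randomly sample a few blocks; an honest provider passes every check,
# while one that silently dropped data gets caught with high probability.
sample = random.sample(range(len(blocks)), 5)
print(all(challenge(provider_store, i) for i in sample))  # True
```

This only verifies storage, which is the point: a storage provider's behaviour is checkable in a way a messaging provider's "we don't read your messages" promise never is.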

> if you run your own server, you don’t need E2EE

One still needs encryption from one’s server to another’s. To put that another way: from one’s end to another’s. Or … end-to-end encryption.


> 10GB is pretty much nothing nowadays, and once one has downloaded it then one only has to download the updates, not the whole thing from scratch. And local search is so much faster than remote search that it is incredible. Using notmuch for email is such an improvement over the Gmail web UI; it really has to be seen to be believed.

10GB is more space than a very significant fraction of people have available on their devices (more even than some new devices have free). Or want reserved for stuff 99.9999% of which they will absolutely never access. 10GB is also considerably more traffic than many can spare without incurring multiple dollars of cost. 10GB may also take most of a day to download for a great many people. People might also just want to access their stuff on a new device, such as someone else’s computer.

Local search certainly can be faster, but it’s often not. It’ll normally use different software from what the server uses, too, since the server software is likely to be factored as a server with no direct library equivalent, and in my experience it generally produces inferior results. I know, none of this is inherent, merely typical. But back to performance: you might be surprised at just how slow storage is on cheap phones. Only a few megabytes per second is realistic, and there isn’t enough memory to cache it all either.

When talking of all this performance stuff, I feel like pointing out that I live in Australia and am used to the internet being slow purely because of latency to US-hosted sites. This definitely tips the balance a bit more in favour of local for me.

By all means, support downloading everything and being able to work offline and doing local search where possible. Please do, because it is better where feasible and is likely to lead to the software being better in other regards too. But for most people most of the time, requiring this is a bad trade-off.

> One still needs encryption from one’s server to another’s. To put that another way: from one’s end to another’s. Or … end-to-end encryption.

NO! This is NOT what is meant by end-to-end encryption. This is transport encryption. And note that E2EE doesn’t obviate transport encryption, because it still protects the metadata.


10GB might be nothing for a mobile or desktop app, but it’s a lot for a webapp unless it’s pre-downloaded.



