When 37signals starts encrypting all their data, their search tool is really gonna suck.
A backup service that just needs to move around opaque blobs can and should encrypt its data. An application that needs to react to the type and contents of the data it stores, not so much. It seems like cperciva would know this better than anyone, so the post comes across as pretty disingenuous.
I don't disagree with this at all. But the perspective that "The answer isn't to prove that they can be trusted; the answer is to ensure that their customers don't need to trust them" is worth keeping in the back of your mind, because I'm sure there are cases where that approach can be taken without breaking features.
Regardless, even if the load were higher, as it used to be before modern hardware, you are still essentially telling your customers that "speed is more important than securing your data" - which is a terrible approach to take.
TL;DR: If you are given the privilege of maintaining customer data, it's your obligation and responsibility to do so with the utmost care.
The problem isn't the CPU cycles spent on encryption; it's that the data architecture has to be designed differently, and optimizations like caching can't be used as much. Document search is especially difficult on encrypted documents.
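As a rough illustration of the search problem (my own sketch, not anything 37signals or Tarsnap actually does; the key names and storage layout are made up): one common workaround is a "blind index", where the client stores keyed hashes of search tokens so the server can match exact keywords without seeing plaintext. But anything beyond exact-match lookup - ranking, substring, fuzzy search - is lost.

    # Sketch of a "blind index" for exact-keyword search over
    # client-encrypted documents (illustrative only).
    import hmac, hashlib
    from cryptography.fernet import Fernet

    index_key = b"client-side-index-key-never-uploaded"
    f = Fernet(Fernet.generate_key())  # client-held encryption key

    def blind_token(word: str) -> str:
        # Deterministic keyed hash: the server can compare tokens for
        # equality but cannot recover, stem, or substring-match the word.
        return hmac.new(index_key, word.lower().encode(), hashlib.sha256).hexdigest()

    def store(doc_id: str, text: str, server_db: dict) -> None:
        server_db[doc_id] = {
            "blob": f.encrypt(text.encode()),            # opaque to the server
            "tokens": {blind_token(w) for w in text.split()},
        }

    def search(word: str, server_db: dict) -> list[str]:
        t = blind_token(word)
        return [doc_id for doc_id, rec in server_db.items() if t in rec["tokens"]]

    db = {}
    store("doc1", "quarterly revenue report", db)
    print(search("revenue", db))   # ['doc1']  - exact match works
    print(search("reven", db))     # []        - no substring or fuzzy matching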
Did you reply to the wrong comment? My comment mentioned nothing about speed.
Tarsnap can treat data opaquely and have the client encrypt/decrypt it; most web applications that aren't just moving data around need to access the contents in order to work.
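For contrast, here's a minimal sketch of that backup-style model (not Tarsnap's actual protocol), where the client encrypts locally and the server only ever handles opaque bytes it can't search or react to:

    # Client-side encryption: the key stays on the client, so the server
    # stores ciphertext it cannot interpret (sketch, not a real service).
    from cryptography.fernet import Fernet

    client_key = Fernet.generate_key()   # never leaves the client
    f = Fernet(client_key)

    server_storage: dict[str, bytes] = {}

    def backup(name: str, data: bytes) -> None:
        server_storage[name] = f.encrypt(data)    # server sees only ciphertext

    def restore(name: str) -> bytes:
        return f.decrypt(server_storage[name])    # only the key holder can read it

    backup("notes.txt", b"meeting at 10am")
    print(server_storage["notes.txt"][:16])       # opaque blob
    print(restore("notes.txt"))                   # b'meeting at 10am'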