
This denial is a lot stronger than Page's, which was full of weasel words. Unless Zunger is outright lying, which seems unlikely (forgive my naiveté), it will be interesting to learn how PRISM actually works. But I will say that his faith in his company sounds misplaced, given how weak Google's more official denials are.





He hints at a couple of reasons for his confidence in his post.

One of his assumptions was that people would notice surreptitiously installed hardware or software doing the monitoring. I don't think this is unreasonable. Hoovering up all the private data in Google is bound to be a big job, regardless of whether you are sieving it on site or transferring it off site. Even if it would only take a few people to install and manage, everyone else would probably be tripping over it constantly. Accessing all of Google's data means you'd be tied into nearly every system.

It's possible to keep projects secret inside a large company like Google, but only if exceedingly few people know about them and have good reasons not to tell anyone else. A system with as much surface area as you'd need to hoover up all of Google's data would be vulnerable to almost every engineer and datacenter worker accidentally running across it while debugging other problems. Even -- especially? -- if they didn't immediately realize its significance, they'd probably ask their coworkers about this weird extra code/jobs/hardware, and pretty soon everyone would have heard about it. Once everyone inside Google had heard of it, I think you'd have a very hard time keeping the secret from leaking into the outside world.


Right, that mostly paraphrases his arguments. But it doesn't convincingly eliminate a few possibilities for how this works.

One possibility: Google obviously has some capacity to honor search warrants and NSLs. And presumably that involves some technical artifacts somewhere: admin-level API access to data and some sort of external endpoint through which the government can actually make those requests. So that's all stuff we can confidently say is already there humming along happily, whether used for nefarious purposes or not.

OK, so given that those exist, how much volume do they support? How hard would it be to modify them to bypass the scrutiny process, or to give the NSA access to those endpoints rather than just domestic law enforcement? In other words, these would just be changes to the process, totally invisible to anyone without explicit access to it. It might not involve any weird hardware at all, and could operate with very few people in the know.
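
To make that concrete, here's a minimal sketch (in Python, with every name invented for illustration -- this reflects nothing about Google's actual systems) of the kind of request path being imagined, and how a bypass could be a one-line process change rather than new hardware:

    # Hypothetical lawful-access request path. All names are made up.
    from dataclasses import dataclass

    @dataclass
    class AccessRequest:
        requester: str       # e.g. "fbi-field-office" or "nsa"
        target_account: str
        legal_basis: str     # e.g. "warrant-2013-1234"

    def legal_review(req: AccessRequest) -> bool:
        """Stand-in for the human/legal scrutiny process."""
        return req.legal_basis.startswith("warrant-")

    def fetch_account_data(account: str) -> str:
        return f"<data for {account}>"   # placeholder for the admin-level API

    def handle_request(req: AccessRequest) -> str:
        # The hypothetical bypass: one extra clause, no new hardware,
        # no new jobs for a datacenter tech to trip over.
        if req.requester == "nsa" or legal_review(req):
            return fetch_account_data(req.target_account)
        raise PermissionError("rejected by legal review")

    print(handle_request(AccessRequest("nsa", "alice@example.com", "prism")))

Nobody debugging an unrelated outage trips over a clause like that; you'd have to be reading this exact code path, which is a far smaller audience than "everyone in the datacenter."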

Another possibility: Google handed over its TLS keys and just let the taps happen upstream.
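
For context on why that alone would suffice: in 2013 a lot of TLS traffic still used static RSA key exchange rather than forward-secret ciphersuites, and in that mode the server's long-lived private key decrypts passively recorded sessions after the fact. A rough sketch of the mechanics, using a freshly generated key as a stand-in for a real server key (requires the third-party cryptography package):

    import os
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Stand-in for a server's long-lived TLS key.
    server_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # In an RSA-key-exchange handshake, the client encrypts the premaster
    # secret to the server's public key; a wiretap sees only the ciphertext.
    premaster_secret = os.urandom(48)
    on_the_wire = server_key.public_key().encrypt(
        premaster_secret, padding.PKCS1v15()
    )

    # Whoever holds the private key -- the server, or whoever it was handed
    # to -- recovers the premaster secret from the capture, and from it all
    # the session keys.
    recovered = server_key.decrypt(on_the_wire, padding.PKCS1v15())
    assert recovered == premaster_secret

With ephemeral Diffie-Hellman ciphersuites this fails, which is one reason forward secrecy got so much more attention after 2013.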

That's why the confidence of a senior person that there's no fishy hardware running around makes the question of how PRISM works more interesting. But it certainly doesn't make the project impossible.



I don't think I can guess how well whatever the search-warrant APIs are would scale to "look at everyone" without speculating overly much about the design of the system. I would argue that, regardless of the access method, if someone is reading the terabytes or petabytes or yottabytes or however many bytes of email there are, you're once again back to a huge amount of network or CPU or whatever utilization. E.g., even if the database access is permitted and off the record, the database admin should still be wondering why the load is so high, as if everyone in the world were reading all of their email simultaneously. And then you're back to gossip and everyone internally knowing about it.
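
A back-of-envelope calculation makes the point; every number below is a guess for illustration, not anything I know about Google:

    # Assume ~10 PB of stored mail and a full pass once a month.
    total_mail_bytes = 10e15
    scan_window_s = 30 * 24 * 3600
    print(f"{total_mail_bytes / scan_window_s / 1e9:.1f} GB/s")  # ~3.9 GB/s

Roughly 4 GB/s of sustained extra reads, around the clock, for a month -- exactly the kind of anomalous load a storage or network on-call would eventually chase down.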

The TLS keys are an interesting angle. Are TLS keys typically one (small set) per site, or would each server typically generate its own unique keys? Even in the latter case, the surface area within Google might be small enough that no one would accidentally stumble upon the handover.
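
That question is partly answerable from the outside, since the public half of the key ships in the certificate. A quick empirical check (hostnames are just examples; needs network access):

    import hashlib
    import ssl

    def cert_sha256(host: str, port: int = 443) -> str:
        pem = ssl.get_server_certificate((host, port))
        der = ssl.PEM_cert_to_DER_cert(pem)
        return hashlib.sha256(der).hexdigest()

    for host in ("www.google.com", "mail.google.com"):
        print(host, cert_sha256(host))

In practice a site Google's size terminates TLS at frontends holding a relatively small set of keys, so a handover could plausibly involve only the handful of people who manage those.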



