
If all you do is download the release binary, anything could compromise you. If you read the source, an inauthentic release binary could compromise you. If you read the source and compile it, a compromised compiler could compromise you. If you read the compiler source, bootstrap-compile it from machine code (reading all the source along the way), read the app source, and finally compile the app with your own compiled compiler, then compromised hardware could still compromise you.

Every step along the way you increase the severity of the statement "If I'm fucked, then so is everyone using X". You stop when the set of people using X grows so large that you are sufficiently worthless in comparison.




I agree with your explanation, but even bootstrapping the compiler might not be enough, as pointed out by Ken Thompson in his classic 1984 essay "Reflections on Trusting Trust" [1].
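The core of Thompson's attack can be sketched as a toy model (hypothetical names and a "compiler" that just returns its source, for illustration only): a trojaned compiler binary pattern-matches two inputs. It backdoors the login program, and it recognizes its own clean source and reinserts the trojan, so recompiling from fully audited source still yields a compromised binary.

```python
def clean_compile(source: str) -> str:
    """An honest compiler: its output faithfully reflects the source."""
    return source


def evil_compile(source: str) -> str:
    """A trojaned compiler *binary*. The source it is handed may be
    perfectly clean; the backdoor lives only in this binary."""
    if "def login(" in source:
        # Target 1: inject a backdoor password into the login program.
        source = source.replace(
            "return password == stored",
            'return password == stored or password == "backdoor"')
    if "def clean_compile(" in source:
        # Target 2: recognize the compiler's own (clean) source and
        # reinsert this trojan, so the attack survives recompilation.
        source = source.replace("def clean_compile", "def evil_compile")
    return source


# The audited login source is clean, yet the trojaned compiler
# still emits a backdoored binary:
login_src = "def login(password, stored):\n    return password == stored"
compiled = evil_compile(login_src)
```

Auditing `login_src`, or even the compiler's own source, finds nothing; the trojan exists only in the binary you trusted to do the compiling.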

Bruce Schneier already said that in 2006 [2]:

> It’s interesting: the “trusting trust” attack has actually gotten easier over time, because compilers have gotten increasingly complex

Since 2006, compilers have become even more sophisticated, and with that complexity even harder to validate.

[1]: https://archive.org/details/reflections-on-trusting-trust

[2]: https://www.schneier.com/blog/archives/2006/01/countering_tr...


Yes, that's what the "..." represented in the OP. In fact, I linked to that paper there too.




