
If you can get your binaries bit-for-bit identical, any idiot (read: even a computer) can tell they are the same.

If you have to make exceptions for some metadata, all of a sudden there's judgement and smarts involved. And judgement can go wrong.
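To illustrate: the "even a computer" check is nothing more than comparing hashes of the two artifacts. A minimal sketch in Python (the file paths are made up):

  import hashlib

  def sha256(path):
      # Hash the file in chunks so large binaries don't need to fit in memory.
      h = hashlib.sha256()
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(1 << 20), b""):
              h.update(chunk)
      return h.hexdigest()

  # Hypothetical outputs of two independent builds of the same source.
  if sha256("build-a/app.bin") == sha256("build-b/app.bin"):
      print("bit-for-bit identical")
  else:
      print("differs: investigate")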



When a lot of software is mostly reproducible, it may be easier to make the matching logic smarter than to chase down every possible build variation in every single application on Earth.
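As a concrete (hypothetical) example of what "smarter matching" could look like: compare two zip-based artifacts by member contents while deliberately ignoring per-entry timestamps. Deciding which fields count as irrelevant is exactly the judgement being discussed.

  import zipfile

  def content_map(path):
      # Map each archive member to its raw bytes; per-entry timestamps and
      # other metadata are left out of the comparison on purpose.
      with zipfile.ZipFile(path) as z:
          return {info.filename: z.read(info.filename) for info in z.infolist()}

  # Hypothetical artifacts from two builds of the same source.
  same = content_map("build-a/app.jar") == content_map("build-b/app.jar")
  print("equivalent ignoring timestamps" if same else "contents differ")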

>And judgement can go wrong.

Which is why it is important to have a good design and threat model.


In practice, chasing down the build variations has proven workable, with a good track record. See e.g. https://reproducible-builds.org/
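A typical example of chasing down one such variation: embedded timestamps can be pinned by exporting SOURCE_DATE_EPOCH (a convention documented by reproducible-builds.org) before building. A sketch, with a placeholder build command:

  import os
  import subprocess

  # Pin any build-time timestamps to a fixed value so they can't cause diffs.
  env = dict(os.environ, SOURCE_DATE_EPOCH="1700000000")
  subprocess.run(["make", "all"], env=env, check=True)  # placeholder build step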


>workable

That page has progress reports that are 8.5 years old, and the project is not close to being done. There are still thousands of Debian packages to go.


Definitely, and they only have a limited pool of manpower.

However, at least the problems with this approach are all in one direction: differing bits might be due to an actual attack or to weird compiler issues, but if you get the bits to be identical, you know that no funny business is going on.

The proposal to have more complicated rules doesn't have this upside. And you still need to make sure that the things that actually matter are the same (like the instructions in your binaries), even if you have successfully separated out the irrelevant metadata changes.



