When a lot of software is already mostly reproducible, it may be easier to make the matching logic smarter than to chase down every possible build variation in every single application on Earth.
>And judgement can go wrong.
Which is why it is important to have a good design and threat model.
Definitely, and they only have a limited pool of manpower.
However, at least the problems with this approach are all in one direction: differing bits might be due to an actual attack or just weird compiler issues, but if you get the bits to be identical, you know that no funny business is going on.
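To make the strict check concrete, here's a minimal sketch in Python. The artifact paths and names are hypothetical; in practice you'd compare against hashes published by independent rebuilders:

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash an artifact in chunks so large binaries don't have to fit in RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the published artifact and an independent rebuild.
official = sha256(Path("dist/app-1.0.tar.gz"))
rebuilt = sha256(Path("rebuild/app-1.0.tar.gz"))

# Strict reproducibility: any differing bit fails the check. No judgement
# involved; a mismatch is always a red flag to go investigate.
print("REPRODUCIBLE" if official == rebuilt else "MISMATCH")
```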
The proposal to have more complicated rules doesn't have this upside. And you still need to make sure that the things that actually matter (like the instructions in your binaries) are the same, even if you successfully separate out the irrelevant metadata changes.
If you have to make exceptions for some metadata, all of a sudden there's judgement and smarts involved. And judgement can go wrong.
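For contrast, a minimal sketch of the "smarter" matching, assuming ZIP artifacts and the same hypothetical paths. Note that the ignore list itself is exactly where the judgement creeps in:

```python
import zipfile

def content_digest(path: str) -> dict[str, bytes]:
    """Map each archive member to its raw bytes, deliberately ignoring
    timestamps and other header metadata. Every field we choose to ignore
    is a judgement call: get one wrong and an attacker gains a hiding spot."""
    with zipfile.ZipFile(path) as z:
        return {info.filename: z.read(info) for info in z.infolist()}

# Hypothetical archives: the published build vs. an independent rebuild.
a = content_digest("dist/app-1.0.zip")
b = content_digest("rebuild/app-1.0.zip")

# Member contents must still match byte-for-byte, even though we let
# header timestamps differ.
print("MATCH (modulo ignored metadata)" if a == b else "MISMATCH")
```

Every header field added to that ignore list is a human decision about what "doesn't matter", which is the judgement the parent comment is worried about.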