Hacker News

That's an even bigger problem.

Exhaustively testing every possible x86 instruction is out of the question. Even enumerating every possible instruction scenario (operands, pipeline state, surrounding instructions) would be all but impossible.
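To put a rough number on "out of the question" (my own back-of-envelope, not a figure from the comment): an x86 instruction encoding can be up to 15 bytes long, so the raw space of byte sequences a decoder has to cope with is astronomically large, before you even consider machine state.

```python
# Back-of-envelope: count all byte sequences up to the x86 architectural
# limit of 15 bytes per instruction. Most are invalid encodings, but a
# decoder (and any verifier) still has to handle them all.
MAX_INSTR_BYTES = 15

raw_encodings = sum(256 ** n for n in range(1, MAX_INSTR_BYTES + 1))

# The total is a little over 2^120 byte sequences.
print(f"byte sequences up to {MAX_INSTR_BYTES} bytes: "
      f"~2^{raw_encodings.bit_length() - 1}")
```

And that is only the encoding space; multiply by cache, TLB, and pipeline state and exhaustive testing is hopeless, which is why formal methods (or nothing) is the real choice.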

Caching is already a very hard problem. Add in neural-network-style predictors to control those caches and even the designers aren't sure what's actually happening at any given moment, let alone able to formally verify it.

Large out-of-order chips use millions of lines of HDL to implement their algorithms in hardware. Their designers do lots of unit testing and formally verify some of the smaller blocks, but the sum of the parts is not formally verified.

It already takes about half a decade to create a chip without full formal verification. With complete formal verification, we might not even have the first Pentium ready to ship yet (if ever).

While we're on the topic, very few programming language implementations are formally verified, including the ones the verification tools themselves are built with. For example, seL4 was verified using Isabelle, which is written in Standard ML and compiled with the Poly/ML compiler. I don't believe Poly/ML is formally verified. That means the Isabelle binary isn't formally verified, which means seL4 isn't formally verified either (in the strictest sense). And even if all of these were verified, there's still the "trusting trust" problem, plus the fact that the whole stack would need to run on a formally verified CPU.

I appreciate striving for formal verification, but code that is verifiable from the bottom up is a very long way from happening.


