This is perhaps one of the few places where software developers could learn something from hardware development: in hardware description languages, everything happens in parallel by default. A large chip project has many components that typically communicate fully asynchronously over some network / bus. The tools for debugging these systems are:
- A GUI that allows one to trace all state changes down to the single bit level.
- Assertions based on linear temporal logic to verify invariants (a minimal monitor is sketched below).
- Conventions for how to model transactions at various levels of abstraction, as well as how to monitor and extract these transactions from the "wire protocol".
The article talks about all three, but with different names.
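To make the second item concrete, here is a minimal sketch in Python rather than an actual assertion language like SVA or PSL. It checks a bounded-liveness property of the shape G(req -> F[0,N] ack) over a recorded trace: every cycle where a request is raised must see an acknowledge within N cycles. The signal names (`req`/`ack`), the trace format, and the latency bound are all invented for illustration.

```python
# Sketch of an LTL-style bounded-liveness check over a recorded signal trace.
# All names and the trace format are illustrative assumptions, not any
# specific tool's API.

def check_req_ack(trace, max_latency=4):
    """Check G(req -> F[0,max_latency] ack): every cycle with req high
    must be followed by ack within max_latency cycles (inclusive)."""
    failures = []
    for i, cycle in enumerate(trace):
        if cycle["req"]:
            window = trace[i : i + max_latency + 1]
            if not any(c["ack"] for c in window):
                failures.append(i)
    return failures

# Tiny trace: one request acknowledged in time, one that never is.
trace = [
    {"req": 1, "ack": 0},
    {"req": 0, "ack": 1},   # first request acked after one cycle
    {"req": 1, "ack": 0},
    {"req": 0, "ack": 0},
    {"req": 0, "ack": 0},
    {"req": 0, "ack": 0},
    {"req": 0, "ack": 0},   # second request never acked -> violation
]

print(check_req_ack(trace))  # -> [2]
```

In a real flow the property would typically live in the testbench as an SVA/PSL assertion and be checked continuously during simulation, not replayed over a Python list after the fact.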
At the end of the day, we keep hardware as simple as possible, so it's easier to verify, and all the extra complexity goes into software, so it's easier to iterate. As a consequence, the hardware tooling is far from enough for software debugging. Software has more powerful tools, but those aren't enough either - distributed systems are a hairy problem.
I've thought about it and I don't think hardware design can map to software.
Hardware is all about individual black-box components: take some bits on the left, produce some bits on the right, and chain as many components as you like. It's "easy" to lay out and analyze as a linear flow.
Software is not a linear flow with defined states. How do you represent a syscall or an I/O operation? The minimal state of an application is its entire memory.
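A toy sketch of the contrast (Python; the stage functions and the file-reading helper are invented for this comment): in the hardware-like part, logging the bits at each stage boundary captures the complete state, so the flow is replayable bit for bit. The moment a syscall appears, the state that matters extends beyond anything the program itself can log.

```python
import os

# Hardware-like: a chain of pure bits-in/bits-out stages. Recording each
# stage's output *is* a complete trace of the system's state.
def stage_a(bits):
    return bits ^ 0b1010            # pure: depends only on its input

def stage_b(bits):
    return (bits << 1) & 0xFF       # pure: depends only on its input

def pipeline(bits):
    return stage_b(stage_a(bits))   # linear flow, trivially traceable

# Software-like: the syscall's result depends on state outside the program
# (filesystem contents, permissions, other processes), so logging the
# function's inputs no longer captures everything needed to replay it.
def not_a_pipeline(path):
    fd = os.open(path, os.O_RDONLY)
    data = os.read(fd, 16)
    os.close(fd)
    return data
```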