Only if the architecture itself is unsafe and does not enforce timeline consistency.
It's relatively easy (on the scale of programming a black-box automaton that runs our universe) to write a completely asymmetrical n-threaded system where some threads run millions of in-simulation years ahead of the others, and yet the "final" timeline, as output in a timestamp-sorted log, would be identical to the one produced by a single-threaded, frame-synced system.
It's not even all that hard by modern standards; software engineers do it all the time.
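To make that concrete, here's a minimal Python sketch under a simplifying assumption: the regions are causally isolated and each evolves deterministically (the names like simulate_region and the toy LCG "physics" are purely illustrative, not any real engine's API). Three regions each get their own thread, one of them a hundred times "older" than another, and the timestamp-sorted merge of their logs comes out identical to a single-threaded, frame-synced run.

```python
import threading
import heapq

def step(state):
    # Deterministic toy "physics": a 64-bit linear congruential update.
    return (state * 6364136223846793005 + 1442695040888963407) % 2**64

def simulate_region(region_id, steps, seed):
    """Evolve one causally isolated region, logging (sim_time, region, state)."""
    events, state = [], seed
    for t in range(steps):
        state = step(state)
        events.append((t, region_id, state))
    return events

def run_threaded(regions):
    """One thread per region; threads race arbitrarily far ahead of each other."""
    results = {}
    def worker(rid, steps, seed):
        results[rid] = simulate_region(rid, steps, seed)
    threads = [threading.Thread(target=worker, args=r) for r in regions]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    # Merge the per-region logs into a single timestamp-sorted timeline.
    return list(heapq.merge(*(results[rid] for rid, _, _ in regions)))

def run_frame_synced(regions):
    """Single-threaded reference: advance every region one tick per frame."""
    events = []
    states = {rid: seed for rid, _, seed in regions}
    for t in range(max(s for _, s, _ in regions)):
        for rid, region_steps, _ in regions:
            if t < region_steps:
                states[rid] = step(states[rid])
                events.append((t, rid, states[rid]))
    return events

if __name__ == "__main__":
    # Three regions with wildly different "ages" (tick counts).
    regions = [(0, 1000, 1), (1, 5000, 2), (2, 50, 3)]
    assert run_threaded(regions) == run_frame_synced(regions)
    print("timestamp-sorted logs are identical")
```

The whole trick is determinism per region plus no cross-region causal edges; when regions do interact, real parallel discrete-event simulators use conservative or optimistic synchronization to keep exactly this property, but that's beyond a toy sketch.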
So no, the analogy wasn't assuming a single-threaded implementation. It was merely assuming that the system was designed for a self-consistent (and probably loopless) timeline, and there is strong evidence that this is a property of the kind of spacetime we live in. One such piece of evidence is the ever-mounting empirical support for c as the hard limit on information transfer.
In an unsafe, approximation-riddled system where causality graphs are only searched for modifications up to a bounded depth or item count, yes, some discrepancies could be noticed. However, it's empirically evident that this bound, if it exists, is far greater than anything we've ever been able to calculate.
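For what such a bounded search would look like, here's a hypothetical sketch (the graph shape, affected_nodes, and max_depth are all made up for illustration): the engine propagates a modification breadth-first through a causal graph but gives up past a fixed depth, so anything further downstream never gets updated, and that gap is exactly the kind of discrepancy an observer could notice.

```python
from collections import deque

def affected_nodes(graph, source, max_depth=None):
    """Breadth-first walk of a causal graph from a modified node.

    graph maps node -> list of causally downstream nodes. If max_depth is
    given, propagation is cut off there, the way an "unsafe" engine might
    bound its update search to save work.
    """
    seen = {source}
    frontier = deque([(source, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if max_depth is not None and depth >= max_depth:
            continue  # too deep: this engine stops propagating here
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return seen

if __name__ == "__main__":
    # Hypothetical causal chain a -> b -> c -> d -> e.
    chain = {"a": ["b"], "b": ["c"], "c": ["d"], "d": ["e"]}
    exact = affected_nodes(chain, "a")               # full propagation
    cheap = affected_nodes(chain, "a", max_depth=2)  # bounded search
    print(exact - cheap)  # the downstream effects the bounded engine never updates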
The Universe, whatever it runs on, routinely calculates graphs to perfect accuracy that would take millennia of the combined computing power on Earth to solve even closely enough that our current instrumentation couldn't tell the difference in the result.