From the wiki it looks like David's emulator perhaps uses interpretation, whereas the wiki says Eric's uses dynamic recompilation, and Connectix's is even faster, so maybe more optimization.
I tried to find the source code of any of them, but without success.
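None of those sources are available, but the interpretation-vs-dynamic-recompilation distinction itself is easy to show. Below is a minimal C sketch using an invented two-opcode toy ISA (nothing here is from David's, Eric's, or Connectix's actual code): an interpreter pays fetch/decode/dispatch on every instruction, while a translator decodes once up front and reruns the cheap translated form. Real dynamic recompilers go further and emit native host code per basic block, caching the result, but the amortization principle is the same.

```c
/* toy_emu.c -- a minimal sketch of the difference, using an invented
   two-opcode ISA. None of this is from any real emulator's code. */
#include <stdint.h>
#include <stdio.h>

enum { OP_ADDI, OP_HALT };                  /* hypothetical toy opcodes */
typedef struct { uint8_t op; int8_t imm; } Insn;

/* Interpreter: pays the fetch/decode/dispatch cost on every instruction,
   every time around the loop. */
static int interpret(const Insn *prog) {
    int acc = 0, pc = 0;
    for (;;) {
        const Insn *i = &prog[pc++];
        switch (i->op) {
        case OP_ADDI: acc += i->imm; break;
        case OP_HALT: return acc;
        }
    }
}

/* Decode-once translation: convert the guest program into direct handler
   pointers up front, so the hot loop skips decoding entirely. A real
   dynamic recompiler instead emits native host machine code per basic
   block and caches it, but the decode-cost amortization is the same idea. */
typedef struct { int acc; int halted; } Cpu;
typedef void (*Op)(Cpu *, int8_t);
static void do_addi(Cpu *c, int8_t imm) { c->acc += imm; }
static void do_halt(Cpu *c, int8_t imm) { (void)imm; c->halted = 1; }

static int run_translated(const Insn *prog, int len) {
    Op ops[64]; int8_t imm[64];             /* tiny fixed-size "code cache" */
    for (int i = 0; i < len; i++) {         /* translate once... */
        ops[i] = (prog[i].op == OP_ADDI) ? do_addi : do_halt;
        imm[i] = prog[i].imm;
    }
    Cpu c = {0, 0};
    for (int pc = 0; !c.halted; pc++)       /* ...then run the cheap form */
        ops[pc](&c, imm[pc]);
    return c.acc;
}

int main(void) {
    Insn prog[] = { {OP_ADDI, 2}, {OP_ADDI, 3}, {OP_HALT, 0} };
    printf("%d %d\n", interpret(prog), run_translated(prog, 3));  /* 5 5 */
    return 0;
}
```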
All this stuff is or was proprietary, closed-source tech, and what's more, it was tech that gave certain companies strong competitive advantage at particular points in time -- so they had strong incentives to make sure it did not leak.
(I see posters in this thread who do not know what I thought were well-documented parts of the story, so I am trying to spell out the context here.)
Some large reputable companies have histories of stealing others' code, ideas, implementation methods, algorithms, etc., and passing them off as their own. IBM, Microsoft, Sun, Apple, Google, Oracle, Digital Research, Lotus -- all were dominant players, and all were so accused. Most either backed down, or re-wrote, or re-implemented to avoid being sued.
Microsoft was accused more than almost anyone, and it thrived only because it was able to pay other companies off, or simply wait for them to go broke.
Sometimes, how code works can be deduced simply by studying what it does. I worked out how Rsync worked because someone asked me to explain what it did in detail.
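To illustrate: the clever part of rsync is a weak rolling checksum that can slide along a file one byte at a time in O(1), so a block signature can be tested at every offset. Here's a C sketch of that checksum, reconstructed from Tridgell's published description -- my own illustration, not rsync's actual source:

```c
/* rolling.c -- the weak rolling checksum from the rsync paper, sketched
   from the published description; not rsync's real implementation. */
#include <stdint.h>
#include <stdio.h>

#define MOD 65536u

/* Checksum of buf[0..len-1]: a is a plain byte sum, b weights each byte
   by its distance from the end of the window. */
static uint32_t weak_sum(const uint8_t *buf, size_t len,
                         uint32_t *a, uint32_t *b) {
    *a = 0; *b = 0;
    for (size_t i = 0; i < len; i++) {
        *a = (*a + buf[i]) % MOD;
        *b = (*b + (uint32_t)((len - i) * buf[i])) % MOD;
    }
    return *a + (*b << 16);
}

/* Slide the window one byte in O(1): drop `out`, admit `in`. This is what
   lets the receiver test a block signature at every byte offset without
   rehashing the whole window each time. */
static uint32_t roll(uint32_t *a, uint32_t *b, size_t len,
                     uint8_t out, uint8_t in) {
    *a = (*a + MOD - out + in) % MOD;
    *b = (*b + MOD - (uint32_t)(len * out) % MOD + *a) % MOD;
    return *a + (*b << 16);
}

int main(void) {
    const uint8_t data[] = "the quick brown fox jumps over";
    const size_t w = 8;                     /* window (block) length */
    uint32_t a, b;
    weak_sum(data, w, &a, &b);
    for (size_t k = 1; k + w < sizeof data; k++) {
        uint32_t s = roll(&a, &b, w, data[k - 1], data[k + w - 1]);
        uint32_t a2, b2, s2 = weak_sum(data + k, w, &a2, &b2);
        printf("offset %2zu: rolled %08x direct %08x %s\n",
               k, s, s2, s == s2 ? "ok" : "MISMATCH");
    }
    return 0;
}
```

The weak sum only nominates candidate block matches; rsync then confirms each one with a strong hash (MD4 in the original paper) before reusing the block.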
PowerQuest's PartitionMagic was amazing, black-magic tech when it came out. I didn't review v1 because I did not believe what the packaging said; when I reviewed v2, a reader wrote in accusing my employers of perpetrating an elaborate April Fool's joke, pointing out that my name is an anagram of APRIL VENOM.
(If I ever branch out into fiction, that's my pseudonym.)
Now, the revolutionary functionality of PartitionMagic, nondestructive partition resizing, is just an option on one screen of some OS installation programs. It's valueless now. Once people saw it working, they could work out how it was done, and then do it themselves, and it ceased to have value.
Very fast emulation is not like that. Setting aside sheer Moore's Law/Dennard-scaling brute horsepower, efficient emulation during the short window of a processor architecture transition is a massive commercial asset.
Apple has done it three times, across four architectures:
68000 -> PowerPC
PowerPC -> x86
x86-64 -> Arm64
Nobody else has ever done so many.
IBM bought Transitive for QuickTransit, but it's not clear what it used it for. Its major architecture change was IBM i: originally OS/400 on the AS/400, a derivative of the System/36 minicomputer, which IBM successfully moved to POWER servers. However, that architecture has a translation layer built in (TIMI), so it didn't need Transitive for the move.
But IBM has bought many radical tech companies and not used the tech. E.g. Rembo, an amazing Linux-based network-boot fleet-deployment tool that it never really commercialised.
Microsoft bought Connectix for Virtual PC, kept the disk formats and the management UI, and threw away everything else, because Intel and AMD built the core virtualisation tech into their chips (VT-x and AMD-V).
I know a little about the binary translation tech because the man who wrote it flew across the Atlantic so that I could interview him.
All thrown away, but today, it's valueless anyway.
Thanks for sharing the context! I'm mostly curious because I hope to walk the path of my predecessors. I have a bit of a fetish for programming in assembly language, so emulation/VMs are one of the interesting fields -- and after writing a couple of simple interpreting emulators, it's natural to go down the dynamic-recompilation rabbit hole.
Apple caught my eye exactly because of what you said: it went through 3 big transitions in a relatively short period (about 25 years), so everyone involved is still active in the profession. And I'm sure they used dynamic translation extensively. I agree that all of it is going to be, or already was, thrown away, but it's still interesting to know how they did it.
BTW, April Venom is a pretty good pseudonym; it somehow reminds me of "Raul Bloodworth", the pen name of the CSM from The X-Files.