Every time I see this stuff, I really wish people who understand CPUs at this level would write a modern version of Nand to Tetris, except with a modern CPU, and with the OS being essentially a micro Linux kernel that includes things like networking and graphics.
That would be a non-trivial undertaking. Modern CPUs have a lot of silicon dedicated to branch prediction, fetching from RAM to populate caches, and so on.
Fetching the data that the hardware threads want so they can do work is a very large portion of what a complete modern-day CPU does. Networking and graphics are each complete career paths in their own right.
When the 6502 is your CPU, it is possible for the entire computer it powers to be understood by a single person. That's not possible with modern-day, high-end CPUs, not at the same level of detail. A single self-paced course that covers it all probably isn't realistic, but separate courses for the CPU, the GPU, and networking seem like workable things.
"NAND to Hatris" (NAND to Tetris sequel/expansion?)
Gate-level simulation (even zero-delay, let's not mention delay-aware or even power-aware) for a modern-sized CPU takes weeks just to run through some basic liveness checks. See [1] for just a taste of gate-level simulation trickiness.
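To give a flavor of what even zero-delay gate-level simulation involves, here's a minimal sketch in C (a made-up three-gate netlist, nothing like a real tool's format): every NAND gets evaluated once per pass, in topological order. Now scale that inner loop to millions of gates, add flip-flops, feedback, resets, and millions of cycles, and the weeks start to make sense.

    #include <stdint.h>
    #include <stdio.h>

    /* Minimal zero-delay gate-level sim sketch: a netlist of 2-input
       NANDs, pre-sorted in topological order (combinational only).
       One evaluation pass = the logic settling with zero gate delay. */

    #define NETS  8
    #define GATES 3

    typedef struct { int a, b, out; } Nand;  /* input nets -> output net */

    static const Nand netlist[GATES] = {
        {0, 1, 4},  /* net4 = !(net0 & net1) */
        {2, 3, 5},  /* net5 = !(net2 & net3) */
        {4, 5, 6},  /* net6 = (net0 & net1) | (net2 & net3): AND-OR from NANDs */
    };

    int main(void) {
        uint8_t net[NETS] = {1, 1, 0, 1};  /* primary inputs on nets 0..3 */
        for (int i = 0; i < GATES; i++)    /* one zero-delay evaluation pass */
            net[netlist[i].out] = !(net[netlist[i].a] & net[netlist[i].b]);
        printf("net6 = %d\n", net[6]);     /* prints net6 = 1 */
        return 0;
    }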
You can do an RV32I core in about 10k gates, an RV64GC in maybe 30k? I think the GP meant barely enough to run a thin OS, but not an antique: in-order, with a small cache or none at all, you get it.
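Part of why RV32I stays that small is how regular the encoding is. A toy decode sketch in C (nothing like a full core; store/branch immediate extraction is omitted since those bits are scattered):

    #include <stdint.h>
    #include <stdio.h>

    /* Toy illustration of RV32I's regularity: every instruction is
       32 bits, the major opcode is in bits [6:0], and the register
       fields sit at fixed positions across all formats. */

    static void decode(uint32_t insn) {
        uint32_t opcode = insn & 0x7f;
        uint32_t rd  = (insn >> 7)  & 0x1f;
        uint32_t rs1 = (insn >> 15) & 0x1f;
        uint32_t rs2 = (insn >> 20) & 0x1f;

        switch (opcode) {
        case 0x33: printf("R-type ALU: rd=x%u rs1=x%u rs2=x%u\n", rd, rs1, rs2); break;
        case 0x13: printf("I-type ALU: rd=x%u rs1=x%u imm=%d\n", rd, rs1, (int32_t)insn >> 20); break;
        case 0x03: printf("load:       rd=x%u rs1=x%u\n", rd, rs1); break;
        case 0x23: printf("store:      rs1=x%u rs2=x%u\n", rs1, rs2); break;
        case 0x63: printf("branch:     rs1=x%u rs2=x%u\n", rs1, rs2); break;
        default:   printf("other opcode: 0x%02x\n", opcode); break;
        }
    }

    int main(void) {
        decode(0x00b50533);  /* add  a0, a0, a1 */
        decode(0x02a00513);  /* addi a0, x0, 42 */
        return 0;
    }

A few dozen opcodes in fixed-position fields is most of the base ISA; compare that to what an x86 decoder has to do before it even finds the operands.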
A visual RV64GC would be a pedagogical tool, not something necessary for a tape-out.
ARM1 is from 1985. IANAL, but browsing https://en.wikipedia.org/wiki/Integrated_circuit_layout_desi..., I think any design protection has expired (“the term of protection is at least 10 (rather than eight) years from the date of filing an application or of the first commercial exploitation in the world, but Members may provide a term of protection of 15 years from the creation of the layout-design”)
I also expect patents to have expired: patent terms run roughly 20 years from filing, and it's been over 30.
There's zero microcode. You can go into the StrongARM period and beyond before you hit anything like microcode; it's anathema to the basis of RISC CPUs. IIRC, you'll only start finding it where there's a need to emulate earlier ISAs.
When you run the simulator, it executes a short hardcoded program that performs shifts of increasing amounts. You don't need to understand the code to follow what's going on.
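The actual ARM listing isn't reproduced in this comment; purely as an illustration of what "shifts of increasing amounts" means, the behavior in C terms is roughly:

    #include <stdio.h>

    /* Illustrative only -- not the simulator's actual ARM program.
       It just shifts a value left by an ever-increasing amount. */

    int main(void) {
        for (unsigned amount = 0; amount < 32; amount++) {
            /* on ARM the shift amount can come from a register,
               i.e. something like: mov r2, r0, lsl r1 */
            printf("1 << %2u = 0x%08x\n", amount, 1u << amount);
        }
        return 0;
    }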
This is great, though I do sometimes think too much attention is paid to the ARM1. It was, fundamentally, a technology demonstrator. The CPU that actually shipped (in the Archimedes) was the ARM2, which arrived a year and a half later, more or less simultaneously with the more famous Sun-4 and SGI Iris 4D boxes.
The ARM1 is really a super-clever circuit and worthy of study. But... so were Berkeley RISC (which begat SPARC) and Stanford MIPS (you can guess), which beat ARM1 to the punch by a year and also produced working (in a lab) silicon.
And those two were just evolutions of ideas from the IBM 801. And of course the whole paradigm was Really Invented by Seymour Cray.
Sometimes I think the story of ARM gets a bit spun. Really it was the Acorn products that were notable, not their CPU design so much.
The ARM2 wasn't tremendously different from the ARM1.
And the ARM was the major differentiator for their products, unfortunately. It was the beating heart of the Archimedes and RISC PC and the one differentiator they had, because the MEMC and VIDC weren't, and RISC OS, however nice it might've been, was hobbled by a bunch of early design issues that kept it from making the transition to preemptive multitasking. Never mind the pain of the ARM26 to ARM32 transition. I mean, I could go on.
ARM gets credit for being the first properly successful RISC processor, way beyond what MIPS and POWER achieved. It proved that RISC worked by delivering powerful machines that consumed a fraction of the power. They didn't need to be first; they just needed to do it better. The first person to do something is rarely the one who gets the credit, because they probably didn't get the design right. Acorn didn't get it quite right, but they got it right enough to make major waves.
And as much as he deserves respect, Seymour Cray wasn't the progenitor of RISC. If anything, his designs ran contrary to it, as he was so fixated on building machines out of essentially discrete logic.