FPGAs and the renaissance of retro hardware (brainbaking.com)
179 points by surprisetalk on Nov 27, 2023 | 53 comments



Retrocomputing FPGA work is a fun diversion from normal software, enough that my brain was convinced they weren't related (had a bit of a mental block for software). Over the course of a year, I went from knowing basically nothing to releasing 3 different FPGA emulation cores of my own for multiple platforms, along with something like 5 ports (which is not necessarily trivial, particularly for a beginner) of existing cores to the Analogue Pocket.

It has been a very fun experience, and I've found it to be extremely addicting. It helps that there's a fairly tight-knit community very interested in furthering the development of FPGA hardware preservation, so people are very willing to donate, test, and contribute feedback, which is a great feeling for open source work.


> Retrocomputing FPGA work is a fun diversion from normal software, enough that my brain was convinced they weren't related (had a bit of a mental block for software).

That's awesome! I feel this; I've had a software development mental block for a number of years now. I just don't find modern software all that interesting anymore. Lost in mountains of model mapping, layers of terrible abstraction, the never-ending package-update grind (shudders), bad APIs, bugs closed as "won't fix / works as designed" (sigh), truly insane complexity, and so many, many things that are simply outside of my control.

It's my interest in related but different areas that has kept me engaged recently: microelectronics, 3D printing, and home automation. They exercise enough of my decades of programming experience to get that fix, but the projects are small and focused on solving very concrete problems instead of moving a decimal point on some spreadsheet somewhere completely disconnected from me. It's great when you make something for a friend and you can see the joy in their eyes as they realize how much this thing you made helps them.

Sounds like FPGAs are doing that for you and that makes me happy!


Totally. There's something therapeutic about being able to exercise our engineering brain cells. Mainstream software development simply doesn't do that for me anymore. Hardware/embedded work is the only way I can get this feeling.


Although operationally there's not much of a difference between a cycle-accurate FPGA implementation and a cycle-accurate software implementation (especially on a board that you can plug into a hardware CPU socket[1]), the FPGA implementation is interesting to me because it seems closer to the original gate-level implementation in hardware, and because it seems more tangible. Of course a custom silicon implementation (such as the tiny tapeout reimplementation of the PDP-8 from a recent HN comment[2]) seems even more real and exciting, even if the cycle/signal timing is the same. Part of it may be that the custom silicon implementation is a self-contained reimplementation rather than an emulation based on pre-existing, complex components.

[1] https://microcorelabs.wordpress.com

[2] https://news.ycombinator.com/item?id=38416886


In theory yes, but in practice there are many sources of latency introduced by a modern system with an OS, and even an emulator running on bare metal has additional latency. Framebuffers and input are the main sources, and there's no real parallel in emulators, unless you're going to multithread on dedicated separate cores: one for input, one for video, one for the CPU, etc.

Now for the vast majority of users, this does not matter at all. As much as I like to think I can tell, it's probably placebo or the slightest feeling like something is not right. But I think it's a worthwhile reason to have a different method of replicating these old and eventually dying machines, and it's much more intellectually interesting as you say.


I was intrigued by the microcorelabs examples since they seem to be drop-in replacements for the hardware and are capable of running timing-dependent games and demos. Basically equivalent to the hardware, with the same input and output signals and timing. Note the 68K emulator was implemented on an Arduino-compatible Teensy microcontroller board rather than a full-blown Linux PC - though it does support 8GB of RAM, much more than the Mac it was plugged into!

If a device has the same input and output signal timing as the original hardware, then the implementation - be it custom silicon, FPGA, or software - is largely irrelevant to its functionality.

Regarding parallelism, it doesn't really matter as long as the output signals settle in time to meet the clock boundary.

I agree completely regarding the difficulty of getting input and display timing to work properly in a software emulator, especially running on a PC with a modern OS.


MB, darn those autocorrecting fingers ;-) I guess potentially the Mac in question could support 8MB, perhaps via a similar CPU daughterboard; of course now people complain about 8GB not being enough in entry-level Mac configurations.


Interested in knowing how you got started, any prior knowledge/training, as that seems quite the hill to climb?


I _technically_ had prior knowledge as a computer architecture class had us stick some premade pieces together to create a CPU we designed, but I personally wrote no Verilog, and it was a small subset of the class.

I don't have much documentation for getting started with HDLs (Verilog, VHDL, etc), but I have tried to document my process as much as possible. I have primarily developed for the Analogue Pocket, so my documentation is themed towards that device, but there's IP (code modules) and wiki entries that would be useful for everyone: https://github.com/agg23/analogue-pocket-utils

I had previously written a cycle-accurate NES emulator, so I was familiar with hardware techniques, but not what they look like in circuits. The first core I wrote was a schematic-accurate Pong implementation. This was both good and bad: it's very simple and has no CPU (and thus no code), but that also makes it very hard to tell what is going on. I went from there to doing a lot of ports (NES, SNES, PCE, and a few more), and after that I worked on my own cores (Tamagotchi, Game and Watch). For Tamagotchi I took a very typical software approach where I wrote massive amounts of unit tests and developed against those tests. While this is what real hardware developers do, I found it to be a huge waste of time when you're working by yourself on a small project.

A few others and I are very willing to help people learn (though I'm still really a noob). If you want to play around in this space, let me know and I'll try to help you with what you need.


On the note of my sibling comment, a common book in this space for learning is "Designing Video Game Hardware in Verilog", which is specifically retrocomputing related.


I second this book recommendation! I just started it exactly to get a foothold on FPGA programming. It's very accessible.


What FPGA hardware is best to start with for a first try?


The book actually has an accompanying emulator where you can run everything in browser!


My recommendation is the NAND2Tetris Coursera course. You start with a single NAND gate and put together more complex gates, building upon the previous abstraction. Part 1 covers things like creating muxes, RAM, and an ALU in a stripped-down HDL. Part 2 covers the software stack, picking up after you've created an assembler, but I haven't taken it yet. Perhaps it isn't tailor-made for jumping into FPGAs fast, but it helped fill in a lot of blanks for me, coming from a mostly software world.


I really want to play with these things; I played with hardware as a schoolboy and found it very entertaining back then.

I built my first computer by soldering all the parts, then started debugging it with an oscilloscope to see the signals from the chips and analyse them to find the problem. In doing so I quickly realised that I was missing something. That something was "how chips actually work." It turned out this was called digital electronics, so I decided to learn it along the way.

My treasure and source of inspiration at school was this book: Digital Electronics by Roger Tokheim. I dragged it to school and back every day, just like people carry notebooks these days. It was my bible back then, and the boys at school made fun of me for it. The book was amazing, and I think it still is.

I remember it all as a very exciting time.

Now FPGAs seem like a nice opportunity to revisit all of this after many years of programming, with a new point of view on many paradigms.

Maybe you can direct me, and others like me, toward a good community and some tips for shortening the learning curve. I am probably already familiar with many things, and yet I haven't found a good/easy way to start with FPGAs so far. Perhaps you can advise on 'how' and 'where' to begin.


This is retrocomputing specific, but this Discord server (https://discord.gg/3wv3gMhp) has quite a few of the big devs in the FPGA gaming space, and they're more than willing to answer questions. They're what really got me going after I built my Pong core.

I have always been a big advocate of learning while doing, especially in software. Find something, preferably small, that you want to build (your Pong), and work on making it a reality. Maybe it's making Snake entirely in HDL. Maybe it's playing around with LiteX on your preferred development platform so you can build something cool with the RISC-V processor (I don't suggest this though, start with learning a normal HDL). Maybe it's simply looking through one of the existing retrocomputing cores, trying to figure out how stuff ticks.

Until you get to the CPU design level, the general concepts you'll encounter will be fairly simple. I think it's enough to just play around with blinking lights, learning how parallel synchronous logic works, relative to how we think of software working.
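To make that parallel-vs-sequential shift concrete, here's a toy sketch in Python (not real HDL; the register names are made up for illustration) of the synchronous-logic mental model: every flip-flop samples its input on the same clock edge, so next-state is computed from the old values and committed all at once.

```python
# Toy model of synchronous logic: on a clock edge, every register's
# next value is computed from the CURRENT state, then all registers
# update simultaneously (a two-phase compute/commit update).

def clock_edge(state):
    """One rising edge: compute all next-values, then commit together."""
    nxt = {
        "a": state["b"],   # a <= b;
        "b": state["a"],   # b <= a;  (a swap with no temp variable,
    }                      #           unlike sequential software!)
    return nxt

state = {"a": 0, "b": 1}
state = clock_edge(state)
print(state)  # {'a': 1, 'b': 0} -- both "flops" swapped in one cycle
```

In software you'd need a temporary to swap two variables; in hardware both assignments happen at once because each flop reads the pre-edge value.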


Thank you very much. I do prefer to start small. That way I made a whole emulation of my first computer, showing the assembler instructions as it executed each operation, and it was able to run the original ROM successfully. It was a pleasure to observe. Hopefully I can reproduce it in an FPGA to make the circle complete ;)

My challenge is to play with CPU designs at some point, but in the beginning I agree it is easier to start with something like an LED blink, a NAND gate, a register, and then move on to other stuff.

Is there some FPGA hardware you can advise for an easy start? One that is potentially (once I learn) capable of doing, let's say, UART or USB? I don't really know how much FPGA power is required for those, or whether it would be too big/expensive. I can't orient myself with this yet, but I'd like something physical to have for real motivation to ignite.


It really all depends on what you want.

It may sound like shilling, but the Analogue Pocket is actually an excellent development platform because it's everything all in one; you don't need to figure out how to take input, load data, or output to HDMI, because that's all taken care of for you automatically. You pay for that with a $200-250 price tag, plus ~$50-70 for your JTAG adapter, but I think it's excellent for starting.

For OSS, I would probably recommend a Lattice FPGA. Gowin FPGAs are really, really cheap (Tang Nano 9K) and _partially_ support an OSS toolchain, but that toolchain is missing many core components, and I've actually found bugs in the hardware (confirmed by others), so probably stay away.

The DE-10 Nano is very capable, though fairly sought after at this point. It has a decent amount of IO, and can run the MiSTer retrocomputing project, which has the largest collection of OSS cores out there.

----

However, probably before you even buy any hardware, play with simulations. Choose your HDL (Verilog is US focused, VHDL is EU focused) and choose a simulator. For VHDL, you can use GHDL and GTKWave for an entirely OSS setup. For Verilog, I would recommend downloading the closed source Intel ModelSim, but you can also produce very fast, C++ controlled simulations using Verilator.

In any case, start simulating and looking at waveforms. Debugging multiple sets of nets at once is a very interesting experience and is at the core of what you're doing when working with FPGAs (verification is something like 80-90% of the time), so you can learn a lot without any hardware just working on that. If you use Verilator, you can have pseudo LEDs or even a display to draw on.
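For a taste of waveform-watching before installing any tools, here's a little Python sketch (entirely a toy, just ASCII art, not any real simulator output) that runs a 3-bit counter for eight cycles and prints each bit the way GTKWave would draw it:

```python
# Simulate a free-running 3-bit counter and render each bit's value
# over time as a text "waveform" (high = ▔, low = ▁).

def simulate_counter(cycles, bits=3):
    trace = {b: [] for b in range(bits)}
    count = 0
    for _ in range(cycles):
        for b in range(bits):
            trace[b].append((count >> b) & 1)   # sample each bit
        count = (count + 1) % (1 << bits)       # clocked increment
    return trace

trace = simulate_counter(8)
for b in sorted(trace, reverse=True):           # MSB on top, like a viewer
    line = "".join("▔" if v else "▁" for v in trace[b])
    print(f"bit{b}: {line}")
```

Each lower bit toggles twice as fast as the one above it, which is exactly the kind of relationship that jumps out of a waveform view and is invisible in a pile of print statements.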


You gave very helpful and informative advice, exactly what I've been looking for. I am very grateful for your efforts, and hopefully others who wish to start with FPGAs will find it as useful and as good a guide as I did. Thank you very much. You've made my day.


Not the parent poster, but my experience may be relevant.

My background is exclusively in software engineering and computer science. I started by reading "Digital Design and Computer Architecture". There's a new RISC-V edition https://a.co/d/imzGBK5 as well as a freely available ARM edition https://dl.acm.org/doi/book/10.5555/2815529. The book starts from Boolean logic and transistor technology and goes all the way to assembly programming, with everything in between. Most importantly, it gives a great introduction to HDLs. Next I played with a bunch of hardware projects, specifically targeting the inexpensive Arty-A7 board to get comfortable with FPGA tooling.

I can attest to the parent saying that this is sufficiently different from the software engineering I do at my day job that it feels a lot more like a hobby. Especially if you also foray into wire-wrap prototyping, PCB design, and assembly. Finding and fixing analog "bugs" is so much fun!


Figuring out the digital design parts seems like the easy half of the problem.

The non-open-source tooling (which is what you need for the dev boards I have played with) is what I've had a real struggle with - Vivado and Quartus are so horrendously complicated, that it seems easy to find a project failing at some late stage because you clicked a button wrong two days ago.

Learning these, and the interplay between board specifics, the way any onboard CPU talks to the FPGA mesh, and how you interface with the peripherals, just seems like a massive explosion of complexity that you have to already know in order to learn it. And every board seems to be a horrendous combination of seemingly opaque "IP" blobs.

I've even looked for paid tutorial series, but everything seems to assume you've gotten past this stage, or is aimed at the "here is how you build up pure stuff in VHDL/Verilog" level.


>Vivado and Quartus are so horrendously complicated, that it seems easy to find a project failing at some late stage because you clicked a button wrong two days ago.

The (open) industry secret is that nobody actually uses the GUIs for anything other than analyzing build products (viewing floorplans, etc.); everything else is done through tcl scripts and Makefiles. This lets you both avoid the disaster that is the GUI portion of the tools and keep the entire project in a regular text representation (which means you can actually edit everything and check it into version control!).

As an extreme example, I've never managed to get Lattice Radiant's GUI to even run; the damn thing just crashes almost immediately. It doesn't matter though, because the backend tools all still work perfectly. We've just written some tcl scripts which handle everything for us and called it a day.


> The (open) industry secret is that nobody actually uses the GUIs for anything other than analyzing build products (viewing floorplans, etc.); everything else is done through tcl scripts and Makefiles. This lets you both avoid the disaster that is the GUI portion of the tools and keep the entire project in a regular text representation (which means you can actually edit everything and check it into version control!).

I've sort of worked this out, but it seems to be a weird mix of "you need to know what the UI is doing and why in order to avoid the UI". Getting that initial understanding seems a very high barrier.

(version control also seems to be something they manage to break on a whim over toolchain versions - maybe it's one of these hardware things where people just stick with the same toolchain version over the life of the product).


> Next I played with a bunch of hardware projects specifically targeting inexpensive Arty-A7 board to get comfortable with FPGA tooling.

What sorts of hardware projects?


Thanks for your work. It's on my list to someday look into porting an open-source GBC core to the Pocket (and adding a patch or three on top): I want to play multiworld randomizers on it!


Will a PS1 handheld FPGA be possible in the near future?


Just avoid the "FPGA reimplementation vs. actual hardware" religious wars and you'll be fine.


I agree strongly with the sentiment of this article, but the author missed the fully open-source programmable systems like the Lattice ECP5 and iCE40 FPGAs. There is something magical about using the LiteX "make me an SoC out of this board" system.

As the ACM Digital Library has gone open access, I can recommend this Jan Gray paper, "Hands-on Computer Architecture – Teaching Processor and Integrated Systems Design with FPGAs"[1]. There are different opinions on whether understanding computer architecture makes you a better developer (I tend to think it does), but it's a really amazing time to be able to explore these concepts without needing to be at a company or in a university setting.

[1] https://dl.acm.org/doi/pdf/10.1145/1275240.1275262


LiteX is very neat, but at the same time is a massive pain as soon as you move past building an example SoC. I've been spending a lot of time on this recently.


I wanted to learn how to use Ghidra for reverse engineering binaries. The thing that allowed me to really improve with it was using it to analyze old video games that I played as a kid in the 1990s. Finding previously-unknown cheat codes, coming up with improvement patches, and figuring out how they work is very motivating!

I'm happy that these retro hardware projects are working out; I've liked seeing people test out what I've found in Ghidra on real systems.


> I've liked seeing people test out what I've found in Ghidra on real systems.

Do you have any findings to share?


I've been writing up my findings here: https://32bits.substack.com/

So far I've focused on the Sega Saturn because it is notorious for having a difficult-to-understand assembly language (SH-2). That means there are lots of things to discover, because people haven't really looked yet!


Noice


Why is the MiSTer listed as "used to perfectly emulate and/or upscale analog signals?"

MiSTer doesn't just concern itself with analog signals; it simulates the entire system and outputs the original analog signal, or can upscale it for digital output on its own. This description could make somebody think it's just a scaler.


Both. MiSTer can output the original signal through the VGA output or through the HDMI digital output (e.g. using an HDMI-to-VGA converter), and it is also possible to configure the HDMI output as a scaler (with different modes, from the lowest-latency modes using one line as a buffer, to more complex ones changing the frequency and/or adding low-latency CRT/LCD/etc. filters). Effects can be added to the analog output, e.g. a scan doubler. Please consider checking the documentation; it is an amazing project.


I am familiar with the project. I’ve owned one for several years, am highly active on the discord and chat with core developers on the regular.

The way it's phrased in the linked article makes it sound like it's "just" a scaler like a RetroTINK or OSSC. It definitely does fantastic work scaling its own output, but it won't accept an outside analog signal, which firmly moves it out of the "scaler" family for me.


I can see how an FPGA can perfectly emulate the logic of an older chip--but the actual physical layout is different, right? Are there any timing issues that result from this?


The reality is that the vast majority of these FPGA-based clones don't actually perfectly emulate the logic. They're using the same reverse engineering techniques the traditional emulator developers used and sometimes even the same community documentation. The results are often quite good, but they're making a new implementation that matches the observed behavior of the original system to the best of their abilities.

Now there are some exceptions. Nuked MD FPGA[0] is a recent example of an FPGA recreation that is a fairly direct translation of the original logic using silicon die analysis. In this case, the logic is basically identical, but as you guessed the physical layout is different. Generally speaking, you write FPGA "gateware" in a language like Verilog or VHDL. These don't intrinsically have any information about the physical layout of the logic which is handled by the toolchain instead. As wmf says, this is generally not a problem most of the time. For synchronous logic, either the total propagation delay is small enough for a single cycle or it isn't. The toolchain will estimate this delay and report whether you met timing or not for the configured clockspeed.

Not everything you can do in silicon translates well to FPGAs (using both clock edges, for instance, is generally not well supported), but for the most part these things are easy enough to work around.
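As a rough illustration of what "met timing" boils down to (the path and delay numbers below are invented for illustration, not from any real toolchain report): the tool sums worst-case propagation delays along each register-to-register path and compares the total against the clock period.

```python
# Toy static-timing check: does the slowest combinational path between
# two flip-flops settle within one clock period?

CLOCK_MHZ = 50
period_ns = 1000 / CLOCK_MHZ          # 20 ns budget per cycle

# Hypothetical worst path: flop -> adder -> mux -> flop
path_delays_ns = {
    "clk_to_q": 0.5,   # source flop output delay
    "adder": 12.0,     # combinational logic
    "mux": 2.5,        # more combinational logic
    "setup": 1.0,      # destination flop setup requirement
}

total = sum(path_delays_ns.values())
slack = period_ns - total             # positive slack = timing met
print(f"path delay {total} ns, slack {slack:+.1f} ns ->",
      "timing met" if slack >= 0 else "timing FAILED")
```

This is why, as the comment says, sub-cycle layout differences don't matter: either the path settles before the edge (positive slack) or the design fails timing and you lower the clock or pipeline the path.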

[0] https://github.com/nukeykt/Nuked-MD-FPGA


Except for the rare cases where a chip can be decapped, turned into a netlist and painstakingly translated 1:1 into logic primitives (which itself is usually impossible without some fudging), all re-implementations are exactly that.

You can still do higher-level stuff in an FPGA. Maybe you don't actually care how the sprite hardware really works, and you just make your own that mostly works the same. Maybe you don't even care that a PPU is split into 3 chips; you make yours without regard for the physical delineation of the original. There are some cores out there like this - written off of software emulators with minimal original research. It is often possible to get something that somehow plays games even while being "inaccurate", but it yields little benefit over software emulation besides lower latency from controller inputs.

Higher-quality cores always involve original research. It is rare that documentation already exists at the detailed level you need. The best people in the field black-box the original chips and, based on years of experience and knowledge of sussing out behavior by thinking like the original chips' designers, can make a functionally and timing-accurate model that operates in lockstep with the real chip, cycle for cycle, with the same data on the bus. This is the sweet spot for FPGA implementations, but it also requires a lot of skill and expertise.

At the extreme end is stuff like Nuked-MD and the Visual 6502/68K projects. The logic is cloned at the gate level without any guessing. Still, some changes are necessary: chips with internal tristate busses are impossible to do on FPGA fabric, for example, as are clocking on both edges of the clock, multiple-phase clocks, dynamic logic, and so on. These implementations are usually much less space-efficient than the ones in the paragraph above, but offer the highest accuracy.


An FPGA can enable these things but it doesn't magically happen. A MiSTer FPGA can emulate a PC with the ao486 core, which is an achievement, but the ao486 core doesn't precisely mimic the machinery or timing of the 80486 or any other CPU.


In normal synchronous logic if something takes N cycles it takes N cycles; sub-cycle timing differences don't matter. If the original chip is doing weird asynchronous stuff it could be hard to properly emulate.
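A toy Python sketch of that point (both "implementations" below are invented for illustration): as long as two designs present the same result after the same N cycles, an observer at the cycle boundary can't tell them apart, no matter what happens inside the cycle.

```python
# Two internal implementations of an 8-bit multiply that both "take"
# 8 cycles: one does real work each cycle, one computes instantly and
# idles. At the cycle boundary they are indistinguishable.

def mult_shift_add(a, b, n_cycles=8):
    """Bit-serial shift-and-add multiplier: real work every cycle."""
    acc = 0
    for cycle in range(n_cycles):
        if (b >> cycle) & 1:
            acc += a << cycle
    return acc, n_cycles          # (result, cycles observed outside)

def mult_instant(a, b, n_cycles=8):
    """Computes immediately, then idles to match the original latency."""
    return a * b, n_cycles

# Same result, same cycle count -> externally equivalent.
assert mult_shift_add(13, 11) == mult_instant(13, 11) == (143, 8)
```

This is the software analogue of the comment's point: sub-cycle behavior is unobservable, so only weird asynchronous tricks in the original silicon break the equivalence.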


I'm under the impression that there were test and debug ROMs for each console that stretch capabilities and can reveal non-ideal behavior. They now work as unit tests for emulators.


What can a hobbyist do with an FPGA? I've mostly only heard about FPGAs being used at HFT firms somehow.


Design and implement basically any digital circuit you can possibly imagine, within certain speed, thermal, and/or gate count constraints.


You can do any combinational and sequential logic you want without having to use a forest of discrete 74xx ICs. If you've ever scratched an itch with redstone in Minecraft or taken a digital logic course, you might find it quite rewarding to translate a problem into a binary one and then implement an optimal solution with K-maps and implication tables.

Practically speaking, they're useful for hobbyists who want to push beyond the I/O and number-crunching capabilities of microcontrollers. In many cases the 32-bit micros around today are good enough, but I think it's satisfying to work with bare logic elements.
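In that spirit, a tiny Python sketch (the function is made up for illustration): exhaustively checking that a hand-minimized sum-of-products form, like you'd read off a K-map, matches the original truth table -- the same check you'd do before committing a minimization to gates.

```python
# Verify a hand-minimized boolean expression against its spec by
# brute-forcing all input combinations (cheap for small truth tables).
from itertools import product

def spec(a, b, c):
    """Behavioral truth table: high when a and b differ, or c is set."""
    return (a != b) or c

def minimized(a, b, c):
    """Sum-of-products form as read off a K-map."""
    return (a and not b) or (not a and b) or c

# Exhaustively check all 8 rows of the truth table.
for a, b, c in product([False, True], repeat=3):
    assert spec(a, b, c) == minimized(a, b, c)
print("all 8 rows match")
```

On an FPGA the minimization step matters less (the toolchain optimizes for you), but verifying logic exhaustively like this is exactly the habit that carries over from discrete 74xx design.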


If you have a couple of years and a friend, it's possible to build an entire system, from designing the architecture through writing the apps to run on it: http://www.projectoberon.net

(they provide Verilog on this website, but Wirth himself has an HDL of, of course, his own design: https://people.inf.ethz.ch/wirth/Lola/index.html )

NB: RISC5 != RISC-V


In my undergrad computer architecture class we built a processor over the semester with an FPGA: writing all of the different math circuits and the logic circuits, then combining them together.


I’m getting the sense that there was an FPGA “ground floor” at some point in the last 5-7 years, and trying to buy one now is pretty much a ripoff.


Pretty much. The DE0, which most of the retro FPGA computers rely on, was cheap, but it is almost impossible to find at decent prices anymore. Even the non-mainstream FPGA boards (Waveshare, etc.) have jumped in price massively, even though they're really pretty useless for retrocomputing use unless you're willing to reinvent the entire universe to bake an apple pie.


> willing to reinvent the entire universe to bake an apple pie.

Hey if I’m going to do something I’m going to do it right.


Regarding MiSTer:

Whether the 2023 entry price of a few hundred US dollars plus shipping (a $225 DE10-Nano board plus a $65 RAM module, with basic analog outs and USB able to be added via ~$20 generic dongles) is a ripoff, for a system that uses FPGA tech to simulate/emulate a large range of old computers and consoles to a leading standard, including a long (ever-growing) and impressive feature list of options and a very stable long-term front-end, is quite subjective.

I think it's still exceedingly good value. But certainly not the only, or outright cheapest, option.


That's odd. Whatever you could get 5-7 years ago is still available (except better) for pretty much the same or lower prices and with a lot more examples to start from.

How is that world a ripoff?


If you're willing to do some grunt work, there are still cheap boards to be found on Alibaba and the like.



