Build an 8-bit home computer with 5 chips (ieee.org)
284 points by lelf on March 21, 2020 | 94 comments



Hi HN, project author here. The source code and design files are on GitHub: https://github.com/74hc595/Amethyst

Documentation is lacking, sadly. My intent was to create a series of in-depth videos describing how various aspects of the system work in detail, but I'm somewhat behind on that.

For folks who say "using an ATMEGA is cheating"... I agree! That was a guiding principle behind my 6809 computer project (Ultim809)--use only period hardware and no programmable chips (apart from EPROM). Amethyst originated a bit differently: it started out as a neat hack to get 8-bit color video from a non-overclocked AVR without expensive external chips. And then I discovered the mechanical keyboard community and was able to source keyswitches and keycaps, and, well, in for a penny, in for a pound.


Matt, may I ask how you made the documentation? It's wonderful - its appearance is so in keeping with the whole project. The Ramforth Reference Manual cover page says 'Typeset on a Smith-Corona Coronet XL'. Did you use a word processing program or text editor, and if so, running on what? How do you keep the Coronet XL in good repair?


Why not take some of the discrete components out and introduce another CPU to handle both video and sound? It would make the design much simpler and programming easier overall.

Also, why not use a more capable CPU with more embedded RAM? These things are very cheap nowadays.


Rather than another CPU, it would be an AV interface chip, wouldn't it?


Matt, you're rockin'. Saw a link to this on YT via /r/mechanicalkeyboards

I like the staggered key arrangement of your prototype better, but whatever.

Your projects always inspire me.


BTW, what is the Amethyst's story regarding modern displays? Most of us have tossed our CRTs.


Works fine with the (now vestigial) composite input on modern TVs. I've also used an upscaler called the RetroTINK 2X to convert composite to HDMI, and it has superb picture quality.


For this project, how are the 4KB of EEPROM used differently than the 16KB of application memory?


And for those wondering how much power resides in an 8-bit 20 MHz AVR with 3 or 4 colors and sound over RCA, check out Craft by lft:

https://youtu.be/sNCqrylNY-0

It is as good as domain-specific mastery gets.


Try to reproduce that with a modern stack, and you'd need hundreds of megabytes of browser engines, npm libraries, etc. and a GHz clock speed.


Embedded software development is still alive and well. No js required.


Sure. I said with a modern stack, like what most webs*s program in.


I recently bought Ben Eater's 6502 kit. One of the things I've been meaning to do is figure out how to get it to talk to a serial port for terminal access.

A while ago on YouTube, I watched a series where someone was restoring their IMSAI 8080 (the machine from WarGames) and learned that it and the original Altair were a lot more capable than their front panels made them seem, thanks to the S-100 expansion slots inside that sit on the bus.

The first card the video showed was a terminal connection, so he could send and receive commands using a VT-100 terminal (an emulator on a normal PC in this case). The cool thing I learned is that the terminal board on the IMSAI would simply send the same screen info over and over again across the serial port, and it was up to the terminal to display this data on screen (making terminals little dedicated computers themselves). I don't know how I thought this worked before, but it was cool to see him debugging it.

Seems a lot easier than trying to get a direct-connected display up and running and having to figure out all the signals and colors, etc. That said, I haven't actually done this yet, so the devil may be in the details.

1. https://eater.net

2. https://youtu.be/d_mWREmze5o
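

For what it's worth, the serial side is just a handful of memory-mapped reads and writes once you wire up a 6551-family ACIA (the chip Ben Eater uses in his own serial videos, if I recall correctly). A minimal polled sketch in C; the base address and init values are assumptions for illustration, so use your own memory map and the datasheet:

    /* Polled serial I/O for a 6551/65C51 ACIA. ACIA_BASE is a
       placeholder address; register offsets and status bits are
       from the 6551 datasheet. */
    #define ACIA_BASE   0x5000u
    #define ACIA_DATA   (*(volatile unsigned char *)(ACIA_BASE + 0))
    #define ACIA_STATUS (*(volatile unsigned char *)(ACIA_BASE + 1))
    #define ACIA_CMD    (*(volatile unsigned char *)(ACIA_BASE + 2))
    #define ACIA_CTRL   (*(volatile unsigned char *)(ACIA_BASE + 3))

    void acia_init(void) {
        ACIA_CTRL = 0x1F;  /* 19200 baud, 8 data bits, 1 stop bit */
        ACIA_CMD  = 0x0B;  /* no parity, no echo, interrupts off  */
    }

    void acia_putc(char c) {
        while (!(ACIA_STATUS & 0x10)) ;  /* wait: TX register empty */
        ACIA_DATA = (unsigned char)c;
    }

    char acia_getc(void) {
        while (!(ACIA_STATUS & 0x08)) ;  /* wait: RX register full  */
        return (char)ACIA_DATA;
    }

One caveat: the current WDC 65C51 has a well-known bug where the transmit-empty status bit always reads as set, so real builds tend to replace that first busy-wait with a timed delay.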


A friend of mine, an 8-bit enthusiast, calls the use of ATMEGA chips "cheating" :) A "true" 8-bitter relies solely on the component base of the '80s.


I've seen this a few times on HN, but immediately thought of it and feel it's an appropriate response to your friend.

"I thought using loops was cheating, so I programmed my own using samples. I then thought using samples was cheating, so I recorded real drums. I then thought that programming it was cheating, so I learned to play drums for real. I then thought using bought drums was cheating, so I learned to make my own. I then thought using premade skins was cheating, so I killed a goat and skinned it. I then thought that that was cheating too, so I grew my own goat from a baby goat. I also think that is cheating, but I’m not sure where to go from here. I haven’t made any music lately, what with the goat farming and all."


While clever, I'm not a fan of this argument, as it ignores the fact that each of those steps is a different thing requiring different skills. Putting samples together is more like arranging music, while creating the notes is composing. After that, you have sound design, playing an instrument, making an instrument yourself...

Sure none of these are “cheating”, but someone somewhere has to do each of those things, and the further down the chain you go, the more “control” you get over your sound and composition. The law of diminishing returns of course hits at some point (although someone may argue that their breed of goats has a certain sound they can’t get any other way).

It's the same thing in programming: someone chaining together libraries may eventually hit a point where there's nothing out there that does exactly what they need. That doesn't mean it's not your work unless you've written the compiler yourself or have your own fab in your garage; it just means you have to be aware of the degree of control you give up the higher level you go.


It doesn't sound like you are in disagreement; the parent story is just a flowery expression of the diminishing returns (control) the further down you go.

As you say, the reality is that there are thresholds: some where the benefit-to-cost ratio is poor enough that few will break through them from higher-level use, and lower-ratio thresholds in between, where various proportions of experienced people who want a little more control (in different directions) do break through.

But even with those thresholds (in this case, one IC vs. another IC), it's arbitrary and subjective; you are just choosing to spend your time and effort in a different way.


> Putting samples together is more arranging music, while creating the notes is composing.

It really depends on the length of the samples and how exactly you work with them. With whole musical phrases (as used in '90s hip-hop and French house, for example), it's really more like arranging. But cut those very samples just a little shorter and start playing MPC pads like an instrument, and I'd argue you switch back to composing.


Love that description.

In many ways watching people describe and choose their spot in the "grab package" <-> "herd goats" spectrum is my favourite part of AoC each year¹. The squirming some people choose to do when justifying their place in a table of magic internet points is a lot of fun.

Edit: Should add I'm one of those squirmers too, often when I'm thinking about networkx/numpy/etc.

1. https://adventofcode.com/


I have found satisfaction at least for now in going in the other direction - what can I do with a locked down computer that has absolutely no software installation authorized and only has the standard software for any non-technical employee. In other words, everything that can't be done in that environment is now "herding goats" to me. Sometimes I am tempted to try to get developer-type privileges, but I've resisted so far.


Is there an endgame beyond empathy for "normal" users? I'm curious whether you're doing this to learn more, or perhaps with the intent of spotting opportunities to make things better, or for some other reason entirely.

I can personally see the discussion of how to work with a basic installation being worthwhile, as I know I'm guilty of "why don't you just $bunch_of_experience_option?". However, I don't think I'd want to try to do actual work without the tools I have and the tools I make.


For an auto metaphor, suppose you were really good at building race cars, and then you set out to make an entry for the 24 hours of LeMons.[1] Some people do that. I never have been a blank sheet of paper/greenfield sort of person and I always lose interest in computer games if I have unlimited resources.

[1] https://en.wikipedia.org/wiki/24_Hours_of_LeMons


I see what you are talking about, but your parable has another side that no one sees - the guy learned a lot of different things in the process :) For people who enjoy learning and exploration, that is the true meaning of the whole thing, not the finished tune.


> your parable has another side that no one sees - the guy learned a lot of different things in the process

No one sees?! What do you mean? That’s literally the punch line of the joke.

Of course 'learning things' is a secondary benefit, especially if it's actually one of the goals you specifically set out with. Still, I've personally watched programmers live that joke, and overengineer something that should take a day into a year-long project, to solve a problem they didn't have. I've seen it happen enough, and cause enough problems, that I try hard to write code with specificity and stick to the problem at hand. So much so that I have an actual problem with not abstracting things soon enough. ;)


Well, I just thought you were giving it a negative connotation, while I think there's nothing wrong with it if the thing is your hobby :)


Beautiful. The law of diminishing returns means that the perfect is the enemy of the good (or something like that.)


The lesson is that when it comes to personal pursuits, there are no bright lines between doing something “properly” or “cheating”. It truly is all in the eye of the beholder.


If your goal is to reduce the problem domain from hardware and software to just software, then yeah, using a chip with most of the hardware work done for you is fine.

Looking at Ben Eater's youtube channel, there seems to be a healthy demand for building more authentic '80s 8-bit systems.

If one's goal is to learn about how to build an 8-bit computer, using a microcontroller is not going to be an edifying experience unless the goal is to do something mostly in the software domain like write a simple OS.

While I think your parable is good advice for getting a business going quickly, it's not appropriate for deep dive learning or hacking.


Yes, it's silly to gatekeep like this, with some purist fantasy. If OP's friend likes to role-play as an engineer from 1981, that is cool. But he has to realize that that is not everybody's goal.


Personally I don't care that it's a modern "90s" chip.

But the fact that it's a microcontroller rather than a microprocessor bugs me. The lack of an external memory bus feels incredibly inauthentic to me.


>> The lack of an external memory bus feels incredibly in-authentic to me.

Yep. It's still a neat project, but don't claim to build an '80s computer using "only 5 chips" when they're more modern chips that allow that low count. A single FPGA could do it all too.

On a tangent, I'm curious about upgrades to old machines that could have been done at the time. For example, my Interact has super-low-res graphics where each pixel is 3 scan lines high. That machine was designed for 4-8K of RAM but shipped with 16K and had an upgrade to 32K. It seems like increasing the vertical resolution should have been a fairly trivial hardware hack, since the RAM is all there. Bumping the horizontal resolution might have been possible but harder. Increasing the available colors should have been fairly easy. It seems to be a case of not enough time for the design to bake. The other 8-bit machines with ASICs might not be so incomplete, but there may still be some things that could have been done.


A huge number of the limitations were "just" down to cost.

E.g. on the C64, the CPU and graphics chip compete for memory cycles all the time. This is why the screen blanks when loading, for example - to prevent the graphics chip from "stealing" cycles. A more expensive memory subsystem would have allowed working around that, and sped up the entire thing. This is a recurring issue with many other architectures as well, for cost-saving reasons (e.g. the Amiga "chip RAM" vs "fast RAM" distinction).

From 1983, the 8/16-bit 65C816 (used in the Apple IIGS) would have been a more natural choice for the C128 or even a "revised" C64 (at the cost of some compatibility), and reached clock rates up to 14MHz.

A lot of it was also down to market pressure and R&D costs... All of the home computer manufacturers led precarious existences, as evidenced by most of them failing or transitioning out of that market (and often then failing). At its peak, Commodore was notorious for being tight-fisted and spending ridiculously little money on R&D relative to its size, and that was probably what made it survive as long as it did despite serious management failures, while Apple largely survived by already going after a less cost-sensitive market segment with much higher margins.

The list of R&D efforts in Commodore that were shut down not because they were not technically viable, but because the company wouldn't spend money on them (or couldn't afford to) is miles long. I'm sure the same was true in many of the other companies of the era (but I was a C64 and Amiga user, and so it's mostly Commodore I've read up on..)

We could certainly have had far more capable machines years earlier if a handful of these companies had more money to complete more of these projects and/or if there had been demand for more expensive machines at the time.

But then progress is very often limited by resource availability, not capability.


Many of the Atmel 8-bit chips do have an external memory bus. It's not used in this case, but it's possible.

Sometimes the line between a microcontroller and a microprocessor is pretty blurry. The PIC32MZ/DA has an integrated graphics controller and 32MB of DRAM.


There's a 2.11BSD port for PIC32:

http://retrobsd.org/wiki/doku.php/start


The AVR can't execute programs from the data bus, which means no execution from RAM or external memory, though you could write a virtual machine that could do so.
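

Right: to the AVR's Harvard core, anything in SRAM is just data, so the escape hatch is interpretation. A toy dispatch loop in C shows the idea; the opcode set here is invented for illustration:

    /* Toy bytecode interpreter: the "program" lives in RAM as
       plain data and this loop executes it, sidestepping the
       fact that the AVR only fetches instructions from flash. */
    #include <stdint.h>

    enum { OP_HALT, OP_PUSH, OP_ADD, OP_OUT };

    uint8_t vm_run(const uint8_t *code, volatile uint8_t *port) {
        uint8_t stack[16];
        uint8_t sp = 0;
        for (;;) {
            switch (*code++) {
            case OP_HALT: return sp ? stack[sp - 1] : 0;
            case OP_PUSH: stack[sp++] = *code++;            break;
            case OP_ADD:  sp--; stack[sp - 1] += stack[sp]; break;
            case OP_OUT:  *port = stack[--sp];              break;
            }
        }
    }

    /* e.g. {OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_OUT, OP_HALT}
       pushes 2 and 3, adds them, and writes 5 to a port. */

It costs a lot of speed, but it's how RAM-resident code (a Forth dictionary, say) can still run on these parts.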


http://geoffg.net/maximite.html

That's a cool PIC32 home computer. The PIC32 is MIPS, btw - far beyond the old 8-bit chips, though...


Steve Wozniak himself was cheating. Instead of using proper D/A converters that cost 100 times more, he made a hack.

And the hack was good enough.

It is one of the ways in which academic training is perverse. It trains you to complicate every subject more and more, and to treat the hard work as the goal; shortcuts are not permitted.


I agree. But, you know, enthusiasts enjoy holy wars. And there's a thin, hazy line between a "true" builder and a "cheater".


What was the hack? PWM to an RC filter?


I believe it's the one mentioned in the article about how to achieve a color NTSC signal by just outputting ones and zeroes with proper timing.
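

Right, and the reason it works is that NTSC encodes hue as the phase of a 3.579545 MHz color subcarrier. Clock bits out at a multiple of that frequency and the pattern's fundamental lands right on the subcarrier, with its phase (i.e. the hue) under software control. A back-of-the-envelope sketch in C, showing that two 4-bit patterns clocked at 4x the colorburst give the same saturation but hues 90 degrees apart (the patterns are just examples):

    /* A 4-bit pattern repeated at 4x the NTSC colorburst rate has
       its first DFT bin sitting exactly on the 3.579545 MHz
       subcarrier: the bin's phase is the hue, its magnitude the
       saturation. */
    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    int main(void) {
        const int patterns[] = { 0x3 /* 0011 */, 0x6 /* 0110 */ };
        for (int p = 0; p < 2; p++) {
            double re = 0.0, im = 0.0;
            for (int n = 0; n < 4; n++) {
                int bit = (patterns[p] >> n) & 1;
                re += bit * cos(2.0 * M_PI * n / 4.0);
                im -= bit * sin(2.0 * M_PI * n / 4.0);
            }
            printf("pattern %x: hue %.0f deg, saturation %.2f\n",
                   patterns[p], atan2(im, re) * 180.0 / M_PI,
                   sqrt(re * re + im * im) / 2.0);
        }
        return 0;
    }

Shifting the pattern by one bit rotates the hue by 90 degrees, which is essentially where the Apple II's artifact colors come from.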

That raises the question of what was done in parts of the world that didn't use NTSC, though.


The other video standards (Europe used PAL) aren't much different in principle from NTSC, just some different timing and encoding parameters, so the same D-to-A techniques work. PAL is higher bandwidth (and higher quality) than NTSC, and Woz's timings were on the edge even for NTSC, so PAL versions didn't do color until a little later.


You could do a lot with period-correct SoCs like the 8052.

https://archive.org/details/BYTE_Vol_10-08_1985-08_The_Amiga...


A lot of 8-bit designs used custom ASICs that weren't generally available parts, like the Atari 2600 with the TIA, the Commodore 64 with the PLA, etc. Using microcontrollers for these seems fair.


The PLA wasn't an ASIC, but you could point to the VIC-II instead.


Meh. The spirit of a self-contained unit (display output, keyboard input, etc.) and boot-to-interpreter are there. The Maximite family of boards "cheats" even more with a PIC32. But they are popular, and have the same feel as the '80s 8-bit PCs.


Yes, but it's not about the feel of the product, which is definitely there; it's about the skills required to design it. Personally, I would never call anybody who designed his own computer, whether with an ATMEGA or not, a "cheater". It deserves respect, to say the least.


ZX81, from 1981, used 4 chips.


One of which was a ULA custom-designed for the ZX81.

The ZX80 used 21 (IIRC) off-the-shelf TTL chips.


Well, his argument is not about the number of chips but rather about component origins. If you are going to pretend that you are in the '80s developing a computer, you've got to use what was available then. I argue that the ZX Spectrum used a custom-made ULA and that an ATMEGA kind of emulates that approach to design, and he just laughs :)

The truth is that we live in magnificent times where there's a wide range of choice on how to do things and it is amazing.


That’s not much of a constraint—the https://en.m.wikipedia.org/wiki/Hudson_Soft_HuC6280 was an 8-bit CPU produced in 1987, and yet was powerful enough (in combination with the PC Engine’s 16-bit PPU) to drive very rich experiences.


I feel like it depends on the goal.

If your goal is faithful retrocomputing, then sure, an ATMEGA is out of the question vs., say, a 6502 or Z80.

If your goal is minimalism or simplicity, then an ATMEGA is perfectly viable as a way to get there.


The display controller chip seems to be a hard-to-source part if you want to stick to '80s components. Do you know what his recommendation would be for that piece?


Yes, that's been a frustration for me. I've settled on the Motorola 6847, which requires a couple of supporting chips. That means scavenging parts off eBay, though eBay seems to be a plentiful source. I'll find out shortly whether any of them are actually good.


It looks like it would take about 20 chips. [1]

[1] - https://eater.net/vga


Not an unreasonable choice, but it has to be noted that home computers of the era already used LSI/VLSI dedicated ICs for this purpose, condensing the functionality of these 20 chips into a single package.


I can do this in one chip: an ARM emulating the whole thing, or an FPGA version.


Very cool project! I'm not sure exactly why, but the idea of having a self contained computer that you plug into any display still seems appealing to me, even though I'm typing this on a convertible laptop that actually brings its own display...


Love the write-up. OSHPark PCBs - wondering how many revisions it took to get that keyboard layout working! I'd love an 8-bit-era keyboard and case that you can plug your own PCB into - I'm itching to design something from scratch, but I know what will happen if I need to do a full-cycle design process - eventual apathy :(


Direct link to GitHub project: https://github.com/74hc595/Amethyst


First of all, utterly brilliant! An effort to be lauded in every possible way!

That being said, I think there might be better ways to do the video...

Excerpt:

"So I looked in my drawers and pulled out four 7400 chips—two multiplexers and two parallel-to-serial shift registers. I could set eight pins of the 1284P in parallel and send them simultaneously to the multiplexers and shift registers, which would convert them into a high-speed serial bitstream. In this way I can generate bits fast enough to produce some 215 distinct colors on screen. The cost is that keeping up with the video scan line absorbs a lot of computing capacity: Only about 25 percent of the CPU’s time is available for other tasks."

You know, you could have this computer communicate with a Raspberry Pi over USB, and have the Raspberry Pi act as the computer's "video card". Doing this and only sending the parts of the screen that changed (like VNC) might allow many more colors at much higher resolutions, and might not tax the CPU as hard (especially when the screen doesn't need to be updated)... you know, use the Raspberry Pi as a frame buffer of sorts...
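
For the VNC-like part, the sending side would boil down to dirty-tile diffing. A minimal sketch in C (the resolution, tile size, and send_tile() are all made up for illustration):

    /* Dirty-tile diffing: compare the current frame against the
       last one sent, and hand only changed 8x8 tiles to the
       external "video card". */
    #include <stdint.h>
    #include <string.h>

    #define WIDTH  256
    #define HEIGHT 192
    #define TILE   8

    /* assumed to transmit one TILE x TILE block; knows the stride */
    extern void send_tile(int tx, int ty, const uint8_t *frame);

    void send_dirty(const uint8_t cur[HEIGHT][WIDTH],
                    uint8_t prev[HEIGHT][WIDTH]) {
        for (int ty = 0; ty < HEIGHT; ty += TILE)
            for (int tx = 0; tx < WIDTH; tx += TILE) {
                int dirty = 0;
                for (int y = ty; y < ty + TILE && !dirty; y++)
                    dirty = memcmp(&cur[y][tx], &prev[y][tx], TILE);
                if (!dirty) continue;
                send_tile(tx, ty, &cur[0][0]);
                for (int y = ty; y < ty + TILE; y++)
                    memcpy(&prev[y][tx], &cur[y][tx], TILE);
            }
    }

On a static screen nothing gets sent at all, which is exactly the case where you'd want the CPU back.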

But, that being said, I once again reiterate with respect to the work: Utterly brilliant! An effort to be lauded in every possible way!


If using an ATMEGA is considered "cheating" as some in this thread have said, then using a RaspPi as the video chip is most definitely so.


That might be true...

Not to change the subject, but I just got the following idea:

How about a distributed software infrastructure where you can use any computer as the CPU/RAM, any other computer as the video card, any other computer as the keyboard, any other computer as the network card, etc.

Basically, outsource functionalities to different computers, via something like USB, or other bus/communications technology...

This might be a little bit off-topic (apologies for this), but I needed to get this idea written down, while I still had it...


That sounds like distributed hardware, not software. And I get the feeling that it's probably been done somewhere before, even if we've never heard of it. Definitely would be neat to figure out such a system!

Certainly the network interface as a separate computer has been done, though. Just look at how the original ARPANET worked for that.


The problem here is that it's hard to get decent video output without a larger set of complex chips. If you just want serial output, this is totally doable with period-accurate chips.

There are quite a few Z80 implementations that only use a handful of chips.



I wish the Spectrum Next would get a wider release. I know the Raspberry Pi was supposed to be the hobby computer of our age, but I just really like the simplicity, combined with all of the pain-point removal, of the 8-bit age.


Every keyboard (or at least an AVR-based DIY open-source mechanical keyboard) is an 8-bit computer. I've toyed many times with the idea of adding a display to a keyboard I assembled.


Can you shave a chip or two off the count?

An MCU with integrated USB? E.g., a SAML21 chip. It also has several PWM timers, CCL, and DMA; I wonder if those could help with video generation?


That's a really odd keyboard layout: QWERTY, but omitting the spacebar and putting space under backspace.


> It uses just six integrated circuits—a CPU, USB interface, and four 7400 chips

I wonder how the source article came up with the title “just 5 chips” when the article itself mentions six throughout. Not a huge deal, but I wonder if it's a typo or there's some other reason.


If I understood correctly, the USB interface is optional, as printed on the PCB.


> Consequently, I needed a lightweight programming environment for users, which led me to choose Forth over the traditional Basic.

Excellent! I’ve always maintained that BASIC is a bad language for memory- and CPU-constrained systems.


This looks really neat. What homebrew computer kits do all of you recommend for someone who has never done this before? I would like to build a computer and program on it. I've done web dev only.


I have a bunch of atmega328p microcontrollers hanging around from a previous project. I'm so tempted to try to make some kind of bit-sliced CPU out of them...


I've always wanted to build a system from discrete parts, but that whole "thousands and thousands of transistors" thing changes my mind every time :D


Does anyone know how many chips the Sinclair ZX-81 had?


Five, but the ZX81 had a (semi-)custom logic chip, so it's not that directly comparable.


Four - CPU, RAM, ROM & ULA.


If modern hardware is going to be used, wouldn't an FPGA be a good way to replace the TTL devices, thus lowering the chip count?


I wonder, how far can these 8-bit computers be pushed in terms of overclocking the CPU / replacing it with a new CPU?


You can get a 20MHz Z80 that will overclock up to 33MHz or so. A Z8S180 runs at 33MHz and includes timers, UARTs, DMA, and an MMU. I don't know how much you could overclock one of them, but it's already pretty fast for an 8-bit chip.

Zilog also makes the ez80, which is like the Z180 but runs at 50MHz and has a 3 stage pipeline, and it has a 24-bit mode.

On the 6502 side, there's the 65C816S, which is the fully static CMOS version of the 65816, a 6502-compatible CPU with a 16-bit mode and 24-bit addressing. The Apple IIgs had one, but its clock speed was limited to 2.8MHz so as not to compete with the Macintosh. Today you could build a machine with a 14MHz 65C816S and fast SRAM, and have a pretty beastly 6502-compatible machine.


Depends on your definition.

If you don't care about socket compatibility, you could simply load an 8-bit CPU core into a fast FPGA. It would probably not be hard to run one at 100MHz or so.


Just to confirm...128MHz Z80 via an FPGA. https://sowerbutts.com/socz80/


Thanks. I meant in terms of dedicated parts, but an FPGA might do it. I think we need some starter computers for kids, and 8-bit seems quite a good choice. A lot of us started with them, and they were simple enough that we could program them ourselves.


There's a big community built around DIY Z80 systems, if you're interested in that sort of thing.

Grant Searle's breadboard CP/M machine: http://searle.x10host.com/cpm/index.html

Spencer Owen's RC2014 modular bus-based Z80 system: https://rc2014.co.uk/

Highly active RC2014 discussion group: https://groups.google.com/forum/#!forum/rc2014-z80

Phillip Stevens' beautiful YAZ180 Z180-based SBC: https://github.com/feilipu/yaz180

z88dk, a Z80 C cross compiler with libraries and all sorts of useful things, which supports many machines: https://www.z88dk.org/forum/

Jon Langseth's LiNC80: http://linc.no/products/linc80-sbc1/

Steve Cousins' many different machines, some RC2014-compatible, some z50bus (LiNC80) compatible, some Z180 stuff, SBCs, a powerful "small computer monitor", etc: https://smallcomputercentral.wordpress.com/projects/

Alan Cox's Fuzix, a unix for 8-bit machines (including many of the aforementioned Z80 machines): https://github.com/EtchedPixels/FUZIX

Most of the folks mentioned can be found on the retro-comp discussion group, which was formed from RC2014 group members to discuss other projects: https://groups.google.com/forum/#!forum/retro-comp


People have run Hitachi 63C09Es at 5 MHz. That may not sound like much, but the 6809 (of which the 6309 is an improved version) takes fewer clock cycles than a Z-80 or 808x to do similar work. OpenCores has a 6809 core, not cycle-accurate, that can run at 40 MHz if I understand the description correctly (https://opencores.org/projects/6809_6309_compatible_core).


You can't talk about minimalist 8-bit homebrew computers without mentioning the Uzebox:

http://uzebox.org/wiki/Main_Page

It's a great platform, super easy to develop for, and tons of fun. Highly recommended.


Website completely broken for EU mobile traffic. GDPR popup hidden under another popup (yay 2020) and page cannot be scrolled or viewed.


I'm not seeing that (Firefox anti-tracking?), but it's trivial to hit the reader-mode button in the address bar: it's a better experience anyway.


If you wait long enough, the obstructing popup goes away ;-)

One of the rare occasions where I endured such pain because I was so interested.


The way the website sets the title is weird. On my Debian XFCE4 toolbar, the title shows on two lines.

https://imgur.com/ADhL9Aj.png


Looks ok for me, with adblock+umatrix: https://i.imgur.com/UABzai9.png


> GDPR popup

I will die on that hill, but that "GDPR popup" is actually a pre-GDPR, obsolete cookie-law popup that is no longer compliant with GDPR (GDPR requires the ability to deny cookies) and should be removed, as it amounts to nothing and is pretty much a dark pattern. (I upvoted you, btw, because I agree 100%)


Also completely broken from (I think) US residential IP addresses (wget'ed HTML does not contain the page text at all).



