How early 8-bit arcade machines differed in design from 8-bit home computers (floooh.github.io)
178 points by pmarin on Oct 9, 2018 | 51 comments



Great article. Wish he'd named some of those "obscure Eastern European computers". I know there are good communities around the Japanese microcomputers - the Sharp X1, the PC-8801, the PC-9801, MSX, MSX2, etc. http://fullmotionvideo.free.fr/phpBB3/index.php?sid=88c8ee18...

There are some excellent games buried away on them. This series of books in particular is amazing:

https://www.amazon.com/Untold-History-Japanese-Game-Develope...

Anyone know what the author is referring to?


You can check out some of the East German 8-bit computers and games for them on my tiny emulator page https://floooh.github.io/tiny8bit/:

- KC85/2 to /4: these were 'original' designs, even though some ideas (like the separate pixel/color attribute RAMs) were inspired by the ZX Spectrum. A slower Z80 CPU (1.75 MHz), but higher pixel and color resolution (e.g. a 320x256 display, compared to the Spectrum's 256x192)

- the Z1013 was a very minimalistic Z80 home computer sold as an assemble-yourself kit (the only East German computer that was actually available and affordable for 'normal people')

- the Z9001 (aka KC85/1, aka KC87) was somewhere in between the Z1013 and the KC85/2 models

- the KC Compact was an East German Amstrad CPC clone, but without the custom chip (the CPC's custom gate array was 'emulated' with standard parts)

The Soviet Union built a very interesting PDP-11 compatible 16 bit home computer (https://en.wikipedia.org/wiki/Electronika_BK), and impressive demos are still written for it: https://www.youtube.com/watch?v=u_pdp1QSp70.

There were also a surprising number of ZX Spectrum clones built in Eastern Europe, because the Spectrum didn't require custom chips. Some of those designs also improved on the original (for instance with a higher color resolution).

Bulgaria built an Apple II clone (and also built their own MOS 6502 and Motorola 6800 clone chips): https://en.wikipedia.org/wiki/Pravetz_computers

...and there's probably at least a dozen more computer models I have never heard of :)


Ah, that's awesome! Thanks!


> And then for the rest of the year spend all my after-school time trying to build rather poor impressions of those arcade games on my home computer, not unlike a cargo-cult worshipper on a post-WWII pacific island trying to build a US military radio station from wooden sticks.

Hahaha, this matches my own experience exactly :) Playing the likes of Dragon Ninja / Bad Dudes and then trying to reproduce them on the Spectrum. The few arcade games that did have Spectrum versions were disappointing - thinking of Green Beret for example.

Best memory might be finding a POKE for infinite lives in Commando, playing it for 3 days straight until level 70-something (the levels just looped every 5 or 10 stages, I think?), and ending up with a burned CRT. I'll never know whether the infinite lives POKE also made the game loop, or the developers never thought of including an ending in the Spectrum port.


It was rather rare for games to have an "end" in those days. Generally there wasn't much space for something so extravagant. More often you would overflow a value somewhere and semi-crash the game.


Unless you finally got to the end of Karateka and you walked to the princess....


Commando on the C64 didn’t have an ending. It just continued. I know because once you got good enough you could complete the stages effortlessly. My longest play was a couple of hours straight - enough to restart the highscore counter from zero at least once. We stopped playing not because we ran out of lives, but because we decided to stop.

Other games would get faster and faster, eventually ending up impossibly difficult.


What's wrong with the Spectrum's Green Beret? And the R-Type Spectrum conversion was very good.


Nothing wrong with it, I spent endless hours playing it. But I saw the arcade version only afterwards, and I went "oooooh, colours!"

The "native" Spectrum games were better IMO. Sir Fred comes to mind. Pentagram, Gunfright, Knight Lore, Abu Simbel Profanation. These worked with the limitations of the hardware (most notably the colour clash), not against them.

I mean, there was even a Spectrum version of Operation Wolf, and it was kind of atrocious (but still unbelievable that they made it work on such modest hardware!)


Funny you mentioned Operation Wolf; that's a game I really like and still play in an emulator from time to time.


Definitely not the same without the light gun :(


It didn't magically make the Spectrum have more colors or better sound, like the arcade game had.


Maybe, but I did "waste" countless hours with it.


He made me smile when he said: "there’s very little information for Bomb Jack on the web"

Hardware schematics, a dumped ROM, the source of a working emulator... what more do you want? :)


Compared to the wealth of information and research that's available for popular machines like the CPC, ZX or C64, that's indeed not much :) But true, since Bomb Jack is only built from well-known standard parts, the schematics are all that's needed (although there are some minor problems in them, like missing or faulty labels).


The piece is at least as much about how emulator design can differ between emulators for arcade machines and emulators for 8-bit home computers.

Home computer emulators need to be able to handle all the software you can throw at them, while an arcade emulator only needs to handle the single game ROM which ran on that hardware. It doesn't need to emulate subtle hardware behavior which never manifests in the game's output.

(Of course, the situation is different for arcade machines which could take different ROMs.)
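
To make that concrete: a minimal sketch (in C, with hypothetical addresses and sizes) of how a single-game arcade emulator can hardwire its entire memory map at compile time - something a home computer emulator can't do because of banking, expansions, and arbitrary software:

    #include <stdint.h>

    static uint8_t rom[0x8000];    /* game code, dumped from the board's EPROMs */
    static uint8_t ram[0x1000];    /* work RAM */
    static uint8_t vram[0x0400];   /* tilemap RAM */

    uint8_t mem_read(uint16_t addr) {
        if (addr < 0x8000) return rom[addr];            /* ROM:  0000..7FFF */
        if (addr < 0x9000) return ram[addr - 0x8000];   /* RAM:  8000..8FFF */
        if (addr < 0x9400) return vram[addr - 0x9000];  /* VRAM: 9000..93FF */
        return 0xFF;                                    /* open bus */
    }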


Not to mention that home computers had a different set of devices to handle as well - e.g. cassette recorders, floppy drives and later small hard disks - often with their own controllers which had to be interfaced with the system somehow (using isolated port I/O on CPUs like the Z80, and memory-mapped I/O on others like the 68000).
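
A rough sketch of the two I/O styles (port and address numbers are made up for illustration): the Z80's IN/OUT instructions address a separate port space, while on a 68000 the device registers simply occupy normal memory addresses:

    #include <stdint.h>

    /* Hypothetical device helpers, stubbed out for the sketch: */
    static uint8_t fdc_status(void)          { return 0x80; } /* "drive ready" */
    static uint8_t chip_reg_read(uint32_t r) { (void)r; return 0; }
    static uint8_t ram_read(uint32_t a)      { (void)a; return 0; }

    /* Isolated I/O (Z80-style): a separate port space, reached only via
       the dedicated IN/OUT instructions. */
    uint8_t io_read(uint8_t port) {
        switch (port) {
            case 0x01: return fdc_status();  /* made-up floppy controller port */
            default:   return 0xFF;
        }
    }

    /* Memory-mapped I/O (68000-style): device registers occupy normal
       addresses, so any load/store instruction can reach them. */
    uint8_t mem_read(uint32_t addr) {
        if (addr >= 0xD00000 && addr < 0xD01000)  /* made-up register window */
            return chip_reg_read(addr & 0xFFF);
        return ram_read(addr);
    }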


I will forever wonder why 8-bit computers of the 80s didn't have more powerful video hardware. It's not that the technology wasn't available, since most arcade machines at the time had far more powerful graphics hardware than 8-bit home machines.

(I will also forever wonder why Sega's 16-bit console didn't have hardware sprite scaling and rotation, since that was the main attraction of Sega games at the time...)

I don't think it was only economic reasons, because I really doubt it would have been that much more expensive to have a few tiled background layers, a few more sprites on the screen and a few more colors at a decent (i.e. not 160 pixels wide) resolution...


Well it almost certainly was cost. Memory was expensive. Address space was limited to 64k. And most importantly, users' displays were color TVs with terrible resolution. Arcade games shipped with high-quality CRTs. Back in the 80s very few of us had that luxury. The display for my VIC-20 was a 9" B&W TV that my parents had lying around.

Also if you look at something like the Atari 800, it had at least a subset of what you're talking about as far back as 1979. As a result it was more expensive than the competitor machines, and also many software developers didn't make good use of the features that were there.


Our TVs were fine for the resolution of, for example, the ZX Spectrum. C64, Atari and Amstrad CPC graphics looked extremely blocky on our televisions/dedicated monitors (well, the Amstrad had such a beast).

I had connected all of the above (Spectrum, C64, Atari) first to a 40 inch B&W TV, then to a 32 inch color TV. Guess what? The Spectrum image was visibly blocky, and the C64/Atari image was incredibly blocky.

Furthermore, you didn't need another 64k for video RAM; there were a lot of palette tricks that could be done. And even if 64k of RAM had been added, the price difference would have been small in the end.

So I don't buy that it was the cost; more probably it was shortsightedness on the designers' part.


It would have been quite a lot more expensive, I assume? Roughly speaking, with 2MHz RAM, and assuming a setup where the CPU and the display system alternate (so 1,000,000 bytes/second bandwidth for each), you have a budget of 40 bytes for the visible part of the line, and large buffers are expensive, so it's just a question of how you spend it. Take the C64 as an example: you've got 320 px mono (40 bytes/line), 160 px 2bpp (40 bytes/line), or 2 40-byte character modes (40 bytes/line). I assume the fetches for the 8 sprites fit in the 24 bytes that are left, as each is 3 bytes wide.

Later machines had more bandwidth and more complicated memory systems, but the limitations are still the same. You have a fixed number of bytes you can fetch from RAM per second, the TV scans out at a certain rate, and that's going to limit what you can do.
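
Working that budget out explicitly (a back-of-the-envelope sketch in C, assuming PAL line timing):

    /* Back-of-the-envelope video bandwidth budget for a 2 MHz shared-RAM
       system where the CPU and the video system alternate memory cycles. */
    #include <stdio.h>

    int main(void) {
        double video_bytes_per_sec = 2000000.0 / 2.0;  /* video gets every other cycle */
        double pal_line_usec       = 64.0;             /* one PAL scanline */
        double visible_usec        = 40.0;             /* roughly the visible part */

        printf("bytes per full line:    %.0f\n", video_bytes_per_sec * pal_line_usec / 1e6);
        printf("bytes per visible line: %.0f\n", video_bytes_per_sec * visible_usec / 1e6);
        return 0;  /* prints 64 and 40 - the 40-byte visible-line budget above */
    }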


Arcade games could also more readily use ROM, so tile and sprite data could just flow in parallel to the rendering subsystem -- without worrying about sharing with the CPU or refresh cycles.


That's assuming you're reading all the layers from the same ram. You could increase memory bandwidth by using separate chips for each layer.


Memory usually accounted for a majority of the cost in those days, so there's a lot to be said for a unified memory system in home computers. The standard memory architecture for home computers from 8-bits like the C64 to 16-bits like the Amiga was unified memory with half the memory cycles each going to the CPU and graphics, and PCM playback on later systems could conveniently steal a few graphics memory cycles during hblank. (The C64's VIC-II also stole CPU memory cycles every 8 scanlines and could be forced into stealing a lot more bandwidth with the 'badlines' technique.)

ROM-based arcade machines and consoles generally had memory parallelism for their graphics systems. It puts heavy pressure on the chip pin-out and it requires you to physically allocate the RAM/ROM chips onto different buses. I guess theoretically you could put all the memory chips on multiple buses with a configurable arbiter in front of each so you could change the allocation per application, but the PCB for that would be massive and expensive. Instead there was usually a slow, awkward way to access memory from a different bus, e.g. the graphics bus could be accessed by the CPU only with DMA or a pair of memory-mapped addr/data registers with multi-cycle wait states.

So, this kind of architecture isn't really suitable for a general-purpose home computer unless you can afford a large allowance of memory for each bus, but it did exist on high-end home computers like the X68000 that were less cost constrained.

Also, the Amiga did have the option for a kind of RAM expansion called Fast RAM which was only accessible to the CPU, unlike normal Chip RAM which was accessible on the standard chipset bus to the audio and graphics chips in addition to the CPU. Fast RAM didn't speed up the graphics system but it prevented the CPU from stalling when the other chips wanted to steal CPU memory cycles on the chipset bus (they always take precedence since they have hard real-time constraints).
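
A toy sketch (in C, heavily simplified) of the half-and-half cycle split described above; real chipsets arbitrate per memory slot and per chip, and 'steal' here stands in for things like C64 badlines or Amiga DMA contention:

    #include <stdint.h>

    /* Hypothetical helpers for the sketch: */
    static void video_fetch(void)        { /* fetch tile/sprite/bitmap data */ }
    static void cpu_run_one_cycle(void)  { /* advance the CPU one memory cycle */ }

    typedef struct {
        uint64_t cycle;
        int video_steal;  /* set while video needs extra slots (badline/DMA) */
    } bus_t;

    /* Even memory cycles go to video, odd ones to the CPU - unless the video
       side asserts 'steal', in which case the CPU simply stalls. */
    void bus_tick(bus_t* bus) {
        if ((bus->cycle & 1) == 0 || bus->video_steal) {
            video_fetch();
        } else {
            cpu_run_one_cycle();
        }
        bus->cycle++;
    }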


Yes, and the C64's colour RAM is a good example, because accesses to it come for free on top of the 64 bytes/line from the main RAM. But more chips is exactly the sort of thing that would make it more expensive. And I assume it would also have been more complex to build, or something - because if it were cheap and easy, surely people would have done it?


64 bytes per scan line x 200 scan lines = 12,800 bytes.

1,000,000 / 12,800 = 78.125

i.e. the whole frame could be updated 78 times per second at that speed.

Since 78 frames per second isn't really needed, the horizontal resolution could actually be doubled:

128 bytes per scan line x 200 scan lines = 25,600 bytes.

1,000,000 / 25,600 = ~39 frames per second.


It's my understanding that the reason arcade boards were so much more powerful is that they were a lot more complex, and thus much more expensive to design, source components for, and produce at scale.

More colors, background layers and sprites with the slow CPUs and graphics chips of the 80s usually meant several RAM chips, so that multiple memory buses could access video memory in parallel, plus more video memory overall. That's why arcade boards were usually very large and had a lot more components than home computers and home video game consoles.

Sprite scaling was added to Sega's 16-bit console with the Mega CD. It probably wasn't included in the original console so Sega could beat the competition and launch at a competitive price in 1988, before Nintendo could launch its own 16-bit console.


All of this, plus an arcade machine was specialised. BomberMan needs 39 sprites?[1] Build the hardware to do that! Super Racer needs 3D polygonal graphics? Add a hardware scanline rasterizer. To put all these tricks into a home computer, though, would have made it cost many times as much as an arcade cabinet.

[1] Not real numbers.


The hardware budget for an arcade machine could be higher because the economics are to sell a relatively small number of units at relatively high unit cost. On the other hand the hardware development budget was more limited than for consumer home computers or consoles, as that is spread over only the small number of units.

The effect is that arcade machines almost never had custom chips, but often had duct-taped-together designs that threw in extra chips to accomplish tasks.

Contrast that with a machine like the Commodore 64 or the NES, which had custom sound and graphics chips but where the overall chip count and unit cost were ruthlessly controlled.


The sprites and backgrounds use the same memory as the main CPU, so more sprites and more complicated backgrounds mean the CPU gets stalled more.

On the C64, for instance, the first line of a character row (so one scanline in every 8) is when the graphics chip fetches the list of characters for that row; that's a 'bad line' where the CPU gets far fewer cycles. And then the CPU is stalled for 50% of the remaining time so the graphics chip can fetch the character bitmaps indexed by the data fetched during the bad line.
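
The condition for those bad lines is well documented (e.g. in Christian Bauer's VIC-II article); roughly, as a C predicate:

    #include <stdbool.h>
    #include <stdint.h>

    /* A line is a 'bad line' when the display is enabled, the raster line is
       in the visible character area, and its low 3 bits match the YSCROLL
       register (which is why vertical fine-scrolling shifts where the bad
       lines fall). */
    bool is_bad_line(uint16_t raster, uint8_t yscroll, bool display_enabled) {
        return display_enabled
            && raster >= 0x30 && raster <= 0xf7
            && (raster & 7) == (yscroll & 7);
    }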


The first time I got my hands on a Z80 clone it was like "my precious". The AY-3-8500, for example, was not unheard of, but was unobtainium in every respect.

If you're considering clones in Eastern Europe, I think the real reason was economic. It was useless to design a complex circuit when the devices were not available. For this reason, most early clones of the Sinclair Spectrum used only 7400-series chips around the Z80.


Some 8-bit arcade systems also had multiple CPUs; IIRC Galaga had 3 Z80 CPUs working together, and I think Qix had 2. I would have loved a dual- or triple-CPU 8-bit computer in the 80s.
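
Emulating such a board mostly comes down to interleaving the CPUs in short time slices against the shared RAM; a simplified sketch in C (real emulators also have to model the inter-CPU interrupts and shared-RAM handshakes):

    /* Run three CPU cores in short round-robin slices so their communication
       through shared RAM stays roughly in order. Shorter slices are more
       accurate but slower. */
    enum { NUM_CPUS = 3, SLICE_CYCLES = 64 };

    typedef struct { /* registers, program counter, ... */ int pc; } cpu_t;

    /* Hypothetical core stepper: runs ~'cycles' cycles, returns cycles used. */
    static int cpu_run(cpu_t* cpu, int cycles) { (void)cpu; return cycles; }

    void run_frame(cpu_t cpus[NUM_CPUS], int cycles_per_frame) {
        for (int done = 0; done < cycles_per_frame; done += SLICE_CYCLES) {
            for (int i = 0; i < NUM_CPUS; i++) {
                cpu_run(&cpus[i], SLICE_CYCLES);
            }
        }
    }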


Arcade cabinets are the wrong comparison. Better are home consoles, in particular the NES/Famicom. A fine-scrollable tilemapped background + hardware sprite multiplexing made it trivial to develop games for the Famicom that would be technological marvels on contemporary home computers, yet no western computer manufacturer came up with that design, despite these being small additions on top of text-mode graphics and hardware sprites.


Some western microcomputers did have some of that, though. The C64 did have hardware scrolling, which is why you see a higher calibre of scrolling games on that machine than on the Amstrad CPC 464 (which did not have hardware scrolling).

It all comes down to expense though. You could throw the whole kitchen sink into the computer but then how much would it cost to manufacture and how much would you need to sell the thing for?

Costs needed to be cut somewhere, and microcomputers additionally had to ship an interpreter ROM (typically some dialect of BASIC), more expansion ports (for printers, serial modems, memory expansions, additional storage devices, etc.), a physical keyboard, and so on.

Plus, let's also not forget that while many micros were sold as games machines, they were originally built as hybrid devices for doing work, finances, etc. as well as recreation.


That sprite hardware also gave the C64 a reputation as a "game machine" in an era where the big money was in business sales.

There was a lot of "serious business only" attitude around many of the computers in those days.


The hardware scrolling in the C64 is pretty trivial: it can only shift the screen by up to 7 pixels horizontally and vertically, and mask the borders a bit. If you want continuous scrolling you have to move the memory contents around, which costs a lot of CPU time.

On a console you typically have a circular buffer and the screen is a movable window onto it, so you only have to paint one new row or column each time.
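
The difference in C, roughly (a sketch with made-up dimensions): the console-style tilemap just wraps its indices, while the C64-style approach has to physically move screen memory every 8 pixels of scroll:

    #include <stdint.h>
    #include <string.h>

    enum { MAP_W = 64, MAP_H = 32, SCREEN_W = 40, SCREEN_H = 25 };
    static uint8_t tilemap[MAP_H][MAP_W];       /* console-style: larger than the screen */
    static uint8_t screen[SCREEN_H][SCREEN_W];  /* C64-style: exactly the screen */

    /* Console-style scrolling: move the window by changing scroll registers;
       reads wrap around, nothing is copied. */
    uint8_t window_tile(int scroll_x, int scroll_y, int x, int y) {
        return tilemap[(scroll_y + y) % MAP_H][(scroll_x + x) % MAP_W];
    }

    /* C64-style coarse scroll left by one character: shift every row by hand
       (on the real machine this copy burns a big chunk of the frame's CPU time). */
    void coarse_scroll_left(void) {
        for (int y = 0; y < SCREEN_H; y++) {
            memmove(&screen[y][0], &screen[y][1], SCREEN_W - 1);
            screen[y][SCREEN_W - 1] = ' ';  /* new column enters on the right */
        }
    }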


The Atari 8-bit could fine-scroll horizontally and vertically. In 1979.

https://www.atarimagazines.com/compute/issue67/338_1_Atari_F...


Which is the same circular buffer. Great if you can spare the memory: it uses twice as much if you want to scroll in one direction, or four times as much if you want both directions.

On the C64 that's 8k, which is a lot when 16k is all you get (the video chip only has enough address lines to address 16k).


Still, it wasn't like you had to push all the pixels. You only moved 8x8 (or whatever it was) characters. So the C64 was pretty decent, actually. And of course you only had to shift characters once every 8 pixels.

I was envious as hell on my MSX, which had no pixel shifting at all.


It’s very helpful and the implementation is ingenious but not very complicated.

Vertical scrolling changes the timing of the bad lines, and horizontal scrolling the timing of the bitmap data fetches, and this causes the image to come out at a different position.

There are a lot of creative tricks you can do with this though, especially with vertical scrolling.


> (I will also forever wonder why Sega's 16-bit console didn't have hardware sprite scaling and rotation, since that was the main attraction of Sega games at the time...)

An interview with one of the hardware designers of the Mega Drive/Genesis VDP suggests they wanted this, but that they didn't have enough die space. It's also not clear how comparable to their arcade hardware this would actually have been even if it had been present. Scaling and rotation require non-sequential access to the source data, which the VRAM used in the MD/Gen is not very good at. My guess is that the limitations would have been similar to what you saw with the scaling hardware on the Sega/Mega CD (i.e. a relatively low frame rate due to the limited bandwidth available for updating VRAM).
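
To see why rotation breaks sequential access: an affine-mapped scanline walks the source texture along a rotated vector, so consecutive output pixels read scattered source addresses (a sketch in C with a hypothetical texture layout):

    #include <stdint.h>

    enum { TEX_W = 256, TEX_H = 256 };
    static uint8_t texture[TEX_H][TEX_W];

    /* Render one scanline of a rotated/scaled layer. u/v step by the rotated
       per-pixel deltas, so the reads hop around in the texture instead of
       streaming linearly - exactly what cheap burst-oriented VRAM is bad at. */
    void affine_scanline(uint8_t* dst, int width,
                         int32_t u, int32_t v,           /* 16.16 fixed-point start */
                         int32_t dudx, int32_t dvdx) {   /* 16.16 per-pixel deltas */
        for (int x = 0; x < width; x++) {
            dst[x] = texture[(v >> 16) & (TEX_H - 1)][(u >> 16) & (TEX_W - 1)];
            u += dudx;
            v += dvdx;
        }
    }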


I don't buy that at all either. They could have made the console a bit bigger, and a bit more flexible inside.

I wasn't asking for the full Power Drift arcade experience, just something that could approximate it to a certain degree.


> They could have made the console a bit bigger, and a bit more flexible inside.

They could have gone to a 2-chip solution for the VDP and implemented scaling, but it would have substantially increased costs. Now, Nintendo did notably go with a 2-chip solution for the PPU, but they were using a lower-cost CPU, launched 2 years later, and sold at a higher price.

There are other places they skimped that hold up much worse than dropping scaling hardware. For instance, having seen photos of the VDP die it seems pretty clear they could have made a nice improvement to the palette hardware if they took more time to optimize the layout (though I imagine that could have taken quite a while with 80's IC design tools).


Don't forget that arcade machines capabilities could be specialized. Consoles, and even more so, home computers, had to be a lot more general-purpose.


> The entire remaining space on the mainboard is dedicated to the video decoding hardware (besides a couple of RAM and ROM chips).

This wonderful post points out a fascinating cultural shift that I've never seen written up but is visible in the traces of various papers.

Although there was a Cambrian explosion of computer designs in the 1960s and 1970s, they did follow a common evolutionary path, to wit: there really was a central processing unit, and around it fed various peripheral parts, primarily memory, I/O (channel controllers) and the like.

A lot of the vocabulary and assumptions of minicomputers and microprocessors came from this world, though the big difference in the late 60s was letting the CPU do some of the I/O (and one of the weird things about C, fundamentally a minicomputer language, was that it didn't have I/O keywords or commands).

So when the Alto was designed one of the biggest weird things about it was not the bitmapped display itself but its support: the bus speed was only 3/2 the screen refresh rate! That blew people away (in fact if you wanted to do a lot of computation you'd end up blacking out most of the screen for a while).

==

The games developers went the other way: they started with hardcoded logic and only later were able to use MPUs. So to them, they'd just accelerate part of the design with the computer. You can even see this as the domains started to merge; the early Atari computers like the 400 and 800 weren't far from what we'd today call game consoles, with built-in sprite hardware that you could drive from your programs.


I've always been interested in the 128-bit Naomi hardware which Sega used around the time of the Dreamcast. Cabinets were supposedly hooked together via fiber - this is how all the Daytona USA machines at Disney were connected together. There is a picture of two motherboards connected on this site: http://sega-naomi.com/hardware.htm and I wonder how different their protocol is from actual computers.


What is supposed about it? There are pictures right there. That hardware is mostly off-the-shelf components (SH-4 CPU, PowerVR GPU, ARM-based sound DSP); the thing is basically just an "actual computer."


Daytona USA used the Sega Model 2 arcade hardware, while Daytona 2 used the Model 3. Neither used the Naomi (a beefed-up Dreamcast of sorts, modified for arcade use). Multiple machines could be networked together; I don't know how the older Model boards did it, but the Naomi used a standard RJ45 network port (at least for netboot) and it was standard TCP/IP stuff.


And a direct link to his WebAssembly version of the emulator: https://floooh.github.io/tiny8bit/bombjack.html

Took longer to boot than I expected, but after that it played quickly.


The second-stage music of Bomb Jack is clearly a rendition of Lady Madonna by The Beatles.


Ah, Bomb Jack. I loved playing this game on my Spectrum, had a MAME version in 2001 while studying my masters, and now it runs perfectly in a browser. Amazing. As is often the way with procrastination I became pretty skilled at my chosen distraction, memorising all the patterns with which to get a high score. Let's see how much I can remember!



