I really like how cartridges on the old systems were essentially equivalent to a PCI expansion card in a PC. They were directly connected to the bus and could do essentially anything. Sadly, that practice ended after the Game Boy Advance, and everything since the Nintendo DS has been purely data storage.
This leads to crazy modern enhancements like a Raytracing chip[1], or the MSU1 enhancement chip that is AFAIK not available as an actual physical chip, but only in software emulators. But it would be theoretically possible to manufacture, so you could have an actual physical SNES Cartridge of Road Blaster[2].
On the article itself, I noticed that his list has "Street Fighter Zero 2" as a USA ROM - that seems incorrect, since Street Fighter Zero is what Street Fighter Alpha was called in Japan. So Zero 2 should just be the Japanese version of Alpha 2. (Also, thanks for linking to the MVG video that debunks the myth that the delay before each round is caused by decompression)
There is also this entirely crazy "NES reverse emulation"[1]. Here someone replaces the cartridge with a modern computer and then... weird things start happening, such as using a NES to present what is effectively a PowerPoint presentation about humor.
I did that with PalmOS - a card that you insert into the original Pilot (16MHz 68k) and it takes over, running PalmOS 5 on a 200MHz ARM chip and using the Pilot as a display and touch panel. I also did it over the Memory Stick I/O protocol (Ctrl+F for "MSIO").
As someone who got into the PalmOS world just before OS 5 was shown off, and then spent years waiting to upgrade to the OS 6 device that never came, I've been fascinated by your project since I first came across it. Amazing work!
Amusingly the device that replaced my classic Palm was an Axim X3, which if I hadn't sold it to buy a server in 2005 would probably have already had your rePalm build installed on it.
That's not entirely true. A few DS carts at least had IR receivers (e.g. Pokemon HeartGold). I think Learn with Pokémon: Typing Adventure adds Bluetooth through the card as well. So there was some limited capability to add features - not quite as exciting as extra CPUs, but it's not like any GBA cart did anything super exciting there either.
Not that it was an official licensed product, but for what it's worth, there was a DS flashcart that bundled a significantly faster CPU than the DS's own - the SuperCard DSTWO [1]. The extra CPU (Ingenic Jz4740, MIPS) was primarily used for GBA emulation on DS/DSi systems without the need for a GBA slot passthrough flashcart, as was otherwise required - though there was a quite successful homebrew scene around it at the time as well, up to and including stuff like a (proof of concept) port of a PS1 emulator [2].
It was a surprisingly impressive device! The only downsides, as I recall, were the increased battery drain and the fact that the DS Slot-1 bus was only fast enough to allow streaming video output from the cartridge to the system's displays at ~45FPS, irrespective of the rate that the emulator was actually running at internally.
There are some that use the GBA slot for expansion too. The DS browser uses it for a memory expansion, and there's a Guitar Hero game that uses it for some extra buttons. When you boot the console with one inserted in Manual Mode, it calls it a "DS Option Pak", and there are a lot more than I thought: https://nintendo.fandom.com/wiki/DS_Option_Pak
It's like the difference between slotting ROM into a DDR5 slot[1] vs a SATA port. You can still add features by means of fake disk I/O, but that isn't the same as directly interfacing with the CPU bus.
1: Perhaps a better analogy would be socketing the ROM as its own chiplet, if that ever made sense.
Guitar Hero ("On Tour") for the DS added additional controller buttons to allow you to hold it like a Guitar Hero guitar and use those buttons in the game. It was a really cool approach.
There was at least one Game Boy cart with fun features: Aprilia had a GBC cart[0] that had an interface cable for diagnostics on one of their scooters:
Now I am curious how some games could have an IR transmitter inside the cartridge, like Pokemon SoulSilver. Did they plan for this specific use case, or design a whole channel for limited cartridge expansion components?
That's actually a pretty good question. The NTR-031 IR cartridges were used for a few games, and there was also one for Motion Sensors and a TV Antenna.
From what I can see, it looks like even though they can't extend the functionality directly, they do still interface with the system in some way. Perhaps they are more like a device connected to a serial port would be? So far less capable than the full extension cart, but still possible to communicate with through a standardized protocol? (The DS Cartridge slot has 8 data pins rather than the full address/data bus, but seems to have a protocol: http://problemkaputt.de/gbatek.htm#dscartridgeprotocol)
(Though if someone with more knowledge can correct me, please do so. I thought that DS/3DS/Switch are pure flash chips, but was wrong at least about the DS)
Hmm. Crazy thought, but there must be some sort of save-data function. I wonder if one could expose a RAM chip (as opposed to the short-lived flash or EEPROM of back then) as a storage device with some fake filesystem, then have the software write updates to that.
As RAM, you could read the data on the cartridge, modify it, and thus fake bidirectional comms?
Even if the bus/protocol is limited to "only" disk I/O, you could still have the controller interpret reads/writes to certain addresses as requests for other actions, including interaction with other hardware on the card
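To make that concrete, here is a minimal sketch of such a controller - every name, block number, and the "LED" command are invented for illustration, not based on any real cartridge:

```python
# Hypothetical sketch: a cartridge controller that only exposes block
# reads/writes, but treats one reserved block as a command mailbox.
MAILBOX_BLOCK = 0xFFFF0  # invented "magic" block number

class CartController:
    def __init__(self):
        self.storage = {}     # ordinary save-data blocks
        self.led_on = False   # pretend extra hardware on the cart

    def write_block(self, block, data):
        if block == MAILBOX_BLOCK:
            self._handle_command(data)   # intercepted as a command
        else:
            self.storage[block] = data   # normal storage write

    def read_block(self, block):
        if block == MAILBOX_BLOCK:
            # reads of the mailbox return hardware status, not stored data
            return b"\x01" if self.led_on else b"\x00"
        return self.storage.get(block, b"\x00")

    def _handle_command(self, data):
        if data[:1] == b"L":             # invented "LED" command
            self.led_on = bool(data[1])

cart = CartController()
cart.write_block(MAILBOX_BLOCK, b"L\x01")  # looks like a disk write...
print(cart.read_block(MAILBOX_BLOCK))      # ...but it toggled "hardware"
```

The host only ever sees storage traffic; the extra behavior lives entirely on the cartridge side of the protocol.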
The interface to the gaming system might be strictly data, but you've got power supplied to the cartridge; you can put whatever hardware you can fit inside that cartridge still
Widely believed to be because Mike Bison was a little too on-the-nose for a heavyweight boxer and the shuffle was the easiest way to make a last minute change.
Someone already mentioned Mike Bison, but FWIW, in the fighting game community - which includes people from Japan and outside of Japan - those characters are generally called "Boxer", "Claw", and "Dictator" specifically because of the localization changes, to give them unambiguous names.
At least Sagat kept his name, though "Tiger bullshit" would've been a great moniker, and no, I'm not salty about losing to his projectile once too many.
Another detail not mentioned here is that even carts which didn't contain enhancement chips had different levels of performance - the SNES CPU nominally ran at 3.58 MHz, but it would only actually run at that speed with a "FastROM" cart inserted. Nintendo also offered publishers a cheaper "SlowROM" cart format, which forced the CPU down to just 2.68 MHz when accessing the cartridge.
There is a community of modders developing patches to turn SlowROM games into FastROM games to alleviate slowdown. I read somewhere that some SlowROM games appear to have been developed for FastROM in the first place, but were converted to SlowROM at the last minute due to penny-pinching demands by the publisher.
Out of This World is a SlowROM/FastROM case where the dev was not allowed to use FastROM. IIRC they claimed using SlowROM saved a whopping 50 cents a cartridge in that case.
Probably. For a game selling at $60, assuming Nintendo took half and the publisher took the other half, it's about 1.7% of the publisher's share. Still significant, as the publisher has other costs.
The thing that blows my mind about the SNES is that even its onboard RAM can't be accessed at the nominal 3.58MHz clock; the system slows down for that.
Always confused me why the SNES was so paltry with its speed when the competitor TurboGrafx-16 usually ran at 7MHz, and also had a 6502-family CPU that required similar memory timing. But the TurboGrafx flopped (in the west) and the SNES was a hit world-wide, so I guess they did something right.
> But the TurboGrafx flopped (in the west) and the SNES was a hit world-wide, so I guess they did something right.
Still have all my old consoles, including the TurboGrafx-16 and SNES. It was all about the software. Mind you, this was during a time when games were $50 each, which today is something like $100. If you wanted to sample a game, you hoped a friend had it so you could borrow it, or that the local video store had it to rent. If not, it was a crapshoot, and game review magazines were a staple.
The TG16 had nothing to compete with. I only remember the popular side-scrollers like Bonk's Adventure or Splatterhouse. The rest were "weird" games no one was interested in. We had about 7 or 8 games before we gave up on it. I had, I think, one other friend who had one, and he also gave up on it with just a few titles.
I personally think that Devil's Crush represents the very best of video pinball, and that's a TG16 exclusive.
There was a time around the turn of the century when some friends and I shared some studio space, wherein we had a long-lasting (years!) high-score competition for that game. (Scores had to be witnessed, and were then written on the wall by the door.)
I definitely lucked out on my dad being in touch with the technology at the time. I was ready to get a TurboGrafx-16 just for Bonk's Adventure, and I definitely would have missed out over that purchase. The SNES was so popular that I had many friends who were willing to trade games with me, so we were able to experience new games without the stiff price of having to purchase everything we played. That's one of the things gaming companies made sure to kill off as things turned fully digital.
I still want to play all of the Splatterhouse titles, though. I didn't find out about those until I was older, but I've watched plenty of walkthroughs of the various iterations, and it looks like the perfect mix of strategy, gore, and proper timing to master.
SNES had 3 background layers and color math/transparency, TurboGrafx 16 was stuck with one background layer. SNES also had much better sound capabilities.
I'd say TurboGrafx's biggest advantage was that they had a full handheld version of the system available very early on. The Turbo Express crushed the Game Gear and Lynx in terms of power. (The Game Boy still beat all its more-powerful competitors because of its far lower battery consumption)
I'd say the Game Boy also won in terms of library of games, and the popular properties owned by Nintendo, such as Super Mario Bros., Metroid, The Legend of Zelda, among many others.
Yes, but that was due to its popularity, not the cause of it. There is quite a bit of network effect, but that only comes into play once you have the network.
The Game Boy won because it was half the cost of any of its competitors, while being more than half as good.
Technical specifications rarely have to do with whether or not a console succeeds, as long as they don't affect MSRP too much. Price and game library are typically the most important aspects. Or ease of piracy in some territories.
Always wondered if that was just an arbitrary licensing distinction made up by Nintendo, or whether there were actual differences in cart data reading speeds. At the end of the day, all SNES carts sported mask ROMs.
At various points it was very hard to get ROMs made due to shortages. It may have been an attempted solution to that: you couldn't get fast, but you could get slow.
The ROM chips were literally lower specced / worse tolerances, and thus cheaper. FastROM chips were guaranteed (by the manufacturers) to yield stable results in under 120ns, versus 200 for SlowROM.
SNES emulation is extremely accurate at this point - any modern/reasonably competent emulator handles this correctly. Shoutout to Near (RIP) here - they wrote an article on Ars which documents how we've reached this point.
On top of this, most emulators allow you to run the game at faster speeds than the original. I haven't encountered a game yet where I wanted to do this, but the fact that it's available just shows how far emulation has come (and the wizardry that is involved in emulating an entire computer system).
Ostensibly the reason is that the ROM chips have to run in lockstep with the CPU clock, so the CPU may need to be downclocked if cheaper ROM chips can't keep up with it, but I don't know what the actual cost difference would have been at the time, if any. Maybe all of the chips were capable of 3.58 MHz and Nintendo was just gouging.
No that's something else, LoROM and HiROM differ in how the ROM data is mapped into memory, while FastROM and SlowROM differ in the bus speed. You can have Lo/Slow, Lo/Fast, Hi/Slow or Hi/Fast carts.
I hope developers still love to blog details in article form like this, rather than vlogging it on YouTube. A lot of detail packed into only a few kilobytes.
"Super Mario World" is still a masterpiece. It has amazing characters, sprites, and stages packed into only 360 KB.
The file sizes given by the site are wrong. Super Mario World is 512KB, or 508KB if you discard the padding at the end. Only compressing it to ZIP format gives you a file size around 360KB.
I think applying RLE to the file might give a better estimate, because files have blank space throughout, not just at the end. Just tried it on Super Mario World and the estimated size was 479,154 (468K) bytes.
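For anyone who wants to reproduce that kind of estimate, here is a rough sketch. It assumes a toy two-byte (count, value) run encoding, which is just one way to approximate "effective" size, not what any actual ROM or tool uses:

```python
def rle_size_estimate(data: bytes) -> int:
    """Rough effective-size estimate: each run of identical bytes
    costs 2 bytes (count, value), with runs capped at 255."""
    if not data:
        return 0
    size, run, prev = 0, 1, data[0]
    for b in data[1:]:
        if b == prev and run < 255:
            run += 1
        else:
            size += 2          # close the previous run
            run, prev = 1, b
    return size + 2            # close the final run

# Toy "ROM": 256 bytes of unique data followed by 2048 bytes of padding.
rom = bytes(range(256)) + b"\x00" * 2048
print(len(rom), rle_size_estimate(rom))   # 2304 530
```

Note that non-repeating data actually doubles in size under this scheme, so it's only a ballpark for how much of a file is runs of padding versus real content.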
If you don't have an RLE tool handy, you can force Pucrunch to act as an RLE-only compressor by using the -r 0 switch which disables the LZ compression feature.
hexdump automatically does "squeezing" of repeated lines. Follow this with a line count and multiply by the bytes per line, and you'll get a rough number of non-repetitive bytes.
https://man7.org/linux/man-pages/man1/hexdump.1.html
The goal here isn't to tell the real file sizes, the goal here is to estimate the "effective" file size without any padding. Padding can be at places other than the end of the file.
It's still quite impressive to fit such a lengthy and fun game into 508KB without any procedural generation, just tight assembly and clever use of bitmaps. This is programming art.
This is largely a lost art because of several bad paradigms that have become very popular in recent years, making software dramatically worse in the process. In the old days it was common, and NES games were even smaller than SNES games because the graphics were much less complex. IIRC, Super Mario Bros. 3, which many people still insist was better than Super Mario World, fit into 384KB.
As the linked article notes, some of the special on-cart chips were mainly used for data decompression, like the SPC7110. So on those games, definitely compressed assets!
But... I'd love to know how often assets were compressed on "normal" carts without special chips.
Decompressing assets on the fly during gameplay action seems like it would be quite a challenge for the SNES' CPU.
My understanding is that images on title screens and cinematics were often compressed. Anime/comic art styles lend themselves really well to RLE compression because you have lots of consecutive pixels of the same color. And, obviously, these can be fairly static images that don't need to be updated 60 times a second.
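A minimal sketch of why flat-shaded art compresses so well - again assuming a toy two-byte (count, palette index) encoding rather than any real game's format. A scanline with long runs of one color collapses to almost nothing:

```python
def rle_encode(row: bytes) -> bytes:
    """Encode a row of palette indices as (run length, value) pairs."""
    out = bytearray()
    i = 0
    while i < len(row):
        j = i
        while j < len(row) and row[j] == row[i] and j - i < 255:
            j += 1
        out += bytes([j - i, row[i]])
        i = j
    return bytes(out)

def rle_decode(enc: bytes) -> bytes:
    out = bytearray()
    for k in range(0, len(enc), 2):
        out += bytes([enc[k + 1]]) * enc[k]
    return bytes(out)

# A scanline from flat-shaded art: long runs of the same palette index.
row = b"\x03" * 120 + b"\x07" * 8 + b"\x03" * 128
enc = rle_encode(row)
assert rle_decode(enc) == row
print(len(row), "->", len(enc))   # 256 -> 6
```

Dithered or photographic images break the long runs, which is why this style of compression favored clean, cel-shaded art.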
Definitely an outlier, but: the title screen of Secret of Mana was actually a JPG that took around a minute (!!) to decompress. The music and scrolling text are cleverly designed to mask this: https://manaredux.com/lore/how-was-the-incredible-title-scre...
From what I've seen, on average, with many exceptions, most games will lightly compress most data. Tile graphics and layouts only need to be decompressed whenever a scene/level/map is loaded. Games aren't doing streaming audio or anything, so song data can be decompressed and uploaded to the sound processor before a level starts too.
Likewise, text needs to be decompressed once immediately before it's displayed, so games will usually compress that - it's quick enough to decompress a few hundred bytes while the text box loads.
The only thing that I've seen that's usually stored uncompressed is sprite animation data - particularly for player characters with lots of different animations. There's not enough VRAM to load all of it at once, so it needs to be streamed in, and in that case the CPU often just doesn't have the muscle.
I was actually just thinking last night about how something like Fortnite handled emotes. Skins and cosmetics I can understand, but they have 100 players and hundreds of emotes that any player can use at any time. Is it all really just streamed in on demand? Most common preloaded?
Maybe a little counter-intuitive, but Fortnite emotes (essentially, animations) take up much less memory than skins.
Generally for a modern polygonal game, you are using skeletal animation for the characters. A character consists of a polygon mesh and anywhere from dozens to hundreds of bones.
Animations consist of keyframes. Each keyframe represents just a handful of bytes for each bone (the XYZ coordinates for each end of the bone, and the rotational angle, or something like that). The animators create as many keyframes as they want, maybe 10-20 per second max. So a 5-second emote animation might contain something like 8 32-bit floats * 100 bones * 10 keyframes per second * 5 seconds = ~160 kilobytes of uncompressed data.
That's all you need to specify an animation. The rest is calculated on the fly at runtime and rendered to the screen at 60fps or whatever the current frame rate is. The graphics engine interpolates bone positions between your keyframes, and the player model mesh is deformed by the bones. Also, those skeletal animations can be shared between all player models.
The alternative to skeletal animation is fully prebaked animations. This involves minimal interpolation and calculation at runtime. It is more memory intensive because you are calculating the position of every point on the mesh ahead of time, and then storing that data on disk. This is generally how a very complex and non-interactive animation (think: cutscenes, etc) would be animated and stored on disk. Note, this is still far less storage-intensive than storing rendered video, and you still maintain a great deal of flexibility at runtime - you can change the camera location, rendering passes, resolution, etc. That's why you don't see a lot of prerendered video cutscenes these days.
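Running that estimate as a quick back-of-envelope (the bone count, float layout, and keyframe rate are illustrative assumptions, not any particular engine's numbers):

```python
# Back-of-envelope: storage for a skeletal emote animation.
FLOATS_PER_BONE = 8       # e.g. position (3) + quaternion (4) + scale (1)
BYTES_PER_FLOAT = 4       # 32-bit floats
BONES = 100
KEYFRAMES_PER_SEC = 10
SECONDS = 5

keyframe_bytes = FLOATS_PER_BONE * BYTES_PER_FLOAT * BONES  # 3200 per frame
total = keyframe_bytes * KEYFRAMES_PER_SEC * SECONDS
print(total)              # 160000 bytes, roughly 156 KiB

# At runtime the engine interpolates between keyframes; a simple linear
# interpolation of one bone's values might look like:
def lerp(a, b, t):
    return [x + (y - x) * t for x, y in zip(a, b)]

key0 = [0.0] * FLOATS_PER_BONE
key1 = [1.0] * FLOATS_PER_BONE
print(lerp(key0, key1, 0.25)[0])   # 0.25
```

Real engines compress this further (curve fitting, dropping static bones), so shipped animation data is usually even smaller.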
I guess you are right. I never actually did the math on animation storage; I just somehow assumed it was bigger. They obviously use a standard character model (or at least a few) that every animation works with. I just assumed it would be bigger, especially with the number of emotes modern cosmetic games have.
Super Mario World is great, but I stand by Donkey Kong Country 2 being the best game on the system. To me, it just nails every single aspect of platforming, with awesome music, tight controls, and appealing graphics.
Terranigma is right up there as well for me; Super Mario World is probably number 3 in my book.
> hope developers still love to blog details with article mode like this, rather than vlogging it on YouTube.
I think the driving force behind this for many is that it's much harder to steal video content. With text, bots can scrape it, change a few words, and reuse it on an SEO ad site.
I've definitely noticed as a blogger many sites are so brazen stealing content that they'll even hotlink to my images, so I can find their copying in my server logs. Very annoying.
Isn't it more that "Pitfall!" has a set of 255 levels that are procedurally generated, and the author spent a ton of time searching for the best seed that would generate the levels we know today?
It isn't as if you could have a map editor and adjust the levels or the order of the levels, though you could run the generation procedure to find new levels to your liking.
David Crane, the creator of Pitfall, explains it all very nicely in a GDC Classic Game Postmortem [1]. I'm linking at the relevant timestamp, but the whole video is pure Atari 2600 goodness. The full Postmortems playlist is quite enjoyable too. I particularly liked the talks about the original Deus Ex, Myst, Loom, Adventure, Marble Madness, Ms. Pac-Man, Paperboy, Lemmings, and more. I find these videos very soothing and nostalgic. Got to rewatch them now...
Take a look at Solaris. Another huge game with 48 sectors and probably the best graphics seen on the 2600, all in a 16k cartridge.
Quite a feat considering how horrifyingly difficult programming the 2600 was. Even the SDK was fairly barebones: basically a VT100 connected to a PDP-11 running RT-11 and a 6502 assembler.
I wonder what you could do with modern tech, exploiting that ability of having "enhancement chips" in the cartridge.
The SuperFX is mentioned to have its own framebuffer and copy the whole thing over to VRAM.
Does that mean it would technically be possible to put some ridiculously overpowered SoC into a cartridge, and use that to render modern graphics (at SNES resolutions), copying the resulting frames back into the SNES VRAM?
But apparently the NES is a lot more limited, in that it wasn't really designed to accept enhancement chips. Still absolutely amazing that he can run emulated SNES games on real NES hardware!
But the SNES actually had that ability to accept enhancement chips - as seen in the SuperFX and others... I feel that should allow for doing drastically more!
The NES was a little unusual in that it basically had no video ram. The PPU (the graphics chip) rendered sprites and background tiles straight off of the cartridge at 60fps as if the cartridge was an extension of the CPU's address space.
So there are almost no limits to what you could do with expansion hardware, other than the fact that everything would ultimately have to be rendered through the NES's limited color palette. You could make a cart that provides an interface to let a monster PC with multiple 4090s treat the cartridge's tile data as a "dumb" framebuffer and run Cyberpunk 2077 through an NES... although, again, in 4-bit color hahaha.
In contrast, on the Genesis and SNES you need to copy graphics data from the cartridge to VRAM before it gets rendered. This was presumably done because ROM of the era was not fast enough to feed the graphics chips of the 16-bit systems, hence the need for local VRAM as a sort of cache.
So on a SNES you'd have to do some more work, you'd have to DMA a whole screen's worth of data from the custom cart into VRAM 60 times per second, which I think exceeds the data rates the SNES can achieve.
My understanding may be wrong, somebody correct me
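A rough back-of-envelope for the data-rate claim above, using commonly cited NTSC SNES figures - treat these as approximations for illustration, not exact hardware measurements:

```python
# Very rough vblank DMA budget for the SNES (approximate figures).
MASTER_CLOCK = 21_477_272        # Hz, NTSC master clock
CYCLES_PER_DMA_BYTE = 8          # DMA moves ~1 byte per 8 master cycles
SCANLINES = 262                  # scanlines per NTSC frame
ACTIVE_LINES = 225               # lines spent drawing the picture
LINE_SECONDS = 1 / (60.0 * SCANLINES)

vblank_seconds = (SCANLINES - ACTIVE_LINES) * LINE_SECONDS
dma_rate = MASTER_CLOCK / CYCLES_PER_DMA_BYTE   # ~2.68 MB/s
vblank_budget = dma_rate * vblank_seconds       # bytes movable per vblank

full_screen_4bpp = 256 * 224 // 2               # 28672 bytes of tile data
print(round(vblank_budget), full_screen_4bpp)   # ~6.3 KB budget vs ~28 KB
```

So a full screen's worth of 4bpp pixel data is several times what fits in one vertical blank, which is why carts like the Super Game Boy only transfer a smaller-than-fullscreen image.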
The Super Game Boy cart is basically this. The cartridge contains actual Game Boy hardware, wired up to render to a frame buffer, and 60 times per second it's copied to the SNES VRAM.
The trick is that the Game Boy display is smaller than the SNES display, so it doesn't have to transfer a complete frame, and so it can be completed within the vertical blanking interval.
Apparently the timing is so close that it's not possible to transfer palette information alongside the video, just black-and-white, which is why there was never a Super Game Boy Color.
> The NES was a little unusual in that it basically had no video ram. The PPU (the graphics chip) rendered sprites and background tiles straight off of the cartridge at 60fps as if the cartridge was an extension of the CPU's address space.
Unusual for home consoles, but it was pretty standard for arcade boards. This is why systems like the NeoGeo and the CPS-1/2/3 could handle massive amounts of sprites and animation that home systems couldn't replicate until the Dreamcast.
> The author of DOOM for SNES, Randy Linden, did not have access to any documentation about the GSU chip or even DOOM source code. He reverse engineered all of it
Technically this is impressive, but why was it necessary?
Randy Linden, the port's sole programmer, initiated the port of Doom for the Super NES on his own, as he was fascinated by the game.
Since Doom's source code was not yet released at the time, Linden referred to the Unofficial Doom Specs as a means of understanding the game's lump layout in detail. The resources were extracted from the IWAD, with some (notably sprites such as the player's sprites and the original status bar face sprites) unused due to technical limitations.
According to an interview, due to the lack of development systems for the Super FX, Linden wrote a set of tools consisting of an assembler, linker, and debugger, dubbed ACCESS, on his own Amiga before beginning development of the port proper. For the hardware kit, he used a hacked Star Fox cartridge and a pair of modified Super NES controllers plugged into the console and connected to the Amiga's parallel port. A serial protocol was used to link the two devices.
After developing a full prototype, he later showcased it to his employer, Sculptured Software, which helped him finish the development. In the interview, Linden expressed a wish that he could have added the missing levels; however, the game, already the largest possible size for a Super FX 2 game at 16 megabits (approximately 2 megabytes), only has roughly 16 bytes of free space. Linden also added support for the Super Scope light gun device, the Super NES mouse, and the XBAND modem for multiplayer. Fellow programmer John Coffey, himself a fan of the Doom series, made modifications to the levels, but some of those modifications were rejected by id Software.
I was lucky enough to work with Randy for quite a few years at Microsoft.
Incredible developer. He also made the Bleem! PlayStation emulator.
Funny enough none of his coworkers ever bothered to look him up online until after we all stopped working together at which point we all learned we'd been working with programming royalty.
I know that Wolf3D on the SNES uses Mode 7 - not for the walls or sprites, but for the entire screen. The graphics are rendered into background tiles at a resolution of something like 175x100, then scaled up with Mode 7 to fill the 256x224 screen. (Those aren't the exact numbers, but you get the idea.)
The "mosaic trick" is a way to perform horizontal pixel doubling in hardware rather than software. And to do this trick, you turn on the SNES's Mosaic feature, scroll 1 pixel to the left every other scanline, and scroll upward one pixel after each two scanlines have been drawn.
Normally, the SNES mosaic feature copies the top-left pixel of a 2x2 square into that entire square. But the trick makes a different set of pixels get doubled horizontally on the next scanline.
It requires a different arrangement of pixels than the normal way of drawing tiles. A tile containing these pixels:
01234567
becomes this when viewed on two scanlines:
00224466
11335577
Actually performing these scroll writes does not require any CPU intervention because you use the SNES's HDMA feature to do those scroll writes.
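As a quick illustration of the mapping described above, here's how one 8-pixel tile row ends up displayed across two scanlines under the trick - this simulates the visible result, not the hardware itself:

```python
def mosaic_rows(tile_row):
    """Given an 8-pixel tile row, return the two scanlines the mosaic
    trick displays: even pixels doubled on one line, odd on the next."""
    even = "".join(tile_row[i] for i in (0, 0, 2, 2, 4, 4, 6, 6))
    odd  = "".join(tile_row[i] for i in (1, 1, 3, 3, 5, 5, 7, 7))
    return even, odd

print(mosaic_rows("01234567"))   # ('00224466', '11335577')
```

In other words, the renderer must lay pixels out interleaved in the tile so that the hardware's doubling produces the intended image.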
A similar thing happened with the Wolfenstein 3D port as well, where John Carmack gave Rebecca Heineman kudos for learning Japanese to read the patents to get the technical documentation, always cool history around these things, some more in my post about it here:
https://eludevisibility.org/super-noahs-ark-3d-source-code
I can't speak to this case, but dev kits and SDK/documentation are often two separate SKUs, and the latter has a higher price. If I remember correctly, the Crash Bandicoot guys found a hardware bug with memory card saving because they rolled their own code rather than using the SDK they didn't have.
Where do the byte counts for the various games come from? The games came on ROM chips that were, as ROM chips are wont to be, sized in powers of two. For instance, Super Mario World shipped on a 512KB ROM - where does 346,330 bytes come from? Are these compressed sizes?
The numbers are estimates based on zipped size, but that's a bad approach. I should write a program to extract each zip and count the zero-byte padding at the end of the file.
Too late for today, I will write that tomorrow and update the article.
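A sketch of the measurement described above: strip trailing padding bytes from each ROM in a zip and report the remaining size. (The assumption that padding is a run of 0x00 or 0xFF at the end may not hold for every ROM.)

```python
import zipfile

def unpadded_size(data: bytes) -> int:
    """Size of a ROM image with trailing 0x00/0xFF padding stripped."""
    end = len(data)
    pad = data[-1:]                      # ROMs typically pad with 00 or FF
    if pad in (b"\x00", b"\xff"):
        while end > 0 and data[end - 1:end] == pad:
            end -= 1
    return end

def rom_sizes(zip_path: str):
    """Yield (name, raw size, unpadded size) for each file in a zip."""
    with zipfile.ZipFile(zip_path) as z:
        for name in z.namelist():
            data = z.read(name)
            yield name, len(data), unpadded_size(data)

print(unpadded_size(b"GAME DATA" + b"\x00" * 1000))   # 9
```

This only handles padding at the end of the file; as noted elsewhere in the thread, blank space scattered through the middle would need a run-length pass instead.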
It looks like they're compressed sizes. If I gzip SMW.SMC, I get a 347 KB file. It's very misleading.
There are other issues in there. The writer makes it sound like MVG was the one who discovered that the pause in SFA2 was from loading audio data, when it was already known LONG before his video. https://forums.nesdev.org/viewtopic.php?p=70474#p70474
He also seems very confused about the RTC. It's obviously so that the clock keeps going even when the console is off and the cartridge is unplugged, like in the GBC/GBA Pokemon games, but he says something about how it might be because of the NTSC clock drifting? ??? What?
> Even Super Mario World[10] got the treatment (I can't remember slowdowns but I was only twelve back then).
Yoshi’s Island 4 has a slowdown in some circumstances (have Yoshi, get Starman and hit P-Switch), as does another level I can’t recall, exactly… it has a bunch of Monty Moles that explode all at once. I think it’s on Chocolate Island. I think there might be a third with two Sumo Bros. and an Amazing Flying Hammer Bro. onscreen.
Something I've never been able to wrap my head around is how ROMs are dumped for emulators from cartridges? Dumping instructions and assets makes total sense to me, and packaging that up in a data file that can be interpreted by an emulator too, but how does an emulator model the hardware of every 'expansion' chip in a cartridge? How is that dumped from an original cartridge?
Expansion chips aren't ROMs; they need to be emulated as well.
The situation was IMHO a bit worse with the SNES's predecessor, the NES.
There were quite a few expansion chips - called mappers - even though their general function was expanding the NES's memory space rather than adding additional processors or capabilities. They were in most games, because without them the NES is limited to 32KB of PRG ROM and 8KB of CHR (graphics) ROM. Most games after the year the NES came out had them.
These all had to be reverse engineered along with the console itself - fortunately much simpler than reverse engineering an add-on CPU or accelerator though. Some are common and in many games (MMC1, MMC3) and others are pretty much for a specific game only (MMC2 is for Punch-Out only).
They're not dumped. The emulator implementation recreates the expansion chip functionality in software. There are only so many expansion chips, so it's not intractable.
> The copy-protection mechanism of the SNES is something I already dig into in my 10NES article. It works by having two chips talking in lockstep. One chip is in the console, the other in the cart. If the console CIC sees something it does not like, it resets every processor.
It was easily defeated for consumers though (but probably not for game producers/editors)...
Back then we all had a "backup device" for our SNES: a device you'd plug into the SNES and which had a floppy drive. So we'd "back up" all our games onto very cheap 3.5" floppies. All that was needed for the system to work was one original cartridge, which you'd plug into the copier, and the device would reuse that cartridge's CIC chip.
> There were three versions of the DSP-1 named DSP-1, DSP-1a, and DSP-1b. While introducing bug fixing and improving the process, the chip behavior was slightly altered which resulted in planes in Pilot Wings demo crashing into the ground
OK, I will use that excuse if someone asks me why I am so bad at it.
I'm wondering why the console CPU wasn't running at 4x the clock rate to begin with, if the cartridges could easily and cheaply include one so much better. I guess that's just how fast CPUs were improving back then. A CPU from just a few years prior was that obsolete. Crazy!
Unlike modern chips with cache hierarchies and the like, the 65816 is designed to run in lock step with the memory it’s accessing. A faster CPU would have necessitated faster RAM in the system and faster ROMs on the carts too. Games were expensive enough as it was, then.
I also wonder why they couldn't provide a second cartridge slot for the enhancements. Sell enhancement carts separately or bundled with some of the games that required them (mostly the first game to feature such an enhancement), and then all other developers could consider using them too, without worrying about the cost of their own cart.
I imagine this would add too much cost or complexity to the console, making it simpler to just bundle the chips in a single cart anyway.
It would result in a situation where nobody could depend on the expansion module, if they wanted their game to have the largest possible market.
It would also cause a lot of confusion, where clueless older relatives would buy games for kids, not realize that an accessory was required (or have no idea if the kid actually had that accessory), and then the game wouldn't run.
We see this sort of problem happen a lot with computers of the 8-bit era as well, where add-on modules would fix a lot of the issues with the base system... then be supported by almost no software for these exact reasons.
> It would result in a situation where nobody could depend on the expansion module, if they wanted their game to have the largest possible market.
The SegaCD-32x problem. The Genesis sold tens of millions of units, the SegaCD only ones of millions of units, and the 32x under a million units.
There were a couple 32x games that required the 32x and a SegaCD. Being that the SegaCD and 32x didn't have 100% overlap, those games had a smaller TAM than even solely 32x games.
I feel bad for the studios that decided (or were told) to make those games. There was just no way they were going to make a game that would sell well on an uncommon configuration of a dead-end console/accessory.
The difference here being that those enhancement carts would likely cost less than $20 to manufacture, and would either sell for a little more than that individually or for a little above the price of a full game when bundled with one.
I can see how this would create some confusion when buying games (which may be a deal-breaker for Nintendo), but I also see more potential in this approach, since these carts would be "seeded" by Nintendo's own high-volume and highly sought-after games, thus getting much more traction than the big console accessories of the competition, which cost around (not 100% sure) $150. Also, as another poster noted, the N64 had the Expansion Pak (not too far from the idea of an enhancement cart) introduced a few years into its life, which ended up bundled with a few games and got fairly wide support in many newer releases, though most games opted for optional support to unlock extra features.
Sort of, yeah. It seems they originally wanted to use a 68000, but late-1980s chip shortages and other costs made it difficult, so they went with a 6502 derivative [1]
When I first learned how complicated cartridges were it blew my mind. Each cartridge cost $10-25 to make, and N64 carts were even more expensive. For years now though, the cost of delivering the game itself has been essentially 0. It's just wild to me that a huge percentage of the price of a game went to 'nobody' in a way, just to buying the chips it took to make the cartridge. It likely made up a bigger cut than actually went to the developer of the game.
It’s impressive that developers could afford to make custom ICs for only one or two games. I wouldn’t have thought the revenue would be enough to justify that.
My assumption was always that they went into the design of it assuming they would use the chips for more games. Then either the game didn't sell well or the SNES was at the end of its life or something happened to prevent them from using it again.
It's that some consoles like the (S)NES were extremely popular, selling in the millions. And some games sold in similar numbers. Or were expected to. Or a custom IC was (expected to be) used in a number of games. Or some of these publishers were just awash with cash + crazy high expectations for their products. Or some combination of the above.
Iirc for custom ICs, you're usually talking about production runs in the many-thousands. Say, 10k+.
But in the above example: if $60 provides a $10/sale budget, 10k sales gives you a $100k budget to work with. 100k sales -> a cool $1M budget.
Numbers add up quickly if it's mass-produced and 'everyone' buys one.
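The back-of-envelope above is easy to make concrete (all figures are illustrative, taken from the comment, not real publisher data):

```shell
# Per-cartridge margin earmarked for the custom IC, in dollars (illustrative).
margin_per_sale=10

# Budget = margin * units sold.
echo "$(( margin_per_sale * 10000 ))"    # 10k sales  -> 100000  ($100k)
echo "$(( margin_per_sale * 100000 ))"   # 100k sales -> 1000000 ($1M)
```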
Yes, for reference Street Fighter 2 sold for AUD $120 at the time, which works out to about AUD $270 adjusted for inflation. It was priced higher than other games, but you get the idea.
For reference a typical Nintendo Switch game will sell for AUD $79 on release.
Really specific question, but can anybody explain what looks like a bodge resistor on the Star Fox cart (on the missing 74HCT04 chip in the bottom right)? Everything else is SMD, so it looks to me like they realized some mistake after production and had to manually fix every cart. Or maybe this is just a scan of a hacked/prototype cart.
I just opened my Japanese Starfox cart. It must be a later revision because it's a half height board with the 4 chips covered in black epoxy. It still has that resistor, although I'm not sure if it's connecting the same traces because the board is completely different.
This Reddit thread seems to suggest that it's from an early production run of the game. (though that seems strange to me as normally the glob top chips come as a cost-cutting measure on later revisions of a chip.)
> One of the exceptional characteristics of the Super Nintendo was the ability for game cartridges (cart) to pack more than instructions and assets into ROM chips
Wasn't this true on other Nintendo consoles too? Gameboy and Gameboy Color cartridges did similar, if not even more outlandish things, like the GameBoy Camera.
Gameboy Advance could do it too. Yoshi Topsy-Turvy and WarioWare: Twisted had tilt/gyro sensors. Boktai had a light sensor. There was also the Nintendo e-Reader for scanning cards.
Even the Nintendo DS could do it, though I'm not sure it was used outside of flashcarts like the SuperCard that included an additional CPU.
It would probably be more precise to say that it was true for cart-based consoles: likely due to backwards compatibility with the GB port Nintendo supported enhancement carts until the DS (though by then it’d become very uncommon).
And the N64 technically supported cart enhancements though only a small handful of japanese games used it (Morita Shogi 64 had a modem and rj11 adapter in the cart… and RCEs so you can use it as a homebrew vector).
Once carts were replaced with optical drives there was no way to ship anything but data with the game, so enhancements were limited to whatever expansion slots (or even just controller ports) the console designer had specced.
Then again, as hardware became more complex and expensive, the ROI on bespoke chips got lower and lower. SETA released 3 single-game chips; there's no way that's justifiable nowadays.
I love any and all research on these systems. It completely blows my mind that something as complex and fun as Mario Kart was only 350KB in size. You can't even get a 'hello world' binary that small for most languages today.
Yeah, it's custom and tuned, but that's a lot of fun, with graphics considered good at the time, packed into the size of a single low-quality JPEG of today.
Well, we have to account for the fact that modern systems simply don't require that much optimization and compression. Working under stricter limits meant that dev teams had to get creative and minimize file size, while nowadays a single game update can be around 20 GB.
> nowadays a single game update can be around 20 GB
Irrespective of the abundance of storage we now enjoy, and even though I could rationally understand the reasons, this will always make me raise an eyebrow or sigh.
As a millennial who has done both embedded and web it doesn't make me sigh at all and usually I find people with that line of thought are just being elitist.
The amount of technology and number and size of assets in a game now is just insane, it is in no way at all comparable to the garage projects for 8bit consoles.
Remember, games used to have to be ported - they were so locked into their particular platform/hardware that porting a game was essentially a total rewrite every single time. Nowadays we can write once, run on every single platform with just hooks into platform specific libs swapped out.
Modern day developers aren't stupid; we're just all used to our current environments, but I would bet that if we all needed to, we could just as easily jump back in to writing everything in asm and custom tailoring it for a particular CPU/hardware. But then modern games would be impossible to actually get finished.
> Modern day developers aren't stupid; we're just all used to our current environments, but I would bet that if we all needed to, we could just as easily jump back in to writing everything in asm and custom tailoring it for a particular CPU/hardware.
Some people could jump back in, but not all.
But that's actually progress! You needed to be a wizard to get anything done on eg an Atari 2600 at all. Nowadays, game development is accessible to more and more people.
This is peak hackernews right here. No, modern developers couldn't just "easily jump back into writing everything in asm". More importantly, no one was talking about writing everything in assembly, he was complaining about 20 GB of bloat getting dumped on everyone who buys a game.
Maybe in 20 years people will be complaining about all games taking up 10 TB of space and we'll have people rationalizing the practice because people used to complain about 20 GB patches. Games coming on multiple CDs was a painful experience. Nobody likes that, and nobody liked day-one patches back when they were first introduced and now nobody likes 20 GB patches. Every generation hates pointless bloat.
For many of us it's an engineer's mindset. We appreciate games for their art and gameplay, and we also appreciate them for their engineering.
So it's a little sad to see that one aspect of game engineering become relatively extinct, even if it's certainly the correct tradeoff given today's constraints.
> But some kind of 1:1 correspondence was lost ages ago.
The ratio has been shifting all this time. There wasn't a one time shift that happened once.
> Optimizing for size, to squeeze out every last byte possible: who still does this in 2024?
You can still find that in the demoscene. A few years ago https://en.wikipedia.org/wiki/.kkrieger made a big splash. (Well, it's actually been 20 years. How time flies. But they still make small demos and games today.)
Eh, not the same. All the games I remember spanning multiple discs (Emperor: Battle for Dune took four, for example) were that large due to FMV cutscenes.
I understand that 3D textures are large files, but surely there is some hideous bloat occurring to cause the explosion in size.
And outside of games, the same bloat occurs, so it’s not just textures. “Let’s ship an entire browser rendering engine with our app” hasn’t exactly helped.
Interestingly enough, Hitman (the remakes) slimmed down their installed size by the time Hitman 3 rolled around. They said that thanks to SSDs being around, they didn't have to optimize for the slow random read speeds of HDD anymore.
Since these carts bring out the CPU bus, I would like to know if anyone ever did anything unusual with a classic gaming console. Like controlling motors for a robot, or wiring in an FPGA to add functionality like Ethernet/USB, running a terminal emulator, trying to write an OS, etc.
Is there even anything that would make sense? A lot of the older enhancements were because of the limited compute ability of older consoles, or adding sensors like the already mentioned gyro sensors and light sensors in GBA games. The Switch is already more powerful than anything you could reasonably add in a cart, and probably already has any sensors you'd reasonably add too. If there's anything else that could be done, that'd be interesting to find out.
> It's wild that the Super 3D Noah's Ark unlicensed game didn't have its own CIC chip and required plugging an official cartridge on top to pass the copy protection check. Goes to show the lengths publishers had to go to bypass Nintendo's strict licensing. I wonder how many other unlicensed carts used similar CIC workarounds.
Super 3D Noah's Ark is the only game that seems to have survived to modern times, but I remember there were quite a few more. Nintendo wasn't happy about the religious-themed games but knew it would be extremely bad press to go after the developers, so they strong-armed the major retailers instead. I used to see the games developed by Wisdom Tree in Christmas stores, book stores, and small regional retailers.
In non-western countries, there were pirated games that used the same "piggyback" technique to avoid the CIC, but this was less common since PCBs were pretty expensive to manufacture in smallish quantities at the time. Console game piracy didn't _really_ take off until CDs became the dominant format. Arcade game piracy actually had a pretty robust underground market, though.
The lengths manufacturers of unlicensed cartridges would go to circumvent the CIC were pretty interesting. These carts and early Famicom adapters (which didn't have a CIC) would 'zap' the CIC to keep it resetting. Then Tengen made their own CIC knockoff for their games (they made unique plastic cases for their cartridges too).
[1] https://www.youtube.com/watch?v=2jee4tlakqo
[2] https://www.youtube.com/watch?v=BvIXUOr4yxU