> "The Amiga is probably the last 32-bit personal computer that is fully understood, documented and hackable. Both its hardware and software contain many gems of engineering and design. I hope that one day, we can have a simple but powerful modern computer that is at the same time as hackable and friendly as the Amiga was."
Kinda sad. Our current x86 hardware is filled with closed binaries and firmware that can't be easily replaced. That seems to be why so many _open_ laptops are using ARM.
AmigaOS also provided a layer of software abstraction. While a lot of games and demos poked the hardware directly, much of the system-friendly Amiga software written in the 80s can be recompiled and run just fine, with few or minor changes (most of them relating to differences between AROS and AmigaOS), on e.g. AROS, on anything from original m68k Amigas through PPC, ARM, or x86, whether "native" or hosted on a variety of OSs.
That may be so, but the point was that it was not really the cause of the Amiga's downfall. The Amiga hardware changed substantially over the years, and when AGA (the chipset in the A1200 and A4000) arrived, and when we started installing graphics cards, a huge proportion of the software simply ran unmodified, and most applications were immediately able to take advantage of them.
There was nothing like that in the PC world for the first few years of the Amiga's life cycle, and for games there continued to be nothing like that until years after Commodore went bankrupt.
Most Amiga games certainly did hammer the hardware, just as PC games of the era did, but the portion of games that were system-friendly from the start was far greater on the Amiga in the early years than on the PC (notable titles included Cinemaware's King of Chicago, as well as Ports of Call and a number of Sierra On-Line games).
The bigger problem, apart from the ludicrous mismanagement at Commodore, was that Commodore was too successful at marketing the low-end Amigas. The A500/500+ and then the A600/A1200 were the big sellers, not the higher-end machines. These were price-sensitive buyers unwilling to spend money on graphics cards, because until first-person shooters hit they had very little reason to.
This made graphics cards a far more niche expansion on the Amiga, where they were a luxury, than on the PC, where you increasingly had to buy one to run what you wanted.
Only when first-person shooters hit did the planar graphics that had been a tremendous advantage for 2D games become a limitation overnight, and the Amiga was suddenly playing catch-up.
The problem wasn't an inability to update the platform, but it was strongly exacerbated by a chronic culture of underinvestment in R&D that had been a hallmark of Commodore all the way back to their calculator days, and that certainly made the updates happen far slower than engineering wanted. Commodore survived as long as it did because their engineers, and those of the companies they bought along the way, pulled off crazy feats when their backs were to the wall. But they were frequently unable to get even minor funding to complete projects during the periods when the company was pretty much printing money; instead, Commodore went on death marches when it was facing existential threats and was cash-strapped, and eventually that caught up with them.
The lack of open drivers is the primary reason why there are so few Chromebooks with ARM processors. I don't see why so many people think ARM is more open.
To be a bit pedantic, the 68000 was a mostly 16-bit implementation of a 32-bit ISA. The ALU is 16-bit (though it can be ganged together with an additional 16-bit shift register for 32-bit shifts) and most of the data paths are 16-bit as well.
It's a really smart design. Combines a forward-looking architecture with a realistic implementation for the time. It's a shame it died out while x86 survived.
It took too long to go superscalar, pipelining was a bit of an afterthought, and they _should_ have done more with the Apple contract. Because they didn't, Apple moved on to PowerPC and the bottom fell out of the 68k market.
It's sad overall really. The Amiga was a great machine with a fun and easy-to-use/manipulate operating system. Memory protection (beyond weirdnesses like Mungwall etc.) would have been nice.
For such a small team though they did very well. Commodore were desperate for a slice of that DOS/Windows PC pie though and that further drained the coffers :(
Motorola just never learned how to sit still and continue to support and extend their platform. They got to market early with the 68000 (late 70s! way ahead of its time!), but it was a totally separate ISA and platform from the (quite awesome) 8-bit 6809. No continuity between them. As soon as the 68k looked a bit long in the tooth they started pushing the 88k, which was a total failure. Again with no continuity. Then they jumped to PowerPC, but they never fully committed to that either.
If they'd done what Intel did and continued to develop the same platform, maybe they'd still exist and we'd have more diversity in platforms today. The 68000 was so much nicer to develop for than x86.
I'm not an expert, but as far as I understand, what killed the 68k (and many 80s CISCs, like the VAX) was the difficulty of producing an OoO implementation given the very complex instruction semantics, in particular the memory-indirect addressing modes (on the 68020, a single operand like ([bd,An,Xn],od) can require multiple dependent memory accesses).
It wouldn't be a problem today, as designers have transistors to spare, but it was in the early '90s, when the high-performance market was taken over by the simpler OoO RISCs and by x86 [1], of which Intel, against expectations, managed to build a competitive OoO implementation in the form of the Pentium Pro.
[1] which compared to other CISCs is much simpler.
They ended up solving this problem eventually with ColdFire. They dropped a few instructions and addressing modes and were able to produce something pretty performant. But just 15 years too late :-)
And now that's dead too. And yeah, I think the engineers at Motorola in the 90s basically just saw the RISC writing on the wall, threw up their hands, and said that was the way to go, customers be damned; meanwhile Intel had too much invested in x86 CISC, couldn't do that, and so was forced to make it work.
As a former Amiga user, I'm familiar with the Amiga's graphics co-processor(s). I actually forget now how many; I remember the Copper and the Blitter, but don't recall whether those were separate chips or functions on a single chip.
So, is this a sort of video adapter that converts the native Amiga video output to HDMI-compatible signals, or is it a full graphics card that brings Amiga graphics (32/64/4096 colors, with acceleration) to modern output resolutions?
It is a graphics card in the sense of the "ReTargetable Graphics" (RTG) system. I ship drivers for the Picasso96 API (which also emulates the competing CyberGraphX API). All OS-friendly Workbench/Intuition GUIs can then be used at high resolutions (up to 1280x720@60Hz, or 1920x1080@30Hz) and color depths of 8 (palette), 16, or 32 bit. A bunch of open source games like Doom, Abuse, and ScummVM have been ported as system-friendly applications, too, so these run fine. Old games that bang the hardware still go through the custom chipset of the Amiga and are output through the 15kHz RGB connector, not via my card. But I'm currently working on an expansion that scan-doubles and upscales the classic video output, too.
(Edit:) On the hardware side, I implemented the Zorro bus protocol in the FPGA (in Verilog) and hooked it up to an SDRAM controller/arbiter and a DVI/HDMI encoder. There is also a simple blitter in the code.
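To give a rough idea of the bus side: a Zorro II slave is essentially a small state machine watching 68000-style bus cycles. Here's a stripped-down, hypothetical sketch of the concept (not my actual source; AutoConfig, /SLAVEn, wait states and clock-domain synchronization are all omitted, and the base address is made up):

    // Zorro II is close to a raw 68000 bus: the slave watches /AS,
    // matches the address, and services read/write cycles, with /UDS
    // and /LDS selecting the byte lanes.
    module zorro_slave_sketch (
        input  wire        clk,   // FPGA clock (bus syncing glossed over)
        input  wire [23:1] a,     // address bus (the 68000 has no A0)
        inout  wire [15:0] d,     // 16-bit data bus
        input  wire        as_n,  // address strobe, active low
        input  wire        uds_n, // upper byte lane strobe, active low
        input  wire        lds_n, // lower byte lane strobe, active low
        input  wire        rw     // 1 = read cycle, 0 = write cycle
    );
        parameter BASE = 8'h20;   // made-up board address (A23-A16)

        reg [15:0] mem [0:2047];  // stand-in for the SDRAM framebuffer
        reg [15:0] rd_data;
        reg        selected = 1'b0;

        wire hit = !as_n && (a[23:16] == BASE);

        always @(posedge clk) begin
            selected <= hit;
            if (hit) begin
                if (rw)
                    rd_data <= mem[a[11:1]];                   // read
                else begin                                     // write
                    if (!uds_n) mem[a[11:1]][15:8] <= d[15:8];
                    if (!lds_n) mem[a[11:1]][7:0]  <= d[7:0];
                end
            end
        end

        // Drive the bus only during a read cycle addressed to us.
        assign d = (selected && rw && !as_n) ? rd_data : 16'bz;
    endmodule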
Are you maybe thinking about adding more things on this one card? Ethernet, USB, memory (e.g. 1GB by default)...
Maybe even emulation of a faster CPU in the FPGA (à la the Vampire for the A600)... though it'd probably use the CPU slot instead of Z2/Z3.
Also, is it possible to hijack the function of the OCS/ECS/AGA directly via the Zorro slots (proxying the native gfx output, e.g. with games), or would it require some kind of hardware bridge between the gfx chipset and your card?
Then later on there was the Enhanced Chip Set and the Advanced Graphics Architecture.
On that note, I think the A500 my parents got me back in the day was an oddity. I distinctly recall it having 1MB chip RAM, suggesting it had the ECS inside. But it shipped with Workbench 1.x rather than 2.x.
That wasn't an oddity. It would have shipped with Workbench 1.3. I had one just like it with the Fatter Agnus chip that could address up to 1MB of chip RAM, although you could still configure it as 512K chip and 512K fast, which is what I did.
I can't remember whether you did the 1MB chip config with a dip switch or by cutting a track on the motherboard - I think it may have been the latter, which is probably why I didn't do it.
I had a weird Amiga 500. It had a late OCS chipset, but at the hardware level its Denise could display bitplanes from slow RAM (the 512 kB RAM expansion)! Unfortunately I never tested the blitter, copper lists, sample playback etc. from this memory range, so no idea whether those worked as well. Of course it's all DMA access, so my guess is they would have.
The RAM was mapped at 0x80000-0x100000 from the chipset's point of view. The CPU saw the same data at 0xc00000-0xc80000.
The Amiga shipped with planar graphics modes; this card supplies chunky modes. Planar was great for a ~10MHz CPU moving images with a handful of colors around the screen, but it totally failed when technology moved on to 8-16 bit color depths.
You could even argue Doom killed the Amiga as a gaming machine.
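To make the planar vs. chunky difference concrete: in a planar mode every pixel is scattered across one bit in each of up to eight bitplanes, so both the display fetch and any CPU pixel write have to touch up to eight separate memory areas, while a chunky pixel is just the next byte in memory. A hypothetical Verilog fragment of the gather step a planar fetch implies (illustrative only, not from this card):

    // Producing one 8-bit palette index from eight separately fetched
    // 16-bit bitplane words. In a chunky mode, this whole module would
    // reduce to "take the next byte".
    module planar_pixel_mux (
        input  wire [15:0] bpl0, bpl1, bpl2, bpl3,  // current word from
        input  wire [15:0] bpl4, bpl5, bpl6, bpl7,  // each bitplane
        input  wire [3:0]  idx,                     // pixel 0-15 in word
        output wire [7:0]  pixel                    // palette index
    );
        // One bit from every plane; plane n supplies bit n of the pixel.
        assign pixel = { bpl7[idx], bpl6[idx], bpl5[idx], bpl4[idx],
                         bpl3[idx], bpl2[idx], bpl1[idx], bpl0[idx] };
    endmodule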
Brilliant. I never owned an Amiga back in the day (I was constantly jealous of the superior graphics/sound), so I'm not familiar with the architecture, but it's impressive what you can do with even a low-tier FPGA.
Thanks! Yes, this was my first real FPGA project (good for learning!), I'm still exploring all the possibilities. For example, only around 25% of the LUTs of the FPGA are currently used, so there is even room to put a simple CPU in there, or simple shaders.
I bought a cheap-ish "Papilio Pro" FPGA board one day and wanted to see if I could get it to output VGA directly (I had done that with Atmel MCUs before, for fun). So I installed Xilinx WebPack ISE and looked at various docs online. Other people's code (for example Mike Field's) was extremely helpful, too. I still feel I'd like to read a good book on Verilog design to close the gaps, so if anyone is a pro, I'm happy about recommendations.
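In case anyone wants to start the same way: the classic first exercise is a 640x480@60Hz sync generator, which fits in a handful of lines. Roughly like this (standard timing constants; a real design wants a clean ~25.175MHz pixel clock from a DCM/PLL):

    // Minimal 640x480@60Hz VGA timing generator. The totals are 800x525,
    // and hsync/vsync are active low for this mode. Wire the color bits
    // through resistor DACs to the VGA connector.
    module vga_sync (
        input  wire       clk,      // ~25.175 MHz pixel clock
        output reg  [9:0] x = 0,    // current horizontal position
        output reg  [9:0] y = 0,    // current vertical position
        output wire       hsync,
        output wire       vsync,
        output wire       visible   // high inside the 640x480 area
    );
        always @(posedge clk) begin
            if (x == 10'd799) begin
                x <= 10'd0;
                y <= (y == 10'd524) ? 10'd0 : y + 10'd1;
            end else
                x <= x + 10'd1;
        end

        assign hsync   = ~(x >= 656 && x < 752); // 16 porch, 96 sync
        assign vsync   = ~(y >= 490 && y < 492); // 10 porch, 2 sync
        assign visible = (x < 640) && (y < 480);
    endmodule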
I'm hopelessly inexperienced, but I want to do this as well. Getting USB (even just for keyboard and mouse emulation) and networking, in addition to high-resolution video, seems like it could be possible using one of the more powerful SoCs.
I don't see what the point of USB or networking would be; a NuBus Mac can use an Ethernet card perfectly well, and why would you want to use a USB keyboard or mouse with a classic Mac?
Really, the big interfacing issue with a classic Mac is video, because many modern displays can no longer handle their sync frequencies. Either you need an older CRT (and to maintain it) or an older CCFL-backlit LCD (and to maintain it) that can hopefully show an image at a non-native size without upscaling.
With a custom digital video card and DVI/HDMI output you can actually hook up a modern display, just as with the card the OP created for Amiga. But unlike with Amiga, the classic Mac always used a straightforward framebuffer, so there's no software support issue.
There are other (non-DIY) USB-to-ADB projects out there taking shape, too.
But I'm also hoping someone starts making NuBus cards for VGA, USB mass storage, etc. It would just take an FPGA, or perhaps even an overclocked Teensy.
Not much; Zorro is basically the 68000 bus protocol, while NuBus is more general. But it's just a multiplexed address/data bus with a selectable 8/16/32-bit word size for data transfers; it can be implemented with a simple state machine.
All you really need are the level shifting buffers just as with the Zorro interface, and enough I/O for the 50-some NuBus signals. That breaks down to 32 address/data lines, a few NuBus control lines, and some lines to control the level shifting buffers.
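Sketched out, the address phase is about this much logic (this is just my reading of the protocol: NuBus is negative-true and drives/samples on opposite clock edges, which I'm glossing over along with the buffer control):

    // Rough sketch of a NuBus slave catching the address phase. /START
    // marks one clock of the 10 MHz bus in which AD carries the address
    // and TM the transfer mode; the slave then performs the access and
    // answers with /ACK plus status (not shown).
    module nubus_start_sketch (
        input  wire        clk,       // 10 MHz bus clock
        input  wire        start_n,   // /START, address phase marker
        input  wire        ack_n,     // /ACK, transaction complete
        input  wire [31:0] ad,        // multiplexed address/data lines
        input  wire [1:0]  tm,        // transfer mode lines
        output reg  [31:0] addr,      // latched address
        output reg  [1:0]  mode,      // latched transfer mode
        output reg         busy = 0   // a transaction is in progress
    );
        always @(posedge clk) begin
            if (!start_n && ack_n) begin   // plain start cycle
                addr <= ad;
                mode <= tm;
                busy <= 1'b1;
            end else if (!ack_n)           // cycle acknowledged
                busy <= 1'b0;
        end
    endmodule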
I find the open toolchains for Lattice a great development, but I don't have any experience with the ice40 devices yet. Do the Lattice targets have SERDES outputs, clock synthesizers?
Edit: I highly doubt that this project would be doable in an ice40 part (no SERDES, relatively small number of LUTs), but maybe it's an interesting challenge for the reader ;) Maybe with an extra transceiver IC, a single bit depth, and a single Zorro protocol...
Where did you find the Spartan 6 support? The page you linked is mostly about the Lattice ice40. I see a link to a Spartan 6 LX9 reverse-engineering project at the end of the page, though. I use an LX25.