If I recall correctly you could run CGA and Hercules at the same time, as they used different memory areas.
There were even a few debuggers that used this into the VGA era - the program would run on the color monitor and your debugger on the black-and-white one (actually usually orange or green).
Yes, originally CGA and MDA at the same time. Hercules was a non-IBM standard, based on MDA. The Hercules card has 64k of memory: 32k in the MDA area, and the second 32k overlapping the CGA area. You could disable the second 32k when you had a CGA card in your system.
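For what it's worth, software could drive both screens at once just by writing to the two buffers; a minimal sketch, assuming a 16-bit DOS compiler with far pointers (e.g. Turbo C):

    /* Minimal sketch: put a character on both screens at once.
       B000:0000 is the MDA/Hercules text buffer, B800:0000 the CGA one. */
    #include <dos.h>

    int main(void)
    {
        unsigned char far *mda = (unsigned char far *)MK_FP(0xB000, 0);
        unsigned char far *cga = (unsigned char far *)MK_FP(0xB800, 0);

        mda[0] = 'A'; mda[1] = 0x07;  /* character byte, then attribute byte */
        cga[0] = 'A'; cga[1] = 0x1E;  /* yellow on blue, only on the CGA side */
        return 0;
    }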
EGA and VGA were also made backwards compatible, and could co-exist with an MDA or Hercules card.
The demo Stars by NoooN (https://www.pouet.net/prod.php?which=301) has an easter egg for a combination of VGA and MDA/Hercules: on the monochrome screen, a Snake game appears, which can be played via keyboard while the demo is running.
I did actually use this debugging technique while making a couple of the effects of Area 5150 (the lake effect at the end, and the one two effects before that).
I remember cracking Mech Warrior (2?) via flipping a bit on the executable (and changing the drive letter of a path somewhere), it was so amazing when it just worked!
I actually find it quite impressive what kind of things the IBM engineers thought of right from the start, given it was their very first PC ever.
Design a text card and a graphics card, and give them different base addresses so you can have dual monitors. Or allow an option ROM at a specific address that can hook into the boot process of the BIOS, adding additional boot sources, e.g. network boot (whatever you can cram into 32k).
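The option ROM convention is pleasantly simple: during POST the BIOS scans the upper memory area on 2kB boundaries for a signature, a size byte, and an entry point, then far-calls into the ROM so it can hook interrupt vectors. A sketch of the header layout (illustrative bytes, not a complete ROM):

    /* Option ROM header, as scanned by the BIOS during POST:
       offset 0-1: 55 AA signature
       offset 2:   length in 512-byte blocks
       offset 3:   entry point, far-called by the BIOS
       The whole ROM must also checksum to zero, mod 256. */
    unsigned char rom[] = {
        0x55, 0xAA,  /* signature */
        0x40,        /* 0x40 * 512 = 32k */
        0xCB         /* entry point; here just a RETF opcode */
    };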
It was also VGA and beyond -- and Hercules at the same time. And almost every debugger, including Microsoft CodeView, supported this mode. Most developers back in the late 80s and early 90s had a mono monitor on their PC, even into the Windows 3.1 and early Windows 95 days.
And because fewer colors meant more resolution in the same amount of video RAM. If you were happy with monochrome CAD (no gray tones), you got 8 pixels in a single byte!
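At 1 bit per pixel, plotting is just bit twiddling; a generic sketch:

    /* Set pixel (x,y) in a 1bpp framebuffer with 'stride' bytes per line.
       Eight pixels per byte: a 720-pixel Hercules line is only 90 bytes. */
    void set_pixel(unsigned char *fb, int stride, int x, int y)
    {
        fb[y * stride + (x >> 3)] |= (unsigned char)(0x80 >> (x & 7));
    }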
Does anyone know why cards did not map video memory into the C000-C800 or the D000-E000 segments? Did many machines have an option ROM or expanded memory there?
The 1992 Cirrus Logic GD5402/AVGA2 (ISA bus) let you enable a special 128KB linear window directly mapping A0000-BFFFF to VGA memory. The only ISA VGA chip fully supporting linear framebuffer access was the ATI Mach64. Since the ISA bus is limited to a 16MB address space, you had a choice of either no more than 12MB of RAM, or finding a motherboard supporting a "Memory Hole At 15M-16M" BIOS option. VLB and PCI graphics cards mostly didn't have this problem (some mapped themselves at 64MB, assuming nobody would be crazy enough to cram that much RAM into a computer in 1993). It popped up once again at the end of AGP life/start of PCIe, when we ran out of 32-bit address space; this is why fitting 4GB of RAM often resulted in 3.5GB usable at most.
Of course this problem is with us even today. PCIe Resizable BAR is a brand spanking new feature (first proposed in 2008) that expands the directly mapped buffer from the previous 256MB limit to the full capacity of VRAM. AMD tried to repackage it under a different name (Smart Access Memory) and upsell it as exclusive to high-end brand new parts; Intel's new GPUs require it to work reasonably fast, but lock out support on non-Intel platforms.
Part of that was used for option ROMs like hard drive controllers, the EGA/VGA BIOS, network adapters, and for EMS; other areas were just marked by IBM as "reserved", which might have scared people off. In early machines there probably wasn't much of anything there, but then video memory needs were still modest enough that A000-BFFF sufficed.
Probably a good thing, because then those 'holes' could be used for UMBs...
The C000 segment was used for the EGA/VGA extension ROM. I'm guessing that using D000-EFFF would be unnecessary (because of the planar addressing squeezing 256kB of video memory into a 64kB address space), inconvenient (because the addresses wouldn't be contiguous - EGA and VGA were designed to coexist with either CGA or monochrome adapters in B000-BFFF) and (for VGA) insufficient - you'd still not have enough to map the entire 256kB of VRAM linearly. I also expect that IBM's engineers didn't want to take up all the extension ROM space because then it wouldn't be possible to add EMS cards, network cards, and whatever else ended up being mapped there. Though 192kB of write-only video memory in that space would be an interesting design!
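(To illustrate the planar squeeze: the Sequencer's Map Mask register picks which of the four 64kB planes a CPU write through the A000 window lands in. A sketch, assuming a 16-bit DOS compiler with outportb/MK_FP as in Turbo C:)

    /* Sketch: select VGA plane 'p' (0-3) for CPU writes via the
       Sequencer Map Mask register, then write through the A000 window. */
    #include <dos.h>

    void write_to_plane(int p, unsigned off, unsigned char v)
    {
        outportb(0x3C4, 2);       /* Sequencer index: Map Mask */
        outportb(0x3C5, 1 << p);  /* enable only plane p */
        *(unsigned char far *)MK_FP(0xA000, off) = v;
    }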
Can’t wait to check this out on my IBM PC (5150) with CGA which is my first computer that I still have. (Some of the RAM is bad though so if it needs the full 640k like 8088MHz did I might need to repair it.)
Then again 8088MHz worked well for me because I lost the RGBI monitor for it at some point and that demo is designed for composite.
Getting files to the PC is an interesting challenge. I have to copy from Windows 10 or 7 to a USB drive, move that to a PC I have with a serial port and Windows XP, and then transfer the demo over a null modem with HyperTerminal on the Windows side and Telix in MS-DOS on the PC side.
If you can put USD 50-100 into your antique, you may add an XT-IDE ISA card with a CompactFlash adapter.
The one I have has a slot in the backside metal bracket: I can load a CompactFlash card (1 GiB, FAT-32 formatted) in a modern PC with a USB adapter, then insert the card in the back of my IBM PC and switch it on with fresh data and software.
Note that the CompactFlash card is seen by DOS (6.2 in my case) as a hard disk drive, and you can boot from it. It can also co-exist with an MFM hard drive: in my case, the MFM drive is C: and boots, and the CompactFlash is D:.
It is a very convenient setup for making the IBM PC fit smoothly into our modern IT world.
Interesting that this isn't based on lwIP. lwIP can operate happily in just 1 kilobyte of RAM, although lower RAM limits hurt performance quite heavily (can't reorder TCP packets, so you have to wait for the far end to retransmit them if they arrive out of order).
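For a sense of scale, this is roughly the kind of lwipopts.h such a constrained build uses; the values here are illustrative guesses, not a tested configuration:

    /* lwipopts.h sketch for a very small lwIP build (illustrative values). */
    #define NO_SYS            1            /* no OS: raw API, single-threaded */
    #define MEM_SIZE          1024         /* heap for outgoing pbufs */
    #define PBUF_POOL_SIZE    4
    #define PBUF_POOL_BUFSIZE 256
    #define TCP_MSS           128
    #define TCP_WND           (2 * TCP_MSS)
    #define TCP_SND_BUF       (2 * TCP_MSS)
    #define TCP_QUEUE_OOSEQ   0            /* drop out-of-order segments to save RAM */
    #define LWIP_DHCP         0

Setting TCP_QUEUE_OOSEQ to 0 is exactly the trade-off described above: out-of-order segments are dropped and the far end has to retransmit.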
Adding to the other comments, there are also SD card/flash drive-based 'floppy disk' hardware emulators available on eBay. They even have buttons on the front so you can cycle through, e.g. all thirteen floppies of Windows 3.1. :)
There's one piece of hardware (a 3.5" floppy drive hardware emulator) whose clones are popular enough to even have third-party customizable firmware on GitHub!
I own one, but unfortunately, I can't remember the name of this hardware or the alternate firmware off the top of my head. When I get to a computer I will check my eBay history.
Even though it's physically a 3.5" 'drive,' with the IDC-style connector, the alternate firmware lets you emulate various disk geometries, and I believe the 3.5" and 5.25" signals are electrically compatible.
It's an extremely useful piece of hardware, especially if you're already getting software in the form of floppy disk images from a place like winworldpc.
I bought a random one online recently in order to configure an embedded device that requires a serial port, and it worked perfectly out of the box. I used it with minicom and felt nostalgic!
I don't know about the 5150 in particular, but most of these USB to serial adapters will not work with older computers since they are expecting different voltage levels.
Just to clarify: it is not simply a matter of RS-232 (potentially) going up to 12 V. The low signal level is between -3 V and -12 V. If the adapter complies with RS-232 specifications in this respect, it should work with older hardware. If it doesn't, then there is a good chance it won't.
Unfortunately, many adapters advertise being RS-232 but the specifications go on to say they are TTL. (I'm not sure that's accurate either, but it is the vendor's shorthand for 0 V and 5 V levels.) Some won't say anything at all, which leaves compliance an open question.
Current limits usually make this Just Work: clamping diodes sink enough current to drop the voltage from 12V to 5V (or whatever) and the thresholds in the reverse direction are defined so that a 12V circuit still sees 5V as high if there isn't a terrible amount of noise.
Of course, this all depends on sane current limiting behavior from the 12V side. I'm sure there's at least one line driver out there that does a good "gnome swapping terminals on a car battery really fast" impression.
I wouldn't know if 'most of them' don't work, but I know for a fact that I have one (made by Vivanco, using a Prolific chip) that works fine in combination with my IBM 5160. I use it in combination with FastLynx, which still works even under Windows 11.
It's the only one I've tried, so I've not encountered any that didn't work.
The demo is about 500kB at the moment. We're hoping to fit the final version onto a single 360kB floppy but didn't quite get that working in time for the party. A 3.5" floppy should work fine.
Stunning. I don't really have a great pulse on what a 5150 is capable of; when I was born, the world already had 386s. That said, I do have a Tandy luggable that I believe is meant to be compatible, and while it doesn't pass the flight simulator test, it's definitely a good introduction to just how limited these machines really are. You can't expect decent mod playback, unless you're reenigne.
My condolences to emulator authors who emulate CGA. :)
To me it seems like PC emulators tend to use the wrong approach with CGA. It's a fixed-frequency standard, so in principle it shouldn't be all that different from what you see in a typical emulator for the C64, Amstrad CPC etc.: the frame is rendered at a fixed resolution and refresh rate, just as the monitor or TV would display it (overscan area included!); the size and positioning of the active raster within that area are determined by the timing parameters sent to the (emulated) display controller.
In an emulator like DOSBox, only the active raster is rendered, and it attempts to dynamically work out the H/V refresh and aspect ratio. Maybe for historical reasons, since it was originally meant for emulating VGA on a VGA monitor. Same goes for 86Box, PCem and MAME... although at least the first two do somewhat better with this demo.
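The fixed-frequency approach described above mostly boils down to deriving the frame geometry from the emulated 6845's registers instead of guessing; a rough sketch (register semantics per the 6845, everything else simplified):

    /* Sketch: derive CGA frame geometry from the emulated 6845 CRTC
       registers, the way a fixed-frequency renderer would. 'hires' means
       an 80-column mode (8-pixel chars at the full 14.318MHz dot clock). */
    typedef struct {
        unsigned char r[18];               /* 6845 registers R0..R17 */
    } Crtc;

    void frame_geometry(const Crtc *c, int hires,
                        int *total_w, int *total_h, double *refresh_hz)
    {
        int hchars = c->r[0] + 1;          /* R0: horizontal total - 1 */
        int lines_per_row = c->r[9] + 1;   /* R9: max scanline address */
        int vrows = c->r[4] + 1;           /* R4: vertical total - 1 */
        int vadjust = c->r[5];             /* R5: vertical total adjust */
        double char_hz = 14318181.0 / (hires ? 8.0 : 16.0);

        *total_w = hchars * 8;             /* 8 pixels per character clock */
        *total_h = vrows * lines_per_row + vadjust;
        *refresh_hz = char_hz / hchars / (double)*total_h;
    }

With the standard 80x25 text mode values this lands at roughly 15.7kHz horizontal and 60Hz vertical, overscan included, just as a real monitor would see it.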
The PC world quickly became “the subset of features that work on PCs and clones” which means that shortcuts are often taken - both in clone development and emulators.
Whereas something like the NES had defined hardware that was mostly identical and so if some feature existed some game somewhere used it.
It's the lowest end of the IBM PC lineup: 4.77MHz 8088 CPU, max 256KB RAM, CGA graphics (4-color graphics and 16-color text modes), PC speaker sound (no sound card). So, this is beyond incredible. IIRC, more than 4 colors are achieved by relying on pixel bleed in CGA's monitor signal output.
"Pixel bleed" sounds like it isn't really doing it justice. More than 4 colors on CGA are usually achieved by using the composite output, which is actually plain NTSC.
In NTSC, color is added to the monochrome signal by quadrature amplitude modulating it onto a carrier whose frequency is actually within the monochrome image bandwidth itself. NTSC did this for compatibility with black-and-white TVs of the time. (If that sounds too complicated, it "just" means that changing brightness very fast creates color.)[1]
To get more than 4 colors on CGA, you "just" switch pixels fast enough that you are effectively modulating the color signal yourself! The problem with CGA is that it doesn't have a graphics mode with high enough resolution to get all those colors. But text mode does, so there are tricks to chop the text lines into pieces, which allow puzzling back together a high-res image that modulates the color signal.
It's pretty cool.
[1] PAL is the same with a twist, SECAM is very different but "changing brightness very fast" is still accurate.
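A toy model of that footnote: sample the 1-bit luminance at four times the subcarrier frequency and take the first harmonic as I/Q; different bit patterns land at different hues, which is the whole trick. (A simplification that ignores the filtering a real decoder does.)

    /* Toy model: the hue/saturation a TV would see from a repeating
       4-sample (one subcarrier period) 1-bit luminance pattern. */
    #include <math.h>

    void artifact_color(const int lum[4], double *hue_deg, double *saturation)
    {
        /* samples sit at subcarrier phases 0, 90, 180, 270 degrees */
        double i = (lum[0] - lum[2]) / 2.0;  /* in-phase component */
        double q = (lum[1] - lum[3]) / 2.0;  /* quadrature component */
        *saturation = sqrt(i * i + q * q);
        *hue_deg = atan2(q, i) * (180.0 / 3.14159265358979);
    }

E.g. the patterns 1100 and 0110 have the same saturation but hues 90 degrees apart, which is why shifting a pixel pattern by one dot changes the color.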
That was how the predecessor 8088 MPH worked.
Area 5150, however, is aimed at a TTL RGBI monitor, and as such does not use NTSC composite artifacting. It can only display 16 colours at once. So in this demo, you get 640x200 in 16 colours, where additional colours are implied by using dither patterns.
(Technically 8088 MPH did more or less the same, except these 'dither patterns' would result in NTSC artifact colours).
Thanks, I should have written "more than 16 colors", forgot that CGA has a 16 color mode without tricks.
Dither patterns/"artifact colors" are exactly what I meant: they achieve color by modulating the luminance signal. That is of course the same as "normal" color, but the trick is that the software influences only the luminance and gets "unintended" colors, instead of directly telling the CGA adapter to modulate the color on (probably over a separate circuit) like the normal 4/16-color modes do.
Actually, the RAM was expanded to 640kB for this demo. This was necessary for some of the effects in combination with the loader. Such expansion boards were available back in the day.
That's fair; my understanding was that this was a pretty common thing to expand on a 5150, but I'd love to know if anyone has personal experience with having owned one back in the day.
Yes, especially somewhat later in the life of the 8088, most applications would demand quite a bit of memory, and the 640k configuration became standard for the later 5160s and most clones.
So I wouldn't be surprised if 5150 owners also upgraded the memory on their machines.
The AST Six Pak Plus card was quite a popular upgrade.
It could add up to 384k, so if you had 256k on board (fully loaded later model 5150), there's your 640k.
If you were unlucky enough to have an early model 5150, the board could only take 64k. That limit was however because 4116 memory chips of 16kbit capacity were used. The board could be modified to take the later 4164 chips of 64kbit capacity, in which case it would take 256k.
Alternatively, you could use two memory expansion cards to bring the memory up from 64k onboard to 640k total. Although I doubt many people would use such a contraption in practice.
Early model 5150s also required a BIOS upgrade, as a bug would prevent the BIOS from detecting more than 544k of installed memory.
So in theory it is possible to have a 5150 with 640k, and no doubt such machines exist in the wild. But it's probably far more common to find a 5155 or 5160 with 640k. So just like with 8088 MPH I guess that's the 'practical' target of the demo.
I still have my PC (that my mom bought for our family, used, in 1987). It has an AST SixPakPlus which expands memory up to 640KiB, a game port (analog joystick and digital buttons), parallel port, serial port, and a real-time clock with battery (if you didn't have that, you'd have DOS ask you for the time and date every time you booted up).
I think it came with software to help you use all that RAM, like a print spooler and a RAM drive. The 1987 version of the manual talks about the software on page ix [1].
(I also noticed the manual mentions you can enable/disable parity checking. I wonder if that would let me leave 1/9 of the chips out since I have a few bad chips and have to use less than 640KiB currently. I should just try to source some replacement chips though.)
Fun fact: The 8-Bit Guy (from YouTube) used to work at AST.
Technically it all runs in 500kb I think (not sure if this includes DOS). And this was intentional because the guys had in mind what people would typically have available.
However I think everyone is going to have to temper expectations regarding what this will run on. Even my effects (the 3D Glenz objects) couldn’t be debugged on an emulator even though it’s a fairly simple tricked up video mode and a bunch of fairly ordinary assembly code doing VRAM writes that are not timing critical for the screaming fast 3D.
There's plenty of YouTube videos about it, search for something like "commander keen cga" or "xt games cga" to get a feel for what it usually looked like.
I'm a little older and spent a nontrivial amount of time on CGA, so this demo absolutely blew my mind.
I am trying to be a bit humble since my experience is definitely limited, but to be clear, I do have some experience with an XT clone at least. Still, I haven't written very much code targeting an XT machine, and especially, I have never mucked with CGA beyond BIOS routines. I've got to imagine that even Commander Keen had to be doing some pretty hot tricks, even though the only novel technique I'm aware of is the EGA adaptive tile refresh one, not anything CGA.
Of course, it's never too late to write some wares for XT... I'd prefer to get an actual XT for that, though. I'd fear that if I only ever tested on emulators or clone machines, I would end up writing code that doesn't work on any actual XT :)
Is DOSBox being used as an example, or was that actually the emulator? I find this intriguing. I would've assumed PCem or MESS was a better choice for writing a demo for something like an XT computer. DOSBox is useful, but it definitely strikes me as something that emulates the software library per se, and the hardware only as an accident of that - a valid approach, but one that is a lot less useful for programmers working under it...
I used the DOSBox debugger to debug several of the effects during the making of Area 5150, but other than that I used real hardware. 86Box is probably the most accurate one at the moment but still isn't accurate enough to run the entire demo correctly.
Given that the demo targets XT-class hardware, it is unlikely that something like PCem or MESS was used, because in those cases you'd actually have to select an XT-configuration to make it work, instead of an AT-configuration.
DOSBox does not have this option; it always emulates AT-class hardware, in which case this demo 'just works'.
Really stunning! I was impressed by the whole demo but one thing I wasn't expecting was the sample-based audio in the track played during the end titles. Another thing that I thought was impressive was the (pseudo?) 3D hills section.
Thanks! The 3D hills are no less 3D than any other 3D scene displayed on a 2D monitor. It's a heightmap (of a real place) rendered with correct perspective.
Getting the samples to output at the correct time during the end titles required a painstaking amount of cycle-counting, measuring, and iteration.
If you have a "stock" IBM PC, just the standard dual 360K floppy drives without a hard drive, it's actually possible to run this demo!
Use DOS 3.30 and FORMAT /S a blank floppy, copy APPEND.EXE to the floppy, and copy as many of the AREA5150 demo files as will fit. Then copy the remaining files to a second floppy. Put the first floppy in A: and boot off it, put the second floppy in B:, and then run APPEND B: followed by AREA5150. Behold, it works!
You absolutely do need the full 640K of RAM though; it refuses to run with only 512K.
If that exceptional talent and art-code had been distributed in the 1980s, it obviously would have had a large effect in many respects (inspiration to others on the platform, definition of new compatibility levels...), but what about distribution?
I feel the demo would have had very limited distribution due to the very restricted platform target: a real IBM 5150/5155/5160 with original IBM CGA.
I think people of the time would probably have tried to make it work on a larger base since maybe 80% of XTs were not IBM's...
My question: what video modes and tricks are solid across most XTs and CGA cards of the era? I guess the 640x200 text mode with 8x2 chars is solid, but what about the other tricks? How portable are they?
Why not make a selection of 3 or 4 very popular and very different machines of the time the target platform? For instance I think of IBM 5150 CGA + Tandy 1000 + Amstrad PC1512...
Most of the tricks in the demo should work unmodified (or could be made to work) on most CGA implementations of the era. Some clones had slightly different font ROMs so it would look a little bit wrong on those. The final lake effect (and the ripply picture shortly before it) use very tight cycle counting so probably won't work on anything except a genuine IBM PC/XT and CGA. Some effects (like the radial fire effect and the voxel landscape) should work on just about anything. The Amstrad PC1512 will have trouble with any effect that modifies the CRTC timing registers as it doesn't have a fully programmable CRTC and always generates a 15.7kHz/59.92Hz 640x200 image. I don't have personal experience with the Tandy 1000 and don't know how compatible it is off the top of my head.
Thank you for your reply, it is comforting to know the tricks are solid. Maybe it will help some newly programmed retro CGA games appear with more colors and effects.
I have a VGA XT clone (4.77/8 MHz) and an ATI Small Wonder CGA clone board. I will try to put them together and run Area 5150.
The ATi Small Wonder is not entirely cycle-accurate. When I studied the CGADEMO by Codeblasters, I found that the scroller didn't look entirely correct, because the Small Wonder was slightly slower: https://scalibq.wordpress.com/2014/11/22/cgademo-by-codeblas...
It appears to insert a few extra waitstates compared to an IBM card.
This may trip up some effects in Area 5150.
I wonder... certain software, such as Flight Simulator and Lotus 1-2-3 had a profound effect on the level of compatibility of clones.
So perhaps if software was released that demanded EVEN MORE compatibility, we would have even more compatible clones than we do now?
Yes, the 'Macrocom Method' section shows how it's done over the RGBI output, as used in this demo - the sample artwork shown there is in fact a much earlier version of one of the still images in Area 5150. :-)
In short, textmode gives you 80x25 characters of 8x8 pixel resolution each, making 640x200 resolution, with 16-colours, where each character can be assigned a foreground and background colour.
The 'Macrocom Method' reprograms the character height from 8 rows to 2 rows, which effectively gives you 80x100 characters. You can then use each of the 256 characters in the characterset as an 8x2 'glyph' with a foreground and background colour to build your images from.
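(The reprogramming itself is just a few writes to the 6845 CRTC; a sketch using the register values commonly quoted for this trick, assuming Turbo C-style outportb:)

    /* Sketch: turn 80x25 colour text into 100 rows of 2-scanline cells.
       Values are those commonly quoted for the "low-res text" trick. */
    #include <dos.h>

    static void crtc(unsigned char idx, unsigned char val)
    {
        outportb(0x3D4, idx);  /* CRTC index port */
        outportb(0x3D5, val);  /* CRTC data port */
    }

    void text_80x100(void)
    {
        crtc(9, 0x01);  /* R9: max scanline address = 1 -> 2 lines per row */
        crtc(4, 0x7F);  /* R4: vertical total, now counted in 2-line rows */
        crtc(6, 0x64);  /* R6: vertical displayed = 100 rows */
        crtc(7, 0x70);  /* R7: vsync position */
    }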
So you can't technically control the colour of every pixel on screen, but you can make a decent approximation with well-chosen character 'glyphs' and foreground/background colours.
This can also be automated with pattern-matching, where you convert an image to glyphs by algorithmically selecting the nearest match.
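A brute-force version of that nearest-match step, as a sketch (glyph bitmaps and palette assumed given; a real converter like the one used here would use a perceptual metric and heavy lookup tables instead):

    /* Sketch: pick the (glyph, fg, bg) combination that best matches an
       8x2 block of the source image. 'font' holds 256 glyphs of 16 bits
       (one bit per pixel), 'pal' the 16 CGA colours as RGB. */
    typedef struct { int r, g, b; } Rgb;

    static long dist2(Rgb a, Rgb b)
    {
        long dr = a.r - b.r, dg = a.g - b.g, db = a.b - b.b;
        return dr * dr + dg * dg + db * db;
    }

    void best_cell(const Rgb block[16], const unsigned short font[256],
                   const Rgb pal[16], int *ch, int *fg, int *bg)
    {
        long best = -1;
        int c, f, g, p;
        for (c = 0; c < 256; c++)
            for (f = 0; f < 16; f++)
                for (g = 0; g < 16; g++) {
                    long e = 0;
                    for (p = 0; p < 16; p++) {
                        int on = (font[c] >> p) & 1;
                        e += dist2(block[p], pal[on ? f : g]);
                    }
                    if (best < 0 || e < best) {
                        best = e; *ch = c; *fg = f; *bg = g;
                    }
                }
    }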
While this is cool, I think it would be fun to try, for example, pushing a 386/VGA to its limits as well (then with an SB card, of course). Or a later computer.
Or even take a modern computer, use an fb driver and just make the best demo you can. No 3D graphics, just a framebuffer.
My intuition is that "limits" can be moved by some exponential exertion of mental effort. Limits are not exactly known and hard when it comes to this stuff. This is probably the current known limit, but there's no guarantee that someone else won't beat it in 20 years ;)
I guess as the capacity of the machine increases, so does the exponent on the effort required to move limits beyond the obvious.
In some ways this demo demonstrates that. The Commodore 64 was/is ahead of the 5150 in demoscene terms in multiple ways; for instance, it has a more capable sound chip (I have a hard time imagining better sound coming out of a PC speaker, but again, this is an "obvious" limit, and there's no saying that someone can't hack the PC speaker to do cooler stuff than what's currently being done on the SID chip). Other "obvious" advantages of the C64 are that the VIC has sprites and scroll registers, but this demo blows some of that out of the water.
In other words, there will be the "obvious" solutions that anyone can see (and their related limits). More advanced thinking leads to less obvious solutions (moving the known limit), and even more advanced thinking leads to even less obvious solutions (again moving the known limit).
Demos like Area 5150 are about pushing the hardware beyond what people thought was possible. If you were to show it to one of the 5150 engineers back in the day, they would probably struggle to explain how it was done.
There is definitely a place for the types of demos that you are talking about. Some of them are being made, though the introduction of artificial constraints is common (e.g. the size is limited to 64 kB or 4 kB). There is even value in creating the best demo you can, even if it doesn't impress the demoscene, simply because showing what you can do will likely exhibit skills beyond those of a run-of-the-mill developer.
I like to think that the original 5150 engineers would get it, but probably won't like it. "What in the nether hells are you doing with our respectable Business Machine?! Go play with an Atari!" ;-)
The demo gods have done it again. Wow. I wish they would explain more about how they're managing to blend colors using shade blocks. Because doing that on this scale either requires an extremely advanced knowledge of physics, or the fanatical dedication of a lonely artist.
It doesn't really - for this particular technique an ANSI-art editor should do (and was in fact used for almost all graphics in this demo, to one degree or another). :)
I suppose there's a wee bit of physics if you want to fine-tune the results for a particular CRT monitor... not that we've really done much of that for the party-version, but there's some research on the original IBM 5153 CGA monitor at https://int10h.org/blog/2022/06/ibm-5153-color-true-cga-pale....
It is hard. Anyone can mix colors by hand in an ANSI art editor. However, this demo is huge and certainly appears to be programmatically generated at times. It's very hard to program a computer to blend colors using CP-437 shade blocks, because you're not actually linearly interpolating two RGB values like you would normally in graphics programming. The blending happens in physical space, hence physics.
Read the section about gradients in the link you provided. They say they tuned the gradients, but they didn't explain how they did it. For example, was it just spending countless hours eyeballing color combinations until it looked nice? Or did they use a formula like CIELAB ΔE* to predict how background / foreground / shade block combinations would be perceived? https://en.wikipedia.org/wiki/Color_difference#CIELAB_%CE%94...*
Those formulas are much more complicated than the RGB((r1+r2)/2,(g1+g2)/2,(b1+b2)/2) blending that computers use in software (which just punts the problem to the chosen color space). Notice also how the CIE needed to keep revising the formulas over the decades, since color isn't something we entirely understand yet.
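(For reference, even the simplest of those formulas, the 1976 ΔE*ab, is just Euclidean distance in L*a*b* space; the later revisions bolt weighting terms onto this, and the hard part is really the transform into L*a*b* in the first place:)

    /* The 1976 CIELAB colour difference: Euclidean distance in L*a*b*.
       Later revisions (ΔE94, ΔE2000) add weighting terms on top. */
    #include <math.h>

    double delta_e76(double L1, double a1, double b1,
                     double L2, double a2, double b2)
    {
        double dL = L1 - L2, da = a1 - a2, db = b1 - b2;
        return sqrt(dL * dL + da * da + db * db);
    }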
Oh I certainly won't say that it was easy, mind you. The artwork was just done over a long, long period of time. As mentioned elsewhere in the comments, my 8088 MPH blog post from 2015 already shows an early version of one of the images (and a couple are even older). :)
Our toolset for this demo did include an image converter (CGAArt), which can in fact use several versions of the CIELAB formulas for its metric, among others. That's by reenigne from our team, who has commented here, so I'll let him elaborate on it if he wishes. Personally, when I do this sort of artwork, I prefer to tailor it by hand to the target video mode; as you noted, certain parts were indeed converted programmatically, but a lot of that was down to time constraints prior to the party release. In the final version, I plan to rework/redo those.
That's amazing. I would love to see a blog post, for instance, talking more about the math that went into taming the unique visual blending properties of the IBM PC! The way you all hacked shade blocks is just as impressive as all the hacks you put into things like the CRT and RAM. Most importantly, the shade blocks are readily available for creative applications today thanks to Unicode.
The programmatic transitions between still images use linear RGB space, which is the correct way to interpolate between two colours. The maths behind it is pretty simple - essentially just reversing the gamma correction from normal (0-255) sRGB space before the interpolation and redoing it for the final colour (no need to get into the hairy areas of LAB or perceptual colour spaces for this). Once we know the colour we want, we choose the character (from a list of 6) and attribute which most closely matches that colour. Of course, it's all done using lots of lookup tables so that we can process several hundred character cells per frame.
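A sketch of that decode-interpolate-encode step, per component (as noted, the demo bakes this into lookup tables rather than computing it per cell):

    /* Sketch: interpolate two sRGB components correctly by converting
       to linear light first, then back. */
    #include <math.h>

    static double srgb_to_linear(double c)        /* c in 0..1 */
    {
        return c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4);
    }

    static double linear_to_srgb(double c)
    {
        return c <= 0.0031308 ? c * 12.92 : 1.055 * pow(c, 1.0 / 2.4) - 0.055;
    }

    double mix_srgb(double a, double b, double t)  /* t = 0..1 fade position */
    {
        double la = srgb_to_linear(a), lb = srgb_to_linear(b);
        return linear_to_srgb(la + (lb - la) * t);
    }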
My point is that sRGB linear interpolation does not accurately model the blending of color that happens in nature, which is something that should happen with CP-437 shade blocks. Here's a web page I set up so you can see for yourself: https://justine.lol/color.html
Even within the context of sRGB itself, it's not "correct" at all. For example, yellow and blue should make pink, but sRGB predicts it as being grey. Depending on how good your monitor is, the shade blocks should make pink, but even that is usually thwarted somewhat by subpixel layout. For example, if you drag the window around on an LCD, the coloring of the shade blocks will change weirdly as subpixels move. Subpixel layout shouldn't be an issue with IBM PC CRT monitors; I wish I had one to confirm what it does. In general this is just one of the weaknesses of the sRGB colorspace. If you use something like CIELAB, then it does predict pink, but that space has weaknesses of its own.
Those problems are nowhere near as challenging as shade blocks, though, because linear interpolation just chooses a new color and doesn't need to predict anything. To choose colors for shade blocks you're applying your model to predict a more natural phenomenon.
There's a difference between linear interpolation of sRGB values (what you're doing on your page) and linear interpolation of linear RGB values (which gives better results). This is the explanation that made it all fall into place for me: http://www.ericbrasseur.org/gamma.html?i=1 . The linear RGB colour space is also a linear transformation of the CIELAB space so doing the interpolation in this space is equivalent to transforming to CIELAB space, doing the interpolation there, and then transforming back.
However, for the purposes of Area 5150 I think the differences between sRGB interpolation and linear RGB interpolation would have been too subtle to notice since there are only 6 * 16 * 16 = 1536 dithered colour/pattern combinations to choose from in the first place - the error introduced by that quantisation is likely larger than the sRGB vs. linear RGB difference. But I used linear RGB anyway, just to be correct about it.
I agree that gamma correction helps, but we're not talking about gamma, we're talking about color. Here's a version of the page I shared earlier with ITU-R BT.709 gamma correction applied: https://justine.lol/color-gamma.html It still has the same color problems. In fact they should be easier to see now that the white/dark issue is fixed.
Could you explain how linear interpolation is different from sRGB interpolation? I would have thought they were the same thing. If by sRGB you mean interpolating but being lazy about gamma, I'll be the first to admit that's just plain old incorrect, even though laziness is sometimes a virtue.
Also are you one of the demo authors? If so we could probably move this conversation to Discord or email and we could try some more blending methods!
Yes, I'm reenigne from the demo. Feel free to contact me at andrew@reenigne.org.
I'm not sure what you're doing with the colour mixing on that page but I'm wondering if you're just applying a gamma curve to the mixed result. This is what I meant:
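Roughly, with a plain 2.2 power curve standing in for proper sRGB gamma:

    /* average a and b (0..255 sRGB components) in linear light */
    mixed = 255.0 * pow((pow(a / 255.0, 2.2) + pow(b / 255.0, 2.2)) / 2.0,
                        1.0 / 2.2);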
The real gamma correction formula is actually slightly more complicated than that, because it's linear up until the sRGB value is about 10 and then follows a ^2.4 curve, but the difference is too small to notice.
Ah, I think I get the original question now - you referred to the dynamic blending during the fade transitions, not to the static dithering used in the actual image data. If so, disregard my previous reply (see ajenner's instead). :)
Area 5150 is a great double entendre. Area 51, well, aliens and all of that, and 5150 is the involuntary commitment code in California for mental health holds.
Correct, as "XT" stands for eXtended Technology.
You can't extend it without having the basic technology first. The 5150, aka the PC, was that basic technology.
However, in terms of CPU, graphics, sound and all other aspects relevant to this demo, the 5150 and 5160 are interchangeable.
The main differences between 5150 and 5160 are:
5160 upgraded from 5 to 8 ISA slots (also changing the size of the bracket, which has been the standard ever since, even today with PCI-e 5.0).
5160 removed the cassette port.
5160 came with a hard disk as standard (and as a result, a more powerful PSU, as the stock PSU of a 5150 was insufficient for a HDD, and generally required an upgrade).
Memory-wise, there are various different revisions of 5150 and 5160 boards, which can take different amounts of memory onboard.
For the 5150, the early revisions took 16k-64k, the later revisions took 64k-256k.
For the 5160, the early revisions took 64k-256k, the later revisions took 256k-640k.
In all cases it is possible to add additional memory up to 640k via an ISA card.
So it is possible to have a 5150 configuration with the full 640k that the most high-spec 5160 had.
Then there is also the IBM 5155, the 'portable' PC. This is essentially a 5160 motherboard in a modified case, with an integrated CRT. As such, it is also fully compatible with the 5150 and 5160, and can run these demos flawlessly.
Not on the original PC/XT machines, as the ISA bus was basically the CPU data bus. So regardless of whether you had memory on board or on an ISA card, data transfers were exactly the same speed. On later machines, the CPU would be clocked faster than the ISA bus, so this was no longer true.
As I remember it, the term "XT class" was used back in the day to distinguish between "AT class" PCs and those prior - i.e. to mean an IBM PC, XT or comparable machine. So (weirdly enough) the 5150 was considered XT class despite predating the XT (they're almost identical as far as software is concerned anyway). The term "PC class" wasn't a thing because PC came to be shorthand for any IBM PC/XT/AT or later x86 machine. Some more powerful machines like the Amstrad PC1512 (with an 8MHz 8086) were also considered XT class - these machines could run games like Bruce Lee and Digger, which were designed for the PC/XT (and which ran too fast on AT and later machines), though the gameplay was quicker than intended, making them extra-challenging.
Yeah, that seemed to be the usage in most contemporary literature/magazines. There are also those other architectural factors like the bus width, the number of PICs, the keyboard subsystem and so on, which is why you could have "XT-class" 8086 and even 286 machines (like a Tandy 1000 model or two).
What may also have had an impact was that clones were generally XT-clones (using the newer, smaller ISA card layout, and no cassette port), not PC-clones. And indeed, they were specifically advertised as 'XT-clone', 'XT-class' and such.
So the 5150 is really the oddball here.
Back in the old days I understood the usage as follows: 'PC' was a catch-all term for all IBM PC-compatible machines that ran DOS. 'XT' was the term used for machines with an 8088 (or sometimes V20) CPU (and of course there was the 'Turbo XT' subclass for CPUs running at more than 4.77 MHz). 'AT' was the term for 286 machines.
After that era, people just identified PCs by the CPU used, so you had '386es', '486es', 'Pentiums' etc.
The 5160 barely had any improvements. Except for extra slots and including a hard drive, it was essentially the same. The AT had real performance improvements. I remember using one in 1987 or so at my dad’s office.