The other day I was explaining the inner workings of a PC to my son as we were upgrading his computer. I wasn't sure what SATA stood for, so I looked it up: "Serial ATA", a successor to "Parallel ATA". So, OK, what's ATA? "AT Attachment". Great, so what's AT? This actually took a few clicks. Sure enough, it's "Advanced Technology" and comes from the IBM PC/AT released in 1984.
Maybe that's obvious, but for some reason that tickled me. Plugging in a wire to a 2TB solid state drive, named after a tech from three decades earlier? Awesome. I'm sure there are lots of terms that are direct descendants of the original PC (including x86 itself), but this one caught me. Imagine telling the guy who decided on "AT" that the term would still be used 36 years later.
Anyways, as crazy as it seems to mess with a CGA computer nowadays, it's all related. One tech leads to another which leads to another, each usually somewhat compatible or based on the generation before, so that oddities/limitations and workarounds from back then have a way of trickling down through the years.
My favourite example of this is /dev/tty, where "tty" means Teletype, a technology which predates computers entirely: four of the control characters in ASCII have existed, with their current numeric values, since 1901.
Personally my favorite has to be the "lp0 on fire" error message that stuck around until fairly recently. The origin of that message is from around the time of the early laser printers.
Laser printers have a small oven built in, called a fuser, which melts the toner onto the page. On modern printers it only gets turned on right before use. On the early models the warm-up and cool-down times were much longer, so the fuser would both turn on earlier and take much longer to get back to room temperature. Those things combined meant that a paper jam could very easily turn into a fire. In fact, if the jam detection was having an off-day, it could turn into a conveniently paper-stoked fire.
Of course there's no real way to remotely detect whether a paper jam is the fire-causing kind, so any paper jam would tell you the printer is on fire, just in case it actually is.
In short: Early laser printers came with a fire extinguisher included in the sale price for a good reason.
Back in high school I used to spend way too much time playing with mode 13h. I used Turbo Pascal, which allowed some inline assembly syntax [1], so one would initialize the 320x200 256-color mode like:
BEGIN
  asm
    mov ax, 13h   { AH = 00h (set video mode), AL = 13h (320x200, 256 colors) }
    int 10h       { BIOS video services }
  end;
END;
and put a pixel on the screen like:
{ VgaMem is assumed to be declared over the mode 13h framebuffer at $A000:0000: }
VAR VgaMem : Array [0..63999] of Byte Absolute $A000:0000;

Procedure Putpixel (X, Y : Integer; Col : Byte);
BEGIN
  VgaMem [X + (Y * 320)] := Col;
END;
The palette was just 256 colors, but you could pick which RGB colors to use [2]. Some effects were based on palette cycling; others (fire!) would look nice by focusing on fewer hues.
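For anyone curious how picking those colors worked, here's a minimal sketch (my own illustration, not the original poster's code) of programming one VGA palette entry through the DAC registers at ports $3C8/$3C9; R, G and B are 6-bit values (0..63):

{ Minimal sketch: set one of the 256 VGA palette entries via the DAC.
  Port[] is Turbo Pascal's built-in I/O port array. }
Procedure SetPaletteColor (Index, R, G, B : Byte);
BEGIN
  Port [$3C8] := Index;  { select the palette entry to write }
  Port [$3C9] := R;      { then write red, green, blue in order }
  Port [$3C9] := G;
  Port [$3C9] := B;
END;

Palette cycling is then just rewriting a range of these entries every frame (rotating the indices) without touching a single pixel, which is why those effects were so cheap.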
The 8088 was the second CPU I coded assembler/machine code on, and mode 13h was naturally my favorite mode until fatter graphics adapters came around. But let's not forget that this was actually achieved in mode 4, if my memory serves me well. Mind blown.
I find this kind of work interesting and important because, just like old CGA monitors, our own bodies are also a type of hardware that we have little ability to upgrade. Imagine finding hacks similar to this which would allow the human body to perceive things outside of what it's built to perceive.
> Late in his life, Claude Monet developed cataracts. As his lenses degraded, they blocked parts of the visible spectrum, and the colors he perceived grew muddy. Monet's cataracts left him struggling to paint; he complained to friends that he felt as if he saw everything in a fog. After years of failed treatments, he agreed at age 82 to have the lens of his left eye completely removed. Light could now stream through the opening unimpeded. Monet could now see familiar colors again. And he could also see colors he had never seen before. Monet began to see - and to paint - in ultraviolet.
I must admit I have never thought of Monet as a Cyborg Enhanced Superman - but that has changed today, thank you; the site itself also looks quite interesting!
I was born with cataracts in both eyes. One was removed when I was a toddler (including the lens), and the other (with better vision) was left alone until I was a teenager at which point I got an artificial lens implant. I can't see ultraviolet out of my eye without a lens as far as I can tell (or maybe I can and just never noticed).
The argument in favor of it is that Monet was an exceptional trained painter, and one specialized in colors on top of that. He might have been extra sensitive to colors compared to most humans (both in the sense of biological sensory sensitivity, as well as trained ability to perceive colors).
Color vision is encoded on the X-chromosome, right? And human tetrachromacy depends on having two X-chromosomes with different color vision genes[0]. So wouldn't a hypothetical tetrachromatic male need to be an XX-chromosome male[1], and on top of that have one of the X-chromosomes carrying the deuteranomaly, protanomaly or tritanomaly mutation, and on top of that have this mixture manifest itself as having four color receptors instead of three? Or is there another theoretical way that I'm missing?
If my educated guess is correct then "somewhat unlikely" seems like quite an understatement to me! :D
(I have protanomaly myself, and never considered that men could hypothetically be tetrachromats as well. This was a fun thought exercise.)
Myself, I've had ICL surgery (Implantable Collamer Lens) as vision correction in both eyes (I wasn't a good candidate for Lasik). It includes 100% UVA/UVB blocking. It's basically a permanent contact lens implanted behind the iris and in front of the natural lens. So, sadly, no ultrahuman UV vision for me. My only "bionic" ability is saving my natural lens and retina from UV exposure.
I don't know if other people can see it looking at my eyes, but in certain low-light conditions I do see a subtle elongated 'X', which looks like a mostly transparent '>---<', at the bottom of my vision. From talking with my doc, it's a small side effect of how they fold the lens implant when they insert it, to minimize the size of the incision in the cornea. I don't normally notice it, and since it's in my peripheral vision, it's pretty easy to ignore.
It's just really really blurry. I can see colors fine, and if I zoom OSX all the way on my external screen, I can read text. If I only had that vision, I'd probably be able to move around in the world, ie walk to the store, but I'd have a hard time doing many things I take for granted.
Vernor Vinge's A Deepness in the Sky features an alien race which, because they evolved from genetically engineered spiderlike creatures, have an interesting condition: in addition to many eyes, their retinas also have a fantastic range of spectral response, from infrared to ultraviolet, so that their visual input feed exceeds their visual processing capability.
The hack is that their visual cortex is multiplexed in a way that certain colors or color combinations might be interpreted as feeling like seeing "plaid" or "tartan" even though the color is solid.
(This is unrelated to the article, but given that Vinge is one of my favorite authors I couldn't resist replying to this.)
It's been a few years since I read "Deepness", but I don't recall that the Spiders were ever referred-to as genetically-engineered. I'll definitely look into that.
re: the perception of color - I took that to be the Focused translators' interpretation of the spiders' vision, rather than a literal description of the spiders' cognitive process. I've always read the portions of the book that take place on the spiders' world as being a narrative created by the translators, in which the incomprehensibly-alien spiders are made more "relate-able" by analogizing alien concepts with human concepts.
I wouldn't do it myself, but I found the various compass implants pretty fascinating. They vibrate differently depending on what compass direction you're facing. Supposedly, after some time, you don't "think" about it, it just becomes more like an innate sixth sense.
Love it when that kind of discovery learning happens. Makes me feel excited about the fact that so much is still new to us, as people with our limited time and experience.
NTSC is a lot of fun to hack. I did a lot of projects similar to this one as a kid.
Basically tried artifact color on many machines.
The Tandy Color Computer 3 has a 640x200'ish mode with 4 colors. That one boils down to one byte per pixel, using composite video input.
Also, the CPU is a 6809! So much byte per pixel fun.
All this because of how NTSC works.
PAL, by the way, can be abused vertically to get additional colors. That can be seen in many Demoscene art productions.
I remember being utterly astonished at the realistic music and sound effects in (what I assume was) the original "Carmen Sandiego" video game. I would repeatedly exit and relaunch the game just to listen to the intro song.
When the local library got VGA monitors, us kids would look at the digital encyclopedia and those VGA pictures were like the future. Some encyclopedia pages even had VGA quality movie recordings... it was like the dawning of the future.
I got into some super serious trouble as a kid in elementary school for turning on the computer while a substitute teacher was there, and messing around with the built-in encyclopedia. Even before we had reliable internet, it was amazing to me that there was just this wealth of information on this tiny little CD-ROM.
I wonder if that sense of wonder is lost in the modern world? It's such a joy to stumble on some fascinating new relic and descend, however briefly, into the depths of its madness. I hope we can still find new ways to inspire that joy now that most of the world's information is available through a device in your pocket. Which... looking back on the last two decades is kind of an astonishing leap.
Getting in trouble for looking around on the computer is also a memory I have from certain elementary school teachers. They must have been so paranoid that a kid would wreck the machine by touching it wrong... such expensive things, so hard to fix by their standards.
There were "digital" encyclopedias before Wikipedia? Using VGA graphics and tiny little CD-ROMs (700MB each IIRC, no double-layer tricks or anything) even? That's amazing.
Anyway, VGA graphics just means you don't need to store megabyte sized images since you can't display them anyway. 700MB is quite a lot if you're just trying to replace a paper encyclopedia.
I genuinely thought CGA was only capable of black, white, cyan, and magenta. Most developers seemed to think the same. Whatever a handful of hackers and the people building the hardware knew, to most of the public that's what CGA was. I would've been excited enough for a green + brown mode.
Is there a live capture of the audience reaction? They must have completely lost it when the 1k colors were revealed.
EDIT: heh, they waited until the non-static images. Kind of makes sense - unless you have a really good feeling for the era this hardware is from it might not be immediately obvious how impressive of a technical feat this is from image output alone.
This mode was well known in the mid 80s. I wrote a shareware demo of the super color mode in MASM that swirled and formed patterns. This had to be around '86 or '87. (That code is sitting on an old MFM disk that I can't read.)
It was a fun mode, but too jaggy to do much with.
If you actually have diskettes from that era that you care about, you might want to image them using a low-level magnetic flux reader - the devices are actually quite inexpensive, and this would ensure that the data is preserved for the foreseeable future.
Well, he said an old MFM drive, so he's probably referring to an MFM hard drive rather than diskettes. That was the common parlance for MFM HDDs back then.
> you might want to image them using a low-level magnetic flux reader - the devices are actually quite inexpensive
Link? Googling that term shows at most probes that register a single scalar value, not something that can image a diskette. (Nor were those "quite inexpensive")
I wouldn't agree with "quite inexpensive" (they start at $100~$200, and then you need a drive as well), but there are some options out there for low-level magnetic floppy imaging.
As an alternative to going hunting for hardware that will read your MFM disks, you might also have an idle look around for retrocomputing groups that are nearby(ish). Someone might live within a viable distance, or you might be able to bring the disks to a meetup. I understand sending disks off is also a thing, but given their age and the weakness of the magnetic fields in them I'd shy away from subjecting them to potential erasure via postal conditions. Unsure if this perspective is overparanoid.
How many colors were you able to get with moving graphics? From the article the CRTC hacking needed to get the last 512 colors is too CPU intensive for anything but static images on an 8088.
Neat! Old-timers may recall this composite pattern and inter-pixel artifacting was how Tandy Coco games achieved limited colours in PMODE 4. They've really taken this to the ultimate level.
It makes one wonder how much more we could squeeze from hardware from 20 years ago (PlayStation 2? Pentium 4?), or even from today! (And in this age of abstractions over abstractions over electron... well, there's a whole lot that could actually be squeezed)
It's too bad that modern-day hardware does not allow a similar depth of hacking decades after release. CDs could still be overburned to significantly larger capacity, yet there is nothing like that for Blu-ray.
"CDs could be still overburned to significantly larger capacity, yet there is nothing like that for Bluray."
Of course there is.
I have no idea what it is, and no interest in that particular technology, but without question there are edge cases and weird engineering loopholes and strange behaviors in any technology like that.
The reason they have reached this depth (1024-color CGA) is the decades they have spent hacking away at it, not some aspect unique to that platform.
... if you are at all interested in stories like this of bending hardware (in this case, the Atari 2600) to the will of the programmer.
(The Atari 2600 didn't have a framebuffer - you had to tell the electron beam what color to paint in real time, sync'd to the clock of the CPU. This was called "racing the beam")
The Commodore 64 and Apple //e had double hires mode by turning on two different video modes and switching between them to get more colors and resolution.
I still remember my first EGA card. I downloaded that picture of Smaug from a BBS. I loaded it up on my screen, stood at the other side of the room, and excitedly declared that from that distance it looked as good as a photo.
If I'd seen this with CGA, I'd probably be unable to function for a week.
Could somebody explain in simple words how they can achieve the 1024 colors with CGA? Is this some kind of trickery? The article is too technical and demoscene-targeted for me...
If it is possible, why were there no CGA games with so many colors? Thank you!
CGA supports a 16-color 160x100 graphics mode as a hack on the 16-color text mode. A few games used it, but not many. See http://tunneler.org/low-res/
In this mode, if you use a dithering pattern you can generate 256 (16x16) dithered colors.
When connected to a composite monitor, the dithering pattern appears as a solid color, giving you 256 unique solid colors at a resolution of 80x100.
In a composite video signal, different pixel positions produce different colors, so if you shift the dithering pattern over by one pixel, you get an additional 256 colors.
The author was able to find four such dithering patterns, for a total of 1024 (256x4) colors. However, the last two dithering patterns (the last 512 colors) require some low-level scanline hacking.
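To make the layout side of that concrete, here's a rough sketch (mine, not the article's code) of how one such 80x100 "pixel" gets written, assuming the CRTC has already been reprogrammed for the 100-row text-mode hack described above. Each cell is just a character/attribute pair in CGA text memory at $B800:0000: the character supplies the dither pattern, and the attribute's two nibbles supply the pair of palette colors that the composite signal blends into one perceived hue.

{ Rough sketch, assuming the 80x100 text-mode hack is already set up.
  Pattern is the dither character (e.g. $55); Fg and Bg are the two
  CGA colors (0..15) that the composite output blends together.
  (Background colors 8..15 need the blink bit disabled.) }
Procedure PutCell (X, Y : Integer; Pattern, Fg, Bg : Byte);
VAR
  Offset : Word;
BEGIN
  Offset := Word (Y) * 160 + Word (X) * 2;       { 80 cells * 2 bytes per row }
  Mem [$B800 : Offset]     := Pattern;           { character = dither pattern }
  Mem [$B800 : Offset + 1] := (Bg SHL 4) OR Fg;  { attribute = bg/fg nibbles  }
END;

Changing which dither character you use is what buys the extra sets of 256 colors described above.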
> Naturally, there are downsides: having to mess with the CRTC every couple of scanlines is quite taxing for the poor 4.77MHz 8088, so there's not much you can do with this other than static pictures. The 512-color variant, using only ASCII 0x55 and 0x13, doesn't suffer from this – it's basically "set and forget", requiring no more CPU intervention than any 80-column text mode (the familiar overhead of avoiding snow).
> Then, there's that other problem which plagues 80-column CGA on composite displays... the hardware bug that leads to bad hsync timing and missing color burst. There are ways to compensate for that, but none that reliably works with every monitor and capture device out there. This proved to be an enduring headache in calibrating, determining the actual colors, and obtaining a passable video capture of the entire demo... but that's all covered elsewhere.