I was the lead and pretty much only programmer for the Canopus Pure3D. Ours had 12 RAM chips for 6 MB of memory, 2 MB extra. Woo hoo! We also featured a Chrontel chip for composite and S-Video output. I wrote our DirectX driver (performance enhancements) and other sundry software for the Pure3D (pictured) and Pure3D II. Pretty cool to think back on those days.
Would love to know more.
I recollect the Pure3D being by far the best V1 card. Because of this many gamers patiently waited for the Canopus version of the V2, expecting it to be a similarly significant leap up from the other OEM cards. IIRC it took ages to come out, and was barely different functionally (just smaller and with some kind of LED to indicate it was switched on)? And people were pretty annoyed to have waited all that time not joining the V2 party :)
I owned one of these and it was wonderful! Wild to find someone who worked on it. I needed it because I needed EverQuest to be high res, and couldn’t do SLI (not enough free PCI slots!) so the Pure3D nailed it for me.
It’s crazy how fast stuff moved back then. Voodoo 1, to Voodoo 2 (both hits) to Voodoo 3 (meh), to Voodoo 4/5 (debacle), to bankruptcy was less than 4 years. In the same timeframe, we went from a 200 MHz Pentium MMX to a 1 GHz Pentium 3.
What’s happened since 2015, by contrast? Many people still swear by their 2015 MBPs, the last model before the touchbar debacle. That Haswell based machine was getting long in the tooth even back then.
Eh, outside of desktop CPUs things have continued to move pretty quickly.
Looking at GPUs, the Voodoo3 came out in 1999 and had a fillrate of around 150 MPixels/sec. Today's GPUs have a fill rate of around 300 times that, which gives an annualized increase of over 30%, for 20 years. This doesn't even consider the fact that GPUs went from highly specialized parts to general purpose compute chips.
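For the curious, that annualized figure checks out; a minimal back-of-the-envelope sketch (the ~150 MPixels/sec and ~300x numbers are just the rough figures from the comment above):

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double voodoo3_fillrate = 150e6; /* ~150 MPixels/sec, rough figure from above */
        double speedup = 300.0;          /* rough ratio to a modern GPU's fill rate   */
        double years = 20.0;

        /* Compound annual growth: speedup = (1 + r)^years  =>  r = speedup^(1/years) - 1 */
        double r = pow(speedup, 1.0 / years) - 1.0;

        printf("Modern fill rate ~ %.0f MPixels/sec\n", voodoo3_fillrate * speedup / 1e6);
        printf("Annualized increase ~ %.1f%% per year\n", r * 100.0); /* prints ~33.0% */
        return 0;
    }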
Lower power CPUs are also impressive. 20 years ago the state of the art was the DEC/Intel StrongARM. A modern smartphone SOC is around 50x faster than that per watt. If you look at floating point stuff it's more like 1000x faster.
As manufacturers continue to make increasingly questionable design choices, it's natural that users will become hesitant. You can get a possibly not-that-much-faster system, at the risk of losing functionality that has worked for a long time.
> In the same timeframe, we went from a 200 MHz Pentium MMX to a 1 GHz Pentium 3.
...and most if not all of your existing hardware and software would continue to work unchanged; on top of that, you got new features. I remember going from a 233MHz Pentium to a 2.4GHz P4 and pretty much everything was the same, just faster. The situation isn't so clear now.
To summarise, with the 90s to early 2000s each upgrade really was an upgrade. Now, not so much.
The only time I really notice the difference from a SATA SSD to M.2 NVMe is with a node project build, or running certain background services that are disk heavy... General use, not nearly so much.
I'm not sure what you have in mind... generally speaking Windows software keeps working. PCs aren't speeding up much anymore, but what functionality is lost? I can still run (and I do run) software from the mid and late 90s on the Windows 10 PC I bought four months ago.
On Windows 10, I was unable to install "Flight Simulator 2004" that I bought in 2007. Have you tried to install Word 6 (mid 90s) or Visual C++ 1.52 on Windows 10?
There's plenty of good reasons to bash Microsoft, but 'lack of backwards compatibility' just isn't one of them. If anything, I think their obsession with backwards compatibility drove a lot of compromised design decisions at that company.
The history of Microsoft is very long. Backwards compatibility was excellent up to and including Windows XP. This focus on compatibility has largely disappeared in more recent versions. I am not bashing Microsoft. I love Visual Studio Code. I love the fact that I can edit Office documents (PowerPoint and Word) for free using Chrome on Ubuntu. Maybe the work Microsoft is doing on Windows Subsystem for Linux will make me come back to Windows one day.
It seems like you're talking about the fact that they dropped 16-bit compatibility in 64-bit editions. XP was the first consumer version of Windows with 64-bit support so that timeline would fit. There were various significant technical challenges which caused them to drop 16-bit support in 64-bit Windows and I don't think it's fair to imply that they lost their focus on backwards compatibility just because of that one regression.
> There were various significant technical challenges which caused them to drop 16-bit support in 64-bit Windows
That's the official answer. The more nuanced and correct answer is that DOS (16-bit real mode) apps won't work due to the hardware lacking the functionality --- blame AMD, not Microsoft, for that --- but there's no such limitation for Win16 (16-bit protected mode) apps. Thus, WINE on 64-bit Linux will handle 64, 32, and 16-bit Windows apps, and there's no emulation, unlike with DOSBox and such.
To add even more nuance, even protected mode 16-bit Windows ran Windows apps in v86 mode. Essentially protected mode was a hypervisor you could launch from DOS and your actual Windows session would run on top of that. Before the VT-x extensions to x86-64, there was no more access to virtual-86 if you were running the processor in 64-bit long mode.
Considering that, it's no surprise that 16-bit Windows apps would no longer run on 64-bit Windows, especially since NTVDM was necessary to run them on 32-bit NT based versions of Windows anyway.
Why do you think backwards compatibility dropped? Just recently they decided to not build newer versions of .NET Framework on top of .NET Core (even though they have open sourced practically everything on .NET) to ensure that older programs built on .NET Framework will keep working and won't face issues from any incompatibility with .NET Core.
IMO they are making their life a lot harder by introducing all those frameworks and platforms every five years or so, but despite that they still make sure that stuff keeps working.
16-bit Windows software will not work natively on 64-bit Windows. However this isn't a recent change; it has been the case since the introduction of 64-bit Windows, and consumers have been getting 64-bit Windows for more than a decade now.
There are solutions, however.
For 16-bit Windows software you can try https://github.com/otya128/winevdm which provides 16-bit emulation and pass-through of Windows API calls to a mix of Wine and Win32 native calls. It isn't perfect but it can run a lot of Windows 3.x software. You can either drag and drop the executable onto the emulator's exe file or use the supplied registry files to install it system-wide.
DOS software tends to work fine under DOSBox or vDos (a DOSBox fork explicitly made for text-based DOS applications).
Now I'm not saying you cannot find cases where stuff doesn't work, but in my experience most things work out of the box or with a few little tweaks.
> and most if not all of your existing hardware and software would continue to work unchanged
It’s funny how backwards compatibility back then allowed us to have massive progress, by making gradual steps forward possible.
These days, where we are seeing less and less real progress, we now hear many programmers claiming we have to sacrifice and shed backwards compatibility if we want further progress. (I’m looking at you Apple and Google!)
Something is very very backwards. And yes, an upgrade may no longer be an actual upgrade.
I think this is partly because things have gotten vastly more complex & abstract, and none of that complexity and abstraction has been managed well. We basically always say "Yes right away" to increased complexity/abstraction these days when PM wants something.
Tech debt was lower back then.. maybe all of our practices were worse, but the code bases were thousands of times smaller which made up for a lot.
Heck, Agile hadn't even been invented back then. You got requirements and got to design the core technology first. There was no such thing as an MVP hack where all of a sudden you have a back end with millions of lines of code and 10 layers of abstraction, all built on a bad design.
It's an industry-wide Duke Nukem Forever situation.
If you keep re-laying your foundation, then you're also going to have to keep re-building whatever was built on top of it, and that's just a huge time suck.
We have accomplished the ideal of "wei wu wei" to an exquisite degree, but, unfortunately, we got our wei and our wei mixed up.
In more recent years the same thing happened in mobile phones and tablets, and then watches.
In 5-year spans they all had 10x increases in power, from toy single-core CPUs and GPUs that barely moved pixels on screen, to proper multi-core CPUs and GPUs that would rival your laptop if only they had more thermal headroom.
Mobile phones still come with pretty limited operating systems though. I can't seriously shift my local home server [running on a beat-up old 4GB RAM Intel computer, with various kinds of heavy duty services] to my 8GB mid-upper range Android phone, even with Termux.
Also, I can't turn my stock mobile into a good desktop-like environment, given I have a big screen. I work with LLVM and Haskell, with Emacs or VSCode [lately], with some Blender3D mixed in.
My phone was not built for this. It is a purely consumer device.
The best/worst example of buyer's remorse I've ever had was saving up all summer to buy a Voodoo 3, having to make a deal with my parents because I still didn't have enough, only to realize entirely too late that by the time I had gotten one it was already outdated.
Honestly you could go all the way back to 2010 or so. Zen, Zen+, and especially Zen 2 are the most exciting things to happen since Sandy Bridge.
Intel fucked up big time: not only did they let Moore's law die, but they repeatedly suppressed one of the only companies capable of carrying the torch further.
Edit: before anyone even thinks of trying to reply defending Intel, I suggest you look into the increasingly common reports of internal hubris being the core cause of many of their failures. They failed to realize this themselves until they tried pitching customers features that competitors had already sold and were shipping!!
Nothing. Stuff moved fast in a PC market that was driven by the transition of PCs from mere toys to serious devices.
These Voodoo boards were a mere joke in comparison to what SGI could provide, so the PC market was catching up, hence it moved fast along a well-trodden path.
I mean, all these Voodoo and Nvidia guys had already worked for SGI or similar companies and just brought to the PC market the technologies which had been developed over the previous decade.
Having used SGIs in the 1990s, I can say they didn't have any power at all until you spent many, many, many thousands of dollars.
The ones that were impressive were the ones that might have been $20k+.
At the time the Voodoo came out the run-of-the-mill SGI workstations did not come with any of the advanced 3D graphics capabilities. Certainly nothing that blew you away. They did stuff like wireframe or unshaded 3D CAD and stuff like that, but they had working OpenGL well before a PC did.
But the stuff that blew you away was stuff like the Onyx. Those blew me away when I saw them in the mid 1990s when SGI came to recruit where I went to school. They were $100k+ a year or two before the Voodoo came out.
There were other things they did that were way ahead of their time. I had an Indy for my workstation in 1996 at an internship. It had a webcam that could shoot stills or videos and do some basic video chat. In 1996. People used to fill up the hard drives with video cause hard drives were still so small. Irix had a really nice X Windows UI in the mid 1990s when Linux and the other commercial Unix variants had terrible UIs.
SGI was good for stuff it was made for, which was visualization and CAD. Which, in that time, meant no texturing - that one meant either an expensive hardware option, or software fallback.
Speed-wise, 3dfx decimated all but the high-end boards at the time. GLINT and Intergraph's low-end cards were still $3000-$5000
Those were 'real, workstation-level cards' with accurate rendering and all the OpenGL features.
On the highest end were the Intergraph RealiZm cards which were 2x the price. Then you had the SGI systems that started out at 2x the price of a loaded Intergraph workstation...
At least the newest better 4-core Macbook Pro 13" are now beating the 2015 15" MBP in raw CPU power. I've been looking at (only) CPU benchmarks in recent years and to me it seems Intel was pretty much sleeping from around 2013-2017... maybe it was their internal problems with getting to 10nm and also the lack of competition!
I think Intel could have stepped up their number of cores in low-thermal CPUs a lot sooner. The last radical step up from Intel was 2013 with Haswell CPUs and then after that 2018 with the lower-TDP 4-core CPU which we see now on the Macbook Pro 13".
I just got a 2 core Macbook Pro 13, because that's the only fast mac you can get today without a touchbar. I'm just not sufficiently pragmatic to spend any money on one of those horrible touch screen keyboards.
> What’s happened since 2015, by contrast? Many people still swear by their 2015 MBPs, the last model before the touchbar debacle. That Haswell based machine was getting long in the tooth even back then.
Moore's law broke. GHz and those low hanging fruits are gone. I'm using a 2013 MBP. It's a core race now but lots of legacy software isn't coded for multicore.
Moore's law still applies if you plot computing power/dollar over time (normalized to purchase power). It's just that all the progress is nowadays in parallel computing chips/GPUs.
Here's the progress over 120 years, and spoiler ... 17 orders of magnitude in computing power:
Yes for sure.. I had a 75MHz machine in 1995-1996, got a 200MHz machine in 1997, and then built a 1GHz machine in 1999.
It was an absolutely wild ride! Each machine back then was dramatically dramatically faster than the previous one. It wasn't like today where you'd need a stopwatch to tell. You went from being unable to do something to it being possible.
By 2000 we had most of what we have today other than tons of video available everywhere. In 1990 we were still using DOS and character cell graphics most places. No CD-ROM, 10MHz machines, no connectivity other than modems. Lots of people still had monochrome displays in 1990, memory was 640k/1MB/2MB in the early 1990s, hard drives were < 100MB. By 2000 we had 1GHz machines, DVD drives, 1GB of memory, drives were getting into the 100GB range, etc.. 3D graphics was a thing, good video capabilities, high res desktops.
The obvious comparison to the GPU arms race is the VR hardware arms race. Same number of companies rising and falling in about the same period. Carmack is at the forefront of both, too.
You cannot be serious. VR headsets have not been evolving at the pace of hardware back then. The resolution of headsets does not double every year or something and VR today is not much better than one year ago. Nothing comparable.
Linear resolution maybe not, but pixel count is roughly doubling. Vive and Rift are both ~1k x 1k pixels per eye, first gen Windows Mixed Reality headsets came out a year or two later and are 1440 x 1600, and the just-announced HP Reverb is ~2k x 2k. The teased Valve Index is likely to be 2k x 2k as well, with a field-of-view bump from 110° to 135°.
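A quick per-eye pixel-count check of the "roughly doubling" claim, using the approximate resolutions quoted above (the real panels differ slightly; these are just the rough "~1k x 1k" and "~2k x 2k" figures from the comment):

    #include <stdio.h>

    int main(void) {
        /* Approximate per-eye resolutions as quoted above */
        struct { const char *hmd; int w, h; } gens[] = {
            { "Vive / Rift (~1k x 1k)", 1000, 1000 },
            { "First-gen WMR",          1440, 1600 },
            { "HP Reverb (~2k x 2k)",   2048, 2048 },
        };
        for (int i = 0; i < 3; i++)
            printf("%-24s %.1f MPixels per eye\n", gens[i].hmd,
                   gens[i].w * (double)gens[i].h / 1e6);
        return 0;   /* prints ~1.0, ~2.3, ~4.2 -- roughly doubling per step */
    }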
They talk about a 4 year timespan. The Oculus Kickstarter was in 2012. DK1 was released in 2013, DK2 in 2014, and the final Rift in 2016. The Rift-S is rumored to be released in 2019. None of those have had quite the leaps 3D did back in the late 90s.
For contrast, id went from Doom to Quake III Arena in six years (Dec 1993 -> Dec 1999). Thanks to the rapid pace of graphics card improvement and the engineers writing the software engines to keep up with it (Carmack, Sweeney, etc).
VR has been a wet fart compared to 3d graphics. Tons of hype but very little progress and almost no broader impact on the technology or games industry.
Not sure why you've been so heavily downvoted, perhaps there are a lot of people here working in the field. It definitely seems the case that the average consumer couldn't care less about it, however.
I don't think VR utopia is going to start anytime soon (that will be a slow generational shift), but VR already has sparked the interest of a lot of consumers, who are holding off for now.
I wouldn't be surprised, though, if one of the next headsets, e.g. the Valve Index, if priced reasonably, suddenly showed unexpectedly high demand and sold out. There are tons of people who want to try it, even if it's just a gimmick for now. We've seen things like the Nintendo Wii selling tens of millions of units based on a gimmick.
At that point the average consumer might become more aware of VR, and things might start picking up on the software side.
VR is a high-friction, high-opportunity-cost setup, not a casual pick-up-and-go interaction the way TV is.
The biggest issue with VR is that it's by nature not a passive/low-energy thing in its default mode; it's way closer to sports and work than TV, so it's more niche as a recreational activity.
In essence VR competes with activities like soccer, badminton, shooting ranges, lasertag, trampoline parks etc., not general purpose home use TV. If you benchmark it against those, VR on its current trajectory has a place in garages and arcades.
If VR/MR/AR gets away from being marketed as a high-energy activity there is a future there, if a way is found to make headsets far less obtrusive or significantly lighter and easier to handle, at a resolution better than a TV screen at the equivalent distance. Powerful wireless (standalone) headsets and couch-compatible VR might take off.
In the Wolfenstein 3D black book, written by the author of this article actually, there's even mention of a company trying to do this with Wolfenstein 3D.
The problem is that VR performance is tightly coupled to GPU performance. Currently, adding more pixels requires an equivalent increase in GPU performance to maintain quality.
I'm prepared to believe that e.g. foveated rendering will give step-wise improvements without more GPU power, but it's not the same as the years of continual process improvements were for GPUs.
VR adoption was slowed down quite significantly by the artificial price inflation from the cryptomining boom happening in the same period over the last 3-4 years. Before that, high-mid to high end GPUs were in a range of USD 200 to 500; in between, they tripled in price and are now slowly coming back down. When foveated rendering and the equivalent of 2x4k at 90FPS+ are doable significantly below USD 500, I'm sure things will look different quite soon again.
Before that, you could double GPU performance for the same price every two years or so; now it's about 5 or more years, when looking at the GTX 970 vs RTX 2070 for example. In my opinion GPU speeds are at least three years behind their trajectory because of this market anomaly. Just hope AI isn't throwing stupid money at GPUs again.
My recollection as a nerdy teenager who had a Voodoo 3 at the time. The Voodoo 4 and Voodoo 5 series were highly hyped and anticipated and after the success of 1, 2 and 3 this kinda made sense. They were quite delayed however and before they made it to market they were slightly blindsided by NVidia's release of the Geforce 256, which was (IIRC) the first graphics card with its own onboard transform & lighting support, which when supported gave an insane performance bump. In the end the Voodoo 4/5s were released late, board + driver quality was supposedly not great and the range of cards was far less than originally promised (only the low end Voodoo 4 4500 and upper-middle Voodoo 5 5500 were released; the lower-mid range Voodoo5 5000 and ambitious top-end 6000 never saw the light of day). Performance wasn't great either, it was generally slower than the G256 even in games that didn't utilise the G256's T&L unit. Then shortly after NVidia released the Geforce 2 series (I ended up getting a Geforce 2 MX, which was insane for the price) which was so far ahead of 3dfx's offerings it wasn't even funny.
Hope I remembered this right, but I really do remember that Geforce 256 being so much better it was like night and day. I stopped playing PC games when I went to university around the Geforce 3 era, so that's when my knowledge of the topic drops off a cliff :D
edit: and now I reached the end of the article it seems Fabien has said exactly this! Note to self: read first then comment
Didn't coordinate with game developers, and tried to pitch a fixed function effects pipeline (the T buffer) when shaders were coming into existence.
Ironically, we've now looped back, and do pretty t-buffer-esque usage with modern DX11/DX12 pipelines.
One of the simplest functions of the T-buffer was to do temporal AA using a fixed function supersampler but also do integration over several frames, which didn't come into being in a modern AAA title until Doom 2016.
For a technology invented in 1999, 3dfx was too far ahead of its time.
For a modern example, imagine if the RTX debacle of current-gen GeForces destroyed the entire company. Nvidia backed down and released the 16xx series cards; 3dfx went bankrupt instead, while everyone else was releasing the 16xx equivalent of that time period (early Nvidia and ATI cards).
They tried to sell functionality that isn't good at raytracing for raytracing, and hardware that has a 4x4 matrix ALU as a tensor unit for AI, but it isn't big/powerful enough for existing AI frameworks to take advantage of in normal usage (nor should a desktop-oriented card have such a thing).
... and then they enabled their raytracing API on GTX 1000 series cards, after repeatedly telling them it requires the hardware acceleration that RTX 2000 series cards have. Not only that, it didn't perform all that badly.
So yeah, that new Radeon series is looking mighty nice right now.
I've a CPU from 2013 - a Haswell, and I was going to upgrade last year until I realized the performance increase wouldn't justify the price at all. Thinking back to a decade earlier and you'd wait 2 years to upgrade your CPU and it'd be a significant increase. Weird times.
It wasn't until consumer 8 core / 16 thread chips started entering mainstream that I even started thinking about upgrading.
And even then it's generally meh with how much software is single-thread bottlenecked. Meltdown and Spectre didn't help things - after they were published I definitely intended to wait for architectures mitigating them before upgrading again.
Here's hoping AMD can keep pushing the envelope this year. A consumer 10+ core chip would definitely have my interest. Not so much to necessarily buy myself, but to drop the floor of the high end even further.
Yeah, I'm holding out for AMD to do something big. At the moment I don't do anything that justifies a ton of cores - my home PC is a gaming machine, and that still likes a couple of fast cores more than a bunch of smaller ones.
Hardware is no longer my limiting factor. Software is. There's nothing I want to do that a faster CPU or GPU would help me do. (Well, there's a couple, but software is still a much bigger stumbling block for me than hardware even in those cases.)
Look at the PL progress in that Voodoo 1-5 timeframe. We started with C/C++, and ended up with C/C++ and a little bit of Java starting to creep in. That doesn't seem terribly impressive to me.
Today, if I told you I was writing a new program (and I am!), there's 10 different languages you might reasonably guess that I'd have picked. Life's way better than when everybody was constantly transistor-constrained.
Intel monopolized the market by breaking laws (paid billions in fines). The lack of progress is the consequence.
The progress only slowed down for x86. For the rest of the chips it has not: GPUs, mobile SoCs, they are all becoming significantly faster with each generation.
Fortunately, with AMD Ryzen the stagnation is ending. Core count in mainstream chips has already doubled, after stagnating for a decade.
Blast from the past. I have extremely fond memories of moving into my first apartment in 1999 and having friends come over to play Quake on the “voodoo box”. I had three PCs, a 10BASE-T LAN, and one machine that had been upgraded to a 3dfx card. We would play rock, paper, scissors to determine who got to play on the 3dfx box and spend many an all nighter playing till the sun came up. Those were the days...
I still have my original Voodoo card. It's the one piece of computer hardware I've never been willing to throw away. I still remember spending 8 hours downloading the Quake demo and having my mind blown. The only other time that holds a candle to that is when I got the GLQuake version. No other game has ever taken me on a ride like that.
This brings back memories. I was working for a tiny company in Taiwan at that time. We had been producing VGA cards for a while. An IC distributor brought us a demo board of a Voodoo 3D (V1) card. I ran the demo and told my boss that it was the best 3D graphics I had ever seen. My boss just decided to run with it. Back in those days, the standard practice for VGA card vendors was to modify the reference design in two ways: 1. Modify the layout so it can accommodate many different memory chips. If your board can use cheaper memory chips, you will have better profit (often meaning chips which did not meet standard memory testing, but were still usable with some effort in layout and hardware design). 2. Modify the driver so you have a better benchmark score.
Since we were a tiny company without many resources, we decided to just manufacture the reference design. This decision enabled us to be the first one in Taiwan to ship the product. I cold-called 50 companies in Europe. I did not have much success at first since distributors in Europe were not convinced that a 3D card could sell. The internet was slow back then, we were still using dialup modems, and sending a video capture was not an option. I finally got a break when I called the 42nd company on the list: Guillemot (France). Guillemot got their start in PC gaming sound cards so they were already interested in a 3D card. Guillemot had been talking to Orchid Technology but they needed a lower price than what Orchid could offer. Since all other Taiwanese makers were still evaluating or in the development process, we got the business because our price was US$50 lower than Orchid's and we were able to ship right away.
Being 14 years old and getting to experience and see it evolve first-hand with a Commodore (at home), the Apple II (at school) and 286, 386, 486 and Pentium era with a Diamond Multimedia 3dfx Voodoo card was amazing. People who missed it either being busy with other things in life or weren't born yet, believe me, as a technologist being there from the beginning of the home computer market till about 2000 was amazing.
One thing I'm very thankful for in life. It hasn't been the same since around the turn of the century, the magic and mystery is not like it was with PCs or game consoles in decades prior. The advancements were just leaps and bounds every few years.
It sparked your imagination more than things today, because creativity for some reason diminished once the technical limitations were gone. Today, within any reasonable definition, an artist's vision can be fulfilled. There's really nothing left for the end user to imagine or fill in the gaps (think Zork).
Not to be crass but this is a relatable example for many I'm sure- it's no different than finding a Playboy magazine back then, as opposed to extreme, explicit hardcore websites today. There's no mystery at all there, and it's not really an upgrade from your imagination being used at least a little bit.
I've actually rediscovered books because most media today (as in movies and sitcoms, not THAT sort of media) is so poor quality. It's really all about the writing, and I struggle to find games and films that are at the 20th century or prior quality level. Books can be exquisite entertainment, and leave plenty for your imagination to run free with. Which for me, is what it's all about. That's the joyful part to entertainment, or at least a part of it that I find critical.
It is because you experienced these things during your formative years that they have such a unique luster, and that nothing seems to quite compare.
People who are in their formative years today will look back upon the current times with similar enthusiasm; just like your elders a few decades ago found that your Playboy magazines and 3D video game cards were pointless debauchery.
So it has been for the billions of humans who have lived before us, and so will it be for the billions who come after us.
People in their formative years see technology so clearly. Each generation invents something that changes things in ways people wish they could relive. Past the change, you can’t see things the same.
I've definitely considered that thought over the years, again and again. I do think there's truth to it and it plays some role here, but overall I would respectfully disagree with it as a flat-out conclusion if used as a dismissal in this case. It's worth expanding on: you're not just wrong outright, since it's impossible to objectively state that nostalgia isn't a factor, but I truly do think I'm onto something here, having witnessed the home PC market first-hand from the beginning. It really was special because of the intimate nature of it, the "home PC" market.
The reasoning that immediately comes to mind is two-fold.
The first is the obvious point I made, where from the very start I place books as the supreme medium, and they are far before my time. The written word can be information dense, puts your mind and imagination to work and, if in a book, never needs recharging. :) It's just underrated, underappreciated. Books are "inferior technology", which to me is the ideal abstraction layer for our species.

I think we'll continue to see back-to-basics as a movement as the corrosive societal effects from "social media" play out. We need real community, the kind humans evolved for, not that marketing nonsense chewing people's brains up and spitting them out left with nothing but a capacity for short-term attention spans. Most people don't need help with that. Facebook is the new smoking.

Things don't always get better with every generation for billions of years, nor do most witness great change. You're assuming that because the last 300 years have been relatively action-packed, mostly thanks to Europe's astonishing leap forward around 1600AD and out into the world. Things are objectively worse today than in the 20th century; everyone knows it or can definitely feel it. The deck is stacked against a young person, with opportunities slowly dwindling. That trend may continue if we don't solve capitalism collapsing on itself in the western world, discover new antibiotics, among numerous other very serious challenges that aren't being effectively addressed. Our big, impressive move lately (speaking for the US) is simply cutting taxes when already at historically low rates. Brilliance.
The second thought I had is that the home PC space undeniably hit its stride from ~1980-2000. That's just when that market had its golden age. Combine it with the arcade experience of the day, and you had a sensory experience that really isn't even widely available today anymore. You can't really explain it to someone who didn't see it. It's not just generational placebo. It's like the circus. They hardly exist now, but they were worth the trip and I regret that many kids may never go to one. Someone who did miss the way it was before would likely insist I'm just a fool, but Netflix and Youtube are not a fair replacement for these things. They're just not.
You have to go hunting for an arcade today, and I'm not even sure if there are any truly modern ones around. A kid just won't get that sensory experience, which isn't just technological but includes the social element of all the kids being there too. That goes without mentioning the loss of comic book stores that kids rode their bikes to(!), toy stores, candy stores, Saturday morning cartoons and all the waiting, anticipation and excitement attached. Just nostalgia? Or is it real. The examples sound real.
Back to addressing your point: certainly the cotton gin, the steam engine, home electricity (which my grandma told me about when they were the first house in town to have it installed because her dad was so enthusiastic about it), among others, were more monumental on a macro level than seeing the home PC space explode from 1978 to 2000. Yet I have my doubts that an old timer was passionately reminiscing about "seeing the cotton gin come to be", and how amazing it was. It heralded great change to society, but I'm not sure people were living their lives in a way at that point to personally experience rapid iteration the way people witnessed it in the home PC market, a market specifically targeting them/us. It really is different today than it was then; it has normalized, there's less excitement for sure, and it's far more difficult to be impressed with advancements.
We're at the point now where we need to go back to step one, bring back creativity. That's why I circled back to books being the ultimate medium. What good is an 8K TV if there's nothing good to watch? Zero. I'm back to hunting for good books instead. Choose Your Own Adventure was better than what Hollywood is putting out today. Writers often do the best job at expressing thought-provoking creativity, and without that human spark of creativity injected into your technological medium, it would all be pointless.
From our human perspective at least, once they're unleashed, our AI overlords won't care.
> Things are objectively worse today than in the 20th century, everyone knows it or can definitely feel it.
That may be true in the US (although of course it's debatable). On the global scale however, things have improved tremendously. For example, my region, Central and Eastern Europe, has been freed from communism and flourished for the most part after 1989. But that's small change compared to the vast improvements in living standards of hundreds of millions in Asia (China, India, other countries). So I'd say the boom is far from over, it's just getting more evenly distributed now, as opposed to most of the 20th century, where the US was in a unique position to get so much richer than almost everybody else.
Indeed. My point of view is fixated on a working-class perspective from the USA. I wouldn't say things here feel like hell, it's just more effort, and people aren't used to studying or instilling discipline in their children so they can get up to par. There are no more jobs where you have no advanced education or specific skillset, where you walk in and walk out buying a house, a motorcycle and 2 cars. People aren't used to not having that available. I've spent most of my career trying to be skills-based to not fall out of the middle class here. It's no longer easy, and it was felt after 9/11 and definitely after 2008.
I certainly dislike the shift of money out of the US (and western Europe, which is really in the same boat as we are), but tough to wish poverty, dismay or anything of the sort on the rest of the world.
In my career so far, I've worked with many people from Bulgaria, along with India. Really enjoyed the experience to be honest. I'm a best-person-for-the-job sort of guy (a concept that is actually falling by the wayside in the increasingly job-strapped USA in favor of nepotism and cronyism), and don't fret over macro-economic issues that I don't control. Just do my best to survive in the environment I was born in like everyone else.
I wouldn't say everyone is getting a great deal out of the US's economic system eating its own people, plenty of the world is being ignored outright. Eastern Europe and Asia are definitely some hotspots gaining though. I think global scale might be a stretch. But I could be wrong, just thinking about the parts of the world where capitalists aren't investing as much.
I love talking to folks from Bulgaria and such about their lives from the Soviet era till today. One told me that "we didn't have much, but you didn't care or notice because everyone else was the same". Lack of envy etc; that individual was actually admiring his previous life in ways, which I understand given how miserable so many people appear today..
Very interesting perspectives on life and the world, to me at least. I also enjoy teams with folks from India, usually upbeat and smiling and that's (almost) all I ask for. I also suspect working with developers in Latin America would be a similar experience. The culture differences get a little abrasive though once you move out of India into the rest of Asia.
> Not to be crass but this is a relatable example for many I'm sure- it's no different than finding a Playboy magazine back then, as opposed to extreme, explicit hardcore websites today.
Never mind the pictorials, the depressing thing about finding a Playboy issue from the 1960s era is the sheer literacy of the thing. You really could read it for the articles back then.
I was working on 3D rendering and games around this time - pretty much all the PC cards were burdened with terrible CPU->GPU interfaces. The handshaking, setup calculations and register wrangling were such that below a certain triangle size, it would have been quicker for the CPU to just draw the pixels itself. Some cards (Um, Oak?) required per-scanline setup.
I got one of these cards - confirmed it was indeed hella fast (even for large meshes of small triangles), and then dropped into SoftICE a few times, winding up at this code:
My thoughts were - "Wow, somebody gets it!" - Very tight triangle setup, and a simple PCI register layout that means that the setup code just writes the gradients out to a block - the last write pushes the triangle into the huge on-card FIFO.
That performance, along with the simplicity of Glide, made it a no-brainer to drop all other card support and focus on that exclusively.
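For anyone who never programmed against hardware like this, here's a minimal sketch of the kind of setup path being described: compute the per-triangle gradients on the CPU, burst-write them into a memory-mapped register block, and let the final write push the triangle into the card's FIFO. The register names and launch-on-last-write layout are purely illustrative assumptions, not the actual Voodoo register map.

    #include <stdint.h>

    /* Hypothetical memory-mapped triangle-setup block; the real Voodoo layout
     * differs, this only illustrates the "write gradients, last write launches"
     * style described above. */
    struct tri_setup_regs {
        volatile uint32_t start_x, start_y;        /* top vertex, fixed point    */
        volatile uint32_t dxdy_left, dxdy_right;   /* edge slopes                */
        volatile uint32_t color_start, dcolor_dx, dcolor_dy;
        volatile uint32_t z_start, dz_dx, dz_dy;
        volatile uint32_t launch;                  /* writing this queues the
                                                      triangle into the FIFO    */
    };

    static void submit_triangle(struct tri_setup_regs *regs, const uint32_t grad[10])
    {
        /* The CPU's whole job is a handful of gradient calculations followed by
         * a short burst of PCI writes -- no per-scanline or per-pixel work. */
        regs->start_x     = grad[0];
        regs->start_y     = grad[1];
        regs->dxdy_left   = grad[2];
        regs->dxdy_right  = grad[3];
        regs->color_start = grad[4];
        regs->dcolor_dx   = grad[5];
        regs->dcolor_dy   = grad[6];
        regs->z_start     = grad[7];
        regs->dz_dx       = grad[8];
        regs->dz_dy       = grad[9];
        regs->launch      = 1;                     /* last write: triangle queued */
    }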
When I first saw 3dfx graphics, I knew it was the dawn of a new era.
I bought thousands of dollars in stock in the company. I lost it all because unfortunately nVidia ate their lunch by marketing themselves better. They had full 32-bit color vs 3dfx, who had the superior technology but only had 16 or 24 bit color. 3dfx spent a lot of time trying to explain why it didn't matter, but in the end it did. It mattered to the gamers at the time, and they basically died and I lost a huge amount of money that took years to pay off. It took me a long time to move over to nVidia because of my hurt ego, but they were the superior technology in the long run.
32-bit colour is still just 24-bit colour with 8 bits of stencil or whatever (or at least was back when 3dfx was relevant). The only company I know of that was doing higher bit depths was Matrox who had 10-bit colour iirc.
I remember back in the day "serious gamers would never use a combined 2D/3D card". Then nVidia came out with their Riva chipset and suddenly 2D graphics cards weren't really a thing any more.
I had a Riva 128, what a piece of crap it was... I used to assemble gaming boxes for friends and would delay delivery a day or two to take a ride on the dual Voodoo setups.
Edit: well, according to Wikipedia it was somewhat competitive, although drivers and support for rendering stacks were erratic for a while. I guess that's what soured me on it, and I was probably too n00b to have a clue.
Huh... We had a Diamond Viper V330 or something and it was rock solid. Worked fine on everything I threw at it, seemed comparable with a Voodoo 2 in terms of speed/quality. Maybe it was the card vendor rather than the chipset?
It wasn't 32-bit color that made Nvidia emerge the winner. It was a relentless push forward and an army of EEs working around the clock. By all accounts Nvidia's first product should have been their last: supporting the wrong paradigm, wrong specs, total fail. Yet somehow VCs allowed them a second try. After that it was just a straight dog race.
Just think about it: 6 months between GPU generations. Nvidia had an army working 24/7 while 3dfx had a couple of ASIC people; 3dfx was murdered in their sleep.
I remember buying my first 3D 'accelerator card' as they were called back then. It was a Voodoo Banshee card. The Banshee had an onboard 2D video chip, so it didn't have the VGA passthrough cable.
I bought the card at a trade show (the Dutch audience here will remember the 'HCC dagen'). That's where you could buy them cheap. Not sure if it was actually cheaper, internet wasn't very useful back then, so there was no easy way to compare prices.
I didn't have a computer of my own yet (I must have been 14 years old or so), so I bought it for our 'family' computer, an IBM P166. I remember getting up super early to put the card in before my dad would wake up. He would certainly have freaked out if he saw that I opened up the expensive computer to put in some gaming thing.
Remember clearly buying my Voodoo Banshee from Babbage's for $250 USD. My high school girlfriend at the time was so annoyed that I would spend so much on something so strange at the time.
GLQuake blew my mind, but I was just trying to get Daggerfall to run acceptably more than anything at that point, and run it did.
Oh, memories. We built a custom arcade board based on SLI Voodoo1, a MIPS R5K, and custom JPEG decompression in hardware to do Magic the Gathering: Armageddon. https://youtu.be/cci5l21aMss We wanted to run at high rez (640x480 VGA) at 60Hz with tons of animating sprites on screen and realized nothing had the fill rate we needed. Rather than changing the game design, we kept looking and met the 3Dfx team when they had barely taped out the Voodoo1. We gambled that SLI would work and started designing the hardware and adapting the game to an early alpha of Glide. We prototyped on SLI cards running on Windows NT and Pentium Pros. In parallel, Atari Games built Voodoo1-based hardware to replace Zoid. Mace and War shipped on that hardware.
My claim to fame is that the only Voodoo card I've ever owned was a Voodoo 5 6000.
I picked it up in the early 2000s during a pre-closure clearout at my then-employer, a UK video games developer. The sheer size of the thing made me LOL, that and the number of fans and the additional power connector. Then I noticed it was a 3dfx - oh, hey! I could play that Glide-based motorbike game, that I remembered enjoying at a friend's house a few years previously.
The game wasn't as good as I remembered. I threw the card away.
I picked up my Voodoo 2 in 2005 when I found it in an old computer I was scavenging for pieces with other high school students.
I was young, but I immediately recognized it and traced it back to the computer magazines (from 97/98) of my older sister that I used to devour. No other student realized what it was. They were puzzled by the double VGA connector :)
I took it home, and installed it in my Athlon XP 2600+ system alongside the ATI 9800 Pro, to finally try the Glide games that I never had the opportunity to play years before... The first time I saw the full-screen 3dfx logo, it felt amazing.
Not only do I still have it, but I have collected more for free over the years, and I'm still amazed that something I would drool over a few years back became trash-worthy. It's either a reminder that good things happen to those who wait... or a memento mori.
My flatmate owned a Voodoo2, which was impressive, though the market soon caught up, and when the following year I did get myself a 3500, after a month I sold it and got a Matrox G400Max as it did the level of 3D I needed (more so for the less demanding games of the time). But more so, the colour gamut was so much sharper and stood out on a CRT of the time.
I was curious so I had a dig through the specs to relive the decision of the time, and can see the Matrox did 32-bit colour whilst the 3500 was 24-bit. I've not seen any comparisons in performance but I certainly had no complaints and was happier with the G400Max on many levels (2nd monitor - no problem).
[EDIT ADD] This looks worth a watch for nostalgia circa 1999 graphics cards and compares the G400MAX, 3DFX 3500 and the TNT2 Ultra https://www.youtube.com/watch?v=-4LvoGQ2lgI
24bit and 32bit both have 8bit per RGB color channel, the extra 8 bit are either alpha (for textures) or padding (for performance). All other things being equal, colors should look the same on both.
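In other words, the per-channel precision is the same; a minimal sketch of the two layouts (the packing order is an illustrative assumption, real hardware varied):

    #include <stdint.h>
    #include <stdio.h>

    /* "32-bit color": 8 bits each of R, G, B plus 8 bits of alpha or padding. */
    static uint32_t pack_xrgb8888(uint8_t r, uint8_t g, uint8_t b) {
        return ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;   /* top byte unused */
    }

    /* "24-bit color": the same 8 bits per channel, stored without the pad byte. */
    static void pack_rgb888(uint8_t out[3], uint8_t r, uint8_t g, uint8_t b) {
        out[0] = r; out[1] = g; out[2] = b;
    }

    int main(void) {
        uint8_t rgb[3];
        pack_rgb888(rgb, 200, 100, 50);
        printf("32-bit: 0x%08X   24-bit: %02X %02X %02X\n",
               pack_xrgb8888(200, 100, 50), rgb[0], rgb[1], rgb[2]);
        return 0;   /* identical channel values either way */
    }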
Yes, but keep in mind that monitors of the time were analog devices. Matrox was well known for higher quality at the time. It was visible even just with black text on a white background. Basically their implementation of moving from pixels to the analog signal was better. Less artifacts, less shadows, better colors, better frame to frame stability, etc.
If I remember correctly Matrox cards had the best video output quality through D-Sub at that time, which might be what your post's parent was referring to.
Yes, I was trying to quantify it from experience over just saying it was the best. They sure were and, some would say, still have an edge, though some still prefer CRTs as well.
I graduated from high school in 1999 and worked at a big box computer store during the spring and summer. I remember the demo machines with the Voodoo3 3500 and TNT2 Ultras hitting the floor, all of us employees crowding around before the store opened to watch the demos, and the endless hours spent arguing about which was better.
I saved up money to build my first computer that summer and chose a Creative 3D Blaster TNT2 Ultra, paired with an Abit BX6 motherboard, a Pentium 3 450 CPU, 256MB of RAM, a white box Sound Blaster Live, a 3Com Etherlink XL, a US Robotics 56K modem, generic 40x CDROM drive plus a CDRW drive, and I want to say an 8GB Western Digital hard drive, connected to a 19" monitor that was so big I had to pull my desk away from the wall to fit my Model M keyboard. It was pretty much perfect, except for the fact that technology was improving so quickly then that I started feeling the need to upgrade it in 6 months.
That fall I went to college and hooked it up to the university's broadband connection. I probably got my money's worth in all the hours I spent on that machine. It definitely gave me a sense of wonder and power to have something top of the line. I spent countless hours playing Half Life multiplayer, Team Fortress, and then Counter-Strike beta. I don't know if I'd do it again, but I had a great time back then.
Such nostalgia! As my first job out of college, I worked at Matrox at the time helping to write the 3D OpenGL drivers. I pretty much joined the company because I wanted to make Quake run faster. We played Quake just about every night there in order to "beta test". Good times!
Nice article. I would be interested in learning more about the rendering pipeline. Is anybody aware of a more detailed description? Or an online datasheet?
Btw - EDO DRAM already favours reading entire memory blocks/lines. My suspicion is that the trick in increasing memory bandwidth is not only based on interleaving, but also on block transfers with on-chip caching. Especially critical for texture reads.
The Voodoo is a nice example of how much execution matters. They were not the only ones to follow this path back then, but by only concentrating on the core functionality they managed to beat all others to the market without compromise. (Compare to S3 Virge, Matrox Mystique, Nvidia NV1, Tseng, NEC and many others)
Loving those early 3d hardware retrospections. Some corrections:
>they started they own company
typo
>EOM'es only leverage on the cards they produced was the RAM they selected (EDO vs DRAM), the color of the resin and the physical layout of the chips. Pretty much everything else was standardized.
- EDO is DRAM, EDO vs FPM? I didn't know that, always assumed every V1 shipped with EDO, just like every card on your pictures is EDO.
- video signal switching was also up to the vendor (relay/mux)
- and one of the cards has TV encoder section with TV out, pretty neat selling point
>It is not specified if the bus used address multiplexing or if the data and address lines were shared. Drawing it non-multiplex and non-shared makes things easier to understand.
You are addressing 512 kB, but the datapath is 16 bit/2 Byte wide, so we only need an 18-bit address bus. As for multiplexing, that's not how DRAM addressing works IRL (understandable misconception/simplification for non-EEs). Row/Column address lines are multiplexed, meaning we are down to 9 address pins +OE/WE. Comes down to ~110 pins assuming full 4-way interleaving. Seems doable with >200-pin ASICs.
>21-bit address generates two 20-bit where the least significant bit is discarded to read/write two consecutive pixels.
still too many bits: 2MB at 4-byte granularity is 19 bits overall, 18 per 1MB bank
>TMU was able to perform per-fragment w-divide
This was a HUGE deal at the time, and achieved by doing serious low level optimizations/tricks (lookup tables/approximation if I remember the oral history panel correctly). 3dfx engineers were big fans of good-enough hacks vs the slow but correct way of doing things. Another one was color dithering; too bad you didn't mention "24-bit color dithering to native 16-bit RGB buffer using 4x4 or 2x2 ordered dither matrix" - this is the reason straight RAM dump screenshots from a Voodoo1 don't really look the same as on a directly connected monitor. 3dfx called it ~22-bit color, and it was noticeably better than Nvidia's pure 16-bit.
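For readers who haven't seen ordered dithering before, here's a minimal sketch of the general technique being described, taking 8-bit channels down to RGB565 with a classic 4x4 Bayer matrix. The exact matrix, scaling and output filtering 3dfx used are not shown here and may well have differed.

    #include <stdint.h>

    /* Classic 4x4 Bayer threshold matrix, values 0..15. */
    static const uint8_t bayer4[4][4] = {
        {  0,  8,  2, 10 },
        { 12,  4, 14,  6 },
        {  3, 11,  1,  9 },
        { 15,  7, 13,  5 },
    };

    /* Dither one 8-bit channel down to 'bits' bits at screen position (x, y). */
    static uint8_t dither_channel(uint8_t v, int bits, int x, int y)
    {
        int drop  = 8 - bits;                              /* bits being thrown away       */
        int noise = (bayer4[y & 3][x & 3] << drop) >> 4;   /* threshold scaled to one step */
        int out   = (v + noise) >> drop;                   /* add noise, then truncate     */
        int max   = (1 << bits) - 1;
        return (uint8_t)(out > max ? max : out);
    }

    /* Pack a dithered 24-bit RGB value into a 16-bit RGB565 pixel. */
    static uint16_t dither_rgb565(uint8_t r, uint8_t g, uint8_t b, int x, int y)
    {
        return (uint16_t)((dither_channel(r, 5, x, y) << 11) |
                          (dither_channel(g, 6, x, y) <<  5) |
                           dither_channel(b, 5, x, y));
    }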
Btw afaik Quake pushed somewhere between 500-1000 polygons per frame, earlier games like Actua Soccer rarely went up to 500 with fatal consequences of single digit framerate on S3 Virge. You might enjoy Profiling Of 3 Games Running On The S3 ViRGE Chip http://www-graphics.stanford.edu/~bjohanso/index-virge-study...
Playing Unreal and coming out of the spaceship at 800x600 on my Voodoo2 and 400MHz Pentium II (I was spoiled and got the 12MB version!!!) was still one of the greatest gaming moments of my life.
Unreal: Return to Na Pali and Unreal Tournament at 1024x768 on a GeForce 2 MX with 32MB DDR RAM on a K6-2 300MHz.
That GPU could even handle Doom 3 (at a barely playable frame rate) at low res with scaled-down textures.
They forgot to mention at the end of the article that 3dfx filed for bankruptcy as they literally became "drunk on their success." Everybody started showing up to work intoxicated. Some investors got left out in the cold.
The Voodoo2 was my first 3D video (addon) card. I was working tech support at the local ISP, and a failure on a circuit caused all authentication to fail for dialup customers for a few hours. They called in every off-duty support person they could get to come in and along with the scheduled staff we all (~20 of us) took 3-4 hours answering calls telling many of our 20k customers that yes, we had a major problem, no, we couldn't help them at that time, and hopefully it would be fixed within a couple hours.
To thank the support staff, the management/owner offered the choice of a Voodoo2 card or a DVD player (~$200 each IIRC) to every support rep that helped with the load that day. I ended up working for that company three separate times in different positions, leaving for college and coming back, and working for a relative's company for a while and returning again later. It's the wonderful people there and actions like that which keep that company in a special place in my heart. (For those wondering, it's Sonic.net, now Sonic.com, or just Sonic maybe. I'm not sure the official branding, and I have years of history with it being Sonic.net)
I remember the first time someone told me about a card that you could put in your computer that would make quake run faster. It sounded so strange and exotic. Then I saw the screen caps with all their anti aliased glory and was devastated that I could never afford it on a movie theater clerk's salary.
It was even worse with Unreal, especially when you saw the live intro somewhere. That money could buy this was unexpected - even after Q1 and Q2 - and amazing.
How would it not run faster, given that otherwise the CPU had to shoulder the rendering entirely, in addition to everything else? Even high-end CPUs at the time would struggle with that, especially with resolution and bit depth being equal. IIRC the original Quake didn't even allow more than 256 colors in software rendering for this reason, and even then I recall having to run it in weird resolutions like 400x300 to get framerates comparable to what Voodoo owners had at 640x480x16bit.
The Voodoo was the indisputable king of 3D accelerators back then. Not everyone was running GLQuake with a Voodoo card in their PC though.
I had an S3 Virge in my PC at the time. It was quite possible to get GLQuake running using an OpenGL-to-Direct3D wrapper, but damn it was slow. It sure looked better than regular Quake, but it wasn't really noticeably faster than software rendering.
I collect vintage electronics, mostly computers and gaming consoles. The Voodoo cards are considered quite collectable because of their GLIDE API. There is a decent stack of games that will ONLY run on Voodoo cards, or will only support Voodoo natively, which matters to a lot of collectors. Back then, a good eye could sometimes identify a card just by looking at the game. For one, for a brief period, games were only made to run on specific cards, so you could narrow it down just by what game was playing. Also, the artifacts would give them each a different look. Some were grainy, washed out, etc. The Voodoo 1 cards would stand out by an artifact that I can only describe as looking like faint artificial scan lines.
They are really special cards, and two of my vintage computers were built around a Voodoo 1 and Voodoo 2, meaning, I started with the cards, and knew I had to build a PC to run them optimally.
The Voodoo cards had that odd glossy, liquid look that made the surfaces of objects seem deeply physical. The ADCs (EDIT: RAMDACs, of course) that they were using did an odd sort of "poor man's antialiasing" and running any games from that era on modern equipment is a shocking exercise in explicitly defined pixels.
They are already collector's items? :-/ For quite some time I wanted to buy one to play around with Glide (on a real machine) but I postponed it since it also requires building a full desktop and I do not have space at my current place :-P.
Perhaps I should try and find a couple of those before they disappear.
I remember the first time I saw that you could walk right up to a wall in a game and the texture didn't become a mess of giant pixels. Sure, it was a mess of blurry pixels instead but still it seemed like (voodoo) magic to young me.
The two main issues that GLQuake has over the software renderer are that the textures are squashed down to the nearest power-of-two and lightmaps are clamped to the "0..100%" range, whereas the software rasterizer has a (sort of) "0..200%" range. You can see this in the comparisons on the site where bright areas are dulled.
Modern source ports have fixed both issues though (FWIW I recommend Quakespasm)
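A minimal sketch of the two effects described above, i.e. snapping texture dimensions to a power of two and the difference between clamping light at 100% versus allowing ~200% overbright. This is illustrative only and assumes nothing about GLQuake's actual code.

    #include <stdio.h>

    /* Round a texture dimension down to the nearest power of two, as an early GL
     * path might when the hardware only accepts power-of-two texture sizes. */
    static int pot_floor(int n) {
        int p = 1;
        while (p * 2 <= n)
            p *= 2;
        return p;
    }

    /* Apply a lightmap value to a texel channel.  A max_scale of 1.0 clamps at
     * fullbright (the behaviour described above); 2.0 allows overbright light. */
    static int light_texel(int texel, double light, double max_scale) {
        double scale = light > max_scale ? max_scale : light;
        int out = (int)(texel * scale);
        return out > 255 ? 255 : out;
    }

    int main(void) {
        printf("320x200 texture -> %dx%d\n", pot_floor(320), pot_floor(200));
        printf("texel 100 at 150%% light: clamped=%d, overbright=%d\n",
               light_texel(100, 1.5, 1.0), light_texel(100, 1.5, 2.0));
        return 0;
    }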
If you only had or experienced IBM-compatible PCs. If you started on a Commodore or Amiga, the PC speaker to Sound Blaster/Adlib shift was more akin to just catching up. I wasn't completely happy until the 486 era because of this.
The Voodoo1 4MB, which I had back then as well, was unprecedented. I agree that the Sound Blaster was great (in isolation against older x86 PCs), but 3dfx was the first standout example of the IBM-compatible space bearing that kind of fruit, and a clear demonstration in the gaming sense as to why it took over the market.
The first standout example was VGA and games that fully took advantage of it. I was an Amiga user but AGA was too little too late and architecturally a disaster for certain games (DOOM, Quake). Even games like Monkey Island and Indiana Jones were a lot better on VGA than Amiga.
The game that did it for me was Privateer vs Elite II. I had to get a PC after I saw it. It was crystal clear the Amiga could not compete.
I think the equivalent of the Voodoo in the soundcards-space would've been the Gravis Ultrasound. In both the evolutionary and soon-to-be-forgotten sense.
PC Speaker vs. Soundblaster is more EGA vs VGA, something much more profound. And beneficial to the gaming space, as opposed to early 3D, which basically ruined all the things for quite a while...
Going back to 2D was my sincerest wish back then, when they tried shoehorning everything into 3D. There was no reason back then for having e.g. jump and runs, adventures or RTS games in 3D. It worked fine enough for FPS, which is why that genre dominated so much back then.
And yeah, I knew that the GUS only enjoyed a very short-lived popularity, just like the Voodoo cards when the GeForces etc. showed up. Okay, maybe the popularity at its height wasn't quite equal, but the next 3D fads were quite a few steps down (e.g. PowerVR or the first Matrox cards).
The site used to run on PHP, but now it is generated as static HTML. I wrote my own mini framework because I did not know of any site generators at the time. I learned a lot, but overall it was time I could have invested better.
The readability part is just the CSS, based on a monospace font and justified paragraphs.
Doesn't mean they literally hand-code every HTML file on the site. That would be highly unlikely since you'd have to change every file any time you update, say, the navbar.
You should be able to do something with labels, CSS sibling selectors, and the :checked pseudo-class if it matters to you. I found an example at https://codepen.io/ancaspatariu/pen/WpQYOP which does quite a lot more than you’d need.
I bought one my junior year of college as a CS major. I'd had some good internships, so that year I had some money. Instead of a car, I built a Pentium MMX 200MHz box with a Diamond Stealth card and 16MB of RAM. Pretty hot machine among my friends at that time... when the Voodoo 1 became available I was able to get one, and its performance was mind-blowing at the time, even though I had access to SGI machines on campus that ran way more impressive demos.
My senior year of college I took an OpenGL course and did a bunch of my projects on Linux with the Voodoo 3D drivers. Cool stuff. I played a lot of Quake too, and I remember writing a program to render the Quake characters on my own as one of my projects. The model data formats were open source IIRC, so it wasn't too hard to read in the data. Very cool, since we didn't have any good 3D tools to build our own models.
I remember playing AH-64 Longbow or something on it too. Some of the flight sims were amazing at that point, right before flight-sim popularity tanked and the remaining programs got unbelievably complex.
Voodoo was kind of a pain in the neck in day-to-day usage. In 1999 I built a new machine and went to an NVIDIA Riva TNT, and then later that year got a GeForce 256 when those came out.
That was kind of the end of my heyday of PC gaming... the combination of working on computers all day plus games at the time still requiring a lot of debugging to get them working well wore me out.
I worked with all of the cards mentioned in the article. It was a pretty sweet time, where the card manufacturers would happily send out a reference card.
Some things the article did seem to miss out on:
- You could have two Voodoos in your PC for extra throughput (I can't remember the numbers). I seem to recall there was a ribbon cable between the two boards...
- The reason 3dfx ultimately failed was the hefty ongoing lawsuits with NVIDIA over IP theft and the headhunting of 3dfx staff.
During this time there was a mailing list (I can't remember its name) where a lot of game devs hung out, mainly around DirectX (v1 onwards), though the list had existed well before that. All the card manufacturers I can recall were on it.
One day John Carmack posted a comment (I'm paraphrasing somewhat) about how rubbish DirectX and Direct3D were. A month or so later GLQuake was available.
I think it was about 12-18 months later that Unreal (the game, before the engine) was announced as a demo on this list, and we all thought: Awesome -- who the * are these guys!?
I'd like to say 'good times' were had, but seriously, I burnt out due to the insanely fast-changing pace of 3D dev during those years.
I recall purchasing two of these in the late 90s for relatively little money. Both cards could work in conjunction, and it radically changed how games looked; really amazing how much of a difference it made. I can distinctly recall the difference in crispness, and how disappointing it was that it didn't apply to everything I used or played (3DS Max, CAD, etc.).
They were the first thing I ever sold on eBay, sometime around 1999.
The difference from software-rendered Quake 2 to hardware rendering was massive, not least because of the colored lighting. In some places, it definitely bordered on overkill.
I wanted a 3dfx card so badly in 1998, but I'd already convinced my dad to pay extra for a Matrox Mystique 3D card (which didn't support OpenGL), and he just couldn't get his head around why I now also needed another 3D add-on card. Even seeing the names of all those cards brings back so many memories.
Ahh man, I remember GLQuake dropping, and before the Voodoo or Riva 128 I had the... Matrox Mystique 4MB!!! I was amazed and also sorely dissatisfied.
Thus began the video card race/upgrade journey.
Voodoo, Riva 128, TNT, Voodoo2... lol, don't forget the PowerVR. What an amazing time to be alive.
Such good memories. I spent all of my first few high school job paychecks on a Diamond Monster 3D. In retrospect, adult me says I should have bought $AAPL, but 16-year-old me is still pretty sure it was the right decision.
I had a Voodoo Banshee (combined 2d/3d card). It wasn't as good as a V2 but was OK for me back in the day.
I then moved to NVidia predominantly (TNT2 Ultra), although I did pick up a cheap V5 5500 which I ran for a bit.
Like others have said, it was a fun time to be involved with PC gaming. Unfortunately life has got in the way since, although I do spend time on Vogons looking at old systems and wondering if I should build a couple of retro machines!
You definitely should. I have a 486 DX4/100 running DOS / Win3.1 and a Slot A Athlon 650MHz with a GeForce MX440 running 98. So much fun reliving my youth in DOOM, Red Alert, and Half-Life!
If you need justification you can tell yourself that these machines will only go up in value. Works for me ;)
The main reason for 3dfx's success was drivers that let you talk to bare-metal hardware registers instead of dozens of GL abstraction layers. The simplicity of it all is very impressive. This is something programmers absolutely love. Hence many games supported it, and it was very easy to write drivers. Maybe it is time to reimplement the Glide interface in an FPGA and make a super-powered Voodoo card.
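For flavour, here is a from-memory sketch of what a Glide 2.x program looked like. The entry-point names (grGlideInit, grSstWinOpen, grDrawTriangle, grBufferSwap) are the real ones, but the exact signatures and constants below are recalled rather than checked against glide.h, so treat it as a sketch, not a reference.

```c
/* Rough, from-memory sketch of a Glide 2.x frame; the exact signatures
 * and constants may differ slightly from the real glide.h. */
#include <glide.h>

static void draw_one_frame(GrVertex *a, GrVertex *b, GrVertex *c)
{
    grBufferClear(0x00000000, 0, GR_WDEPTHVALUE_FARTHEST); /* clear color + depth */
    grDrawTriangle(a, b, c);                               /* screen-space vertices */
    grBufferSwap(1);                                       /* swap on next vsync */
}

int main(void)
{
    grGlideInit();
    grSstSelect(0);                     /* first Voodoo board */
    grSstWinOpen(0,                     /* no window handle: full screen */
                 GR_RESOLUTION_640x480,
                 GR_REFRESH_60Hz,
                 GR_COLORFORMAT_ARGB,
                 GR_ORIGIN_UPPER_LEFT,
                 2, 1);                 /* double buffer + one aux (depth) buffer */
    /* ... fill in GrVertex data and call draw_one_frame() in a loop ... */
    grGlideShutdown();
    return 0;
}
```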
The only success the drivers had was the lock-in they created to the hardware with Glide, which itself was a thin layer over the underlying hardware. Many games used Direct3D or OpenGL without Glide support and worked just fine.
Hell, GLQuake itself was written against miniGL, which was a subset of OpenGL implemented (IIRC) on top of Glide.
And really, OpenGL at the time was simple too. [Check this OpenGL 1.1 reference](http://www.talisman.org/opengl-1.1/Reference.html), which also includes the GLU and GLX functions, and even with those the number of calls is very small.
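To illustrate how small the GL 1.1 surface was, here's a minimal immediate-mode triangle in C. It assumes a context and window already exist (created via GLX, wgl, or whatever the platform layer provides), since that part lives outside OpenGL itself.

```c
/* Minimal OpenGL 1.1 immediate-mode drawing; assumes a GL context
 * already exists (via GLX, wgl, or similar). */
#include <GL/gl.h>

void draw_frame(void)
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glBegin(GL_TRIANGLES);              /* fixed function: no shaders, no buffers */
        glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  0.5f, 0.0f);
    glEnd();
    /* the buffer swap is done by the windowing layer
     * (glXSwapBuffers / SwapBuffers), not by GL itself */
}
```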
OpenGL games should run fine. Some old games that used statically sized buffers to hold the extension list (like the Quake/Quake2-engine games that dump the available extension string to the console, where the console has a hardcoded buffer for each line) will crash because of how long the extension string is nowadays, but most drivers have an option to limit the string size (AFAIK the NVIDIA and AMD drivers check the executable name and do it automatically for known faulty games).
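A toy reproduction of that failure mode (not taken from any particular engine): an old console layer copies the result of glGetString(GL_EXTENSIONS) into a fixed-size line buffer that was generous in 1997 but is far too small for today's extension strings; a bounded copy is enough to avoid the crash.

```c
/* Toy reproduction of the bug described above (not actual engine code):
 * copying GL_EXTENSIONS into a small fixed-size console line buffer. */
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

void print_extensions_like_an_old_engine(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    char line[1024];            /* plenty in 1997, far too small today */

    /* The unsafe pattern: an unbounded copy overflows `line` once the
     * extension string grows past 1 KB.
     *   strcpy(line, ext);      <-- the crash the comment describes */

    /* A bounded copy avoids the overflow: */
    strncpy(line, ext ? ext : "", sizeof(line) - 1);
    line[sizeof(line) - 1] = '\0';
    printf("%s\n", line);
}
```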
For Direct3D games you must use a 32-bit color mode, otherwise Windows 8.x/10 will force software rendering, which is very slow. Even better, use dgVoodoo2, which is a reimplementation of DirectX 1 to 7 (with some bits of 8) on top of Direct3D 11 and provides much better compatibility (it also gets rid of the Direct3D 7 2048-pixel surface width limitation, making it possible to play games at 2560x1440 and up).
For Glide, dgVoodoo2 is also very good and you can "cheat" the game to force higher resolutions than what the game thinks it is running at.
I have a lot of old games, and everything that isn't DRM-encumbered works fine in Windows 10 using dgVoodoo2 and/or some game-specific hacks. Tomb Raider 1, for example, is normally a DOS game that you can play using a Glide-enabled build of DOSBox, but there was also a Windows version that used an ancient proprietary 3D API by ATI; someone reimplemented that API and added extra hacks for high resolutions and widescreen support.
Yeah, actually the other day i was thinking about getting a 2990WX to try writing a software renderer on it and see how it performs. Too bad it is way too expensive (and Zen 2 CPUs are around the corner).
Whenever i re-read Tom Forsyth's (who worked on it) article about Larrabee [1], i facepalm hard at Intel's decision to drop it. It is one of the few cases where i really wanted to get my hands on that sort of hardware. An essentially standalone manycore CPU in PCI Express form, running FreeBSD or Linux, with a bit of graphics-specific functionality, that doubles as a video card and that you can fully program like a PC?
I have "I WANT! I WANT! I WANT!" signs blinking in my head as i type this :-P.
(facepalm mainly because i wanted it; i do not know the economic details)
Funnily enough, Forsyth's article (and another one from a team member whose name i do not remember now) sold me more on Larrabee than any PR speak Intel made.
A cow-orker of mine mentioned this card a few months before it was actually released. "Transparent water," he would say, wistfully. I wasn't convinced it would be that interesting.
We both bought cards. I was convinced.
I really wish I'd gotten back into game programming then (I was doing mostly systems stuff, boring things like storage and operating systems). It would have been a lot of fun.
I remember being obsessed with graphics cards back in the day. The first graphics card I knew of was a 3dfx Voodoo. I loaded it up to play EverQuest. I vaguely remember going all the way up to a GeForce 3 or so trying to get my games running smoothly. Because of games I learned what every component in my system did, and eventually I started coding too. I miss those days :).
I was one of the poor folks who could only afford the Matrox M3D (aka PowerVR PCX2), which shuffled video data to and from the 3D accelerator over the PCI bus and was therefore slow as crud... but I had GLQuake!
The 3dfx Voodoo1 was the first hardware I bought with my own money, to put in my 200MHz AMD K6 machine, which I kept cool with two industrial fans. I paid R450 (~$30) for my Voodoo1 and have fond memories of multiplayer Half-Life 1 and Alien vs. Predator (AVP1) with friends.
I was quite sad to exchange my newly bought 3dfx card for an NVIDIA TNT due to PCI connection issues on my motherboard, especially since I had been looking forward to playing with Glide.
It stood for "Scan Line Interleave". Basically, each Voodoo card was responsible for rendering half the display, alternating scanlines between the cards.
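A purely illustrative sketch of the idea (no relation to how the real hardware was wired): with two boards, one takes the even scanlines and the other the odd ones, so each board rasterizes half the lines of every frame.

```c
/* Purely illustrative sketch of Scan Line Interleave (SLI): with two
 * boards, each one rasterizes every other scanline of the frame,
 * roughly doubling fill rate. Nothing here matches the real hardware. */
#include <stdio.h>

static int board_for_scanline(int y, int num_boards)
{
    return y % num_boards;   /* board 0: even lines, board 1: odd lines */
}

int main(void)
{
    int y, counts[2] = { 0, 0 };
    for (y = 0; y < 480; y++)            /* a 640x480 frame */
        counts[board_for_scanline(y, 2)]++;
    printf("board 0: %d lines, board 1: %d lines\n", counts[0], counts[1]);
    return 0;
}
```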
The Voodoo3 3500 TV AGP? I had one of those (after a Voodoo 1 and a Voodoo Banshee) and it was a huge disappointment to me. The whole thing seemed terribly unstable. Heat issues? Driver bugs? Hardware conflicts? I could never figure it out.
On a side note, that blue cable was so incredibly stiff you could probably club somebody over the head with it and it still wouldn't bend.
Maybe it was the Voodoo 2, but I seem to recall that 3dfx cards could render into a chroma-keyed window, not just full screen. In any case, the main problem with them was that they made your 2D desktop look like junk for the 99% of the time you were not playing Quake, due to the extra analog-digital-analog conversion.
One of the best features of the Voodoos was that they were a cheap, early-adopter way to get dual monitor debugging. Full-screen debugger on one monitor. Full-screen game in a breakpoint on the other.
AFAIR it was the terrible Voodoo Rush that could render into a window.
There was no analog-digital-analog conversion; it was a straight analog pass-through, using either mechanical relays or special video-switching muxes. The slight signal degradation was due to the additional cables and connectors.
The pass-through from your regular VGA card was just connected to a relay that switched to the Voodoo's video DAC when in use and passed the signal through otherwise. Switching back and forth would make the typical relay clicking sound. There was no digitizing, chroma-keying, or genlock going on. Still, signal quality could be worse b/c analog.
That was done by some MPEG decoder cards, like the Sigma Designs RealMagic. The Voodoo didn't process the input VGA signal itself in any way. Image quality suffered slightly simply because the signal had to travel further and through a few more components, but there was no significant degradation of the kind you describe.
Voodoo 1 and 2 were simple analog VGA pass-through; they'd take over the VGA output when in use. I don't think the VGA input was even wired to anything other than the pass-through switch. You may be thinking of something else; there was no such conversion.
I want to say I still have one of these in a box in my basement. I also want to say I have one in my old workstation computer, and that it still works, sitting in my basement.