My recollection, as a nerdy teenager who had a Voodoo 3 at the time: the Voodoo 4 and Voodoo 5 series were highly hyped and anticipated, which kinda made sense after the success of the Voodoo 1, 2 and 3. They were quite delayed, however, and before they made it to market they were blindsided by Nvidia's release of the GeForce 256, which was (iirc) the first graphics card with its own onboard transform & lighting (T&L) support, which gave an insane performance bump in games that supported it.

In the end the Voodoo 4/5s were released late, board and driver quality was supposedly not great, and the range of cards was far narrower than originally promised: only the low-end Voodoo4 4500 and upper-middle Voodoo5 5500 were released, while the lower-mid-range Voodoo5 5000 and the ambitious top-end Voodoo5 6000 never saw the light of day. Performance wasn't great either; they were generally slower than the GeForce 256 even in games that didn't use its T&L unit. Then shortly after, Nvidia released the GeForce 2 series (I ended up getting a GeForce 2 MX, which was insane for the price), which was so far ahead of 3dfx's offerings it wasn't even funny.
Hope I remembered all this right, but I really do remember the GeForce 256 being so much better it was like night and day. I stopped playing PC games when I went to university around the GeForce 3 era, so that's where my knowledge of the topic drops off a cliff :D
edit: and now that I've reached the end of the article, it seems Fabien said exactly this! Note to self: read first, then comment.
They didn't coordinate with game developers, and tried to pitch a fixed-function effects pipeline (the T-buffer) just as programmable shaders were coming into existence.
Ironically, we've now looped back and do pretty T-buffer-esque things with modern DX11/DX12 pipelines.
One of the simplest functions of the T-buffer was temporal AA: a fixed-function supersampler that also integrated samples over several frames, something that didn't show up in a modern AAA title until Doom 2016.
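Roughly, that integration is just blending each new frame into a persistent history buffer. A minimal CPU-side sketch of the idea, in C (buffer names and the blend factor are illustrative, not from any real driver or API; real TAA also reprojects the history with motion vectors and clamps it to limit ghosting):

    #include <stddef.h>

    /* Sketch of temporal accumulation: blend the current frame into a
     * persistent history buffer so results integrate over many frames.
     * alpha controls how quickly old frames fade out. */
    void accumulate_frame(float *history, const float *current,
                          size_t pixel_count, float alpha)
    {
        for (size_t i = 0; i < pixel_count; ++i) {
            /* e.g. alpha = 0.1f keeps ~90% of the history each frame */
            history[i] = alpha * current[i] + (1.0f - alpha) * history[i];
        }
    }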
For technology invented in 1999, 3dfx was too far ahead of its time.
For a modern analogy, imagine if the RTX debacle of the current-gen GeForces had destroyed the entire company. Nvidia backed down and released the 16xx-series cards; 3dfx instead went bankrupt while everyone else was shipping that era's equivalent of the 16xx (the early Nvidia and ATI cards).
They tried to sell hardware that isn't actually good at raytracing as raytracing hardware, and a 4x4 matrix ALU as a tensor unit for AI, even though it isn't big or powerful enough for existing AI frameworks to take advantage of in normal use (nor should a desktop-oriented card have such a thing).
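For a sense of scale, the basic operation such a tensor unit performs is a small fused multiply-accumulate, D = A*B + C, on 4x4 matrix tiles. A plain scalar C sketch of that one step (illustrative only; the actual hardware runs this on FP16 inputs with FP32 accumulation, many tiles in parallel):

    /* Scalar sketch of a 4x4 matrix multiply-accumulate, D = A*B + C,
     * the per-step operation of a tensor core. */
    void mma_4x4(float D[4][4], const float A[4][4],
                 const float B[4][4], const float C[4][4])
    {
        for (int row = 0; row < 4; ++row) {
            for (int col = 0; col < 4; ++col) {
                float acc = C[row][col];          /* accumulator input */
                for (int k = 0; k < 4; ++k)
                    acc += A[row][k] * B[k][col]; /* inner product */
                D[row][col] = acc;
            }
        }
    }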
... and then they enabled their raytracing API on the GTX 1000-series cards, after repeatedly telling everyone it required the hardware acceleration that the RTX 2000-series cards have. Not only that, it didn't perform all that badly.
So yeah, that new Radeon series is looking mighty nice right now.