High-end GPUs have, over the last 5 years, slowly turned from an enthusiast product into a luxury product.
5 or maybe 10 years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Most folks can barely tell the difference between High and Ultra settings, DLSS vs FSR, or DLSS FG vs Lossless Scaling. There's just no point competing at the $500 price point any more; Nvidia has largely given up and ceded it to the AMD-built consoles and integrated graphics like AMD's APUs, which offer good value at the low end, mid range, and high end.
Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.
What strategy? They charge more because manufacturing costs are higher: cost per transistor hasn't changed much since 28nm [0], but chips have more and more transistors. What do you think that does to the price?
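A quick back-of-the-envelope sketch of that point. The transistor counts are the published figures; the flat cost-per-transistor number is purely illustrative, not a real foundry quote:

```python
# Back-of-the-envelope: with flat cost per transistor, die cost scales
# linearly with transistor count. The $/transistor figure is illustrative.
COST_PER_TRANSISTOR = 1.0e-9  # assumed flat since 28nm (illustrative, not a foundry quote)

chips = {
    "GTX 980 Ti (28nm, 2015)":   8.0e9,   # ~8B transistors (published figure)
    "RTX 5090 (4nm-class, 2025)": 92.0e9, # ~92B transistors (published figure)
}

for name, transistors in chips.items():
    print(f"{name}: ~${transistors * COST_PER_TRANSISTOR:.0f} of raw silicon at a flat rate")
# ~11x the transistors -> ~11x the raw silicon cost, before anything else.
```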
In their never-ending quest to find ways to suck more money out of people, one natural extension is to just turn the thing into a luxury good; the label alone seems to justify the markup.
This is why new home construction is expensive - the layout of a home doesn’t change much but it’s trivial to throw on some fancy fixtures and slap the deluxe label on the listing.
Or take a Toyota, slap some leather seats on it, call it a Lexus and mark up the price 40% (I get that these days there are more meaningful differences but the point stands)
This and turning everything into subscriptions alone are responsible for 90% of the issues I have as a consumer
Graphics cards seem to be headed in this direction as well - breaking through that last ceiling for maximum fps is going to be like buying a Bentley (if it isn’t already), whereas before it was just opting for the V8
Nvidia's been doing this for a while now, since at least the Titan cards and technically the SLI/Crossfire craze too. If you sell it, egregiously-compensated tech nerds will show up with a smile and a wallet large enough to put a down-payment on two of them.
I suppose you could also blame the software side, for adopting compute-intensive ray tracing features or getting lazy with upscaling. But PC gaming has always been a luxury market, at least since "can it run Crysis/DOOM" was a refrain. The homogeneity of a console lineup hasn't ever really existed on PC.
I've used all of these (at 4K, 120hz, set to "balanced") since they came out, and I just don't understand how people say this.
FSR is a vaseline-like mess to me, it has its own distinct blurriness. Not as bad as naive upscaling, and I'll use it if no DLSS is available and the game doesn't run well, but it's distracting.
Lossless is borderline unusable. I don't remember the algorithm's name, but it has a blur similar to FSR. It cannot handle text or UI elements without artifacting (because it's not integrated in the engine, those don't get rendered at native resolution). The frame generation causes almost everything to have a ghost or afterimage - UI elements and the reticle included. It can also reduce your framerate because it's not as optimized. On top of that, the way the program works interferes with HDR pipelines. It is a last resort.
DLSS (3) is, by a large margin, the best offering. It just works and I can't notice any cons. Older versions did have ghosting, but it's been fixed. And I can retroactively fix older games by just swapping the DLL (there's a tool for this on GitHub, actually). I have not tried DLSS 4.
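The swap itself is trivial, which is why tools like DLSS Swapper exist. A minimal sketch of the idea (the paths here are hypothetical examples; back up the original DLL first):

```python
# Minimal sketch of a DLSS DLL swap: back up the game's bundled
# nvngx_dlss.dll and overwrite it with a newer build. Paths are hypothetical.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeOldGame")         # hypothetical install folder
NEW_DLL  = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL you sourced yourself

for old_dll in GAME_DIR.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_suffix(".bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)  # keep the original, just in case
    shutil.copy2(NEW_DLL, old_dll)     # drop in the newer version
    print(f"swapped {old_dll}")
```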
Most people either can’t tell the difference, don’t care about the difference, or both. Similar discourse can be found about FSR, frame drop, and frame stutter. I have conceded that most people do not care.
10 years ago, $650 would buy you a top-of-the-line gaming GPU (GeForce GTX 980 Ti). Nowadays, $650 might get you a mid-range RX 9070 XT if you miraculously find one near MSRP.
That's about $880 in today's dollars. And in 2015, Apple was already shipping a 16nm SoC; the GeForce GTX 980 Ti was still on 28nm, two node generations behind.
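(Sanity check on that inflation math, using approximate CPI-U index values; the 2025 level is an assumption:)

```python
# Rough CPI adjustment: $650 in 2015 -> today's dollars.
CPI_2015 = 237.0  # approx. 2015 annual average
CPI_2025 = 320.0  # approx. 2025 level (assumption)
print(f"${650 * CPI_2025 / CPI_2015:,.0f}")  # -> ~$878, i.e. roughly $880
```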
Not quite $500, but at $650 the 9070 XT is an absolute monster that outperforms Nvidia's equivalent cards in everything but ray tracing (which you can only really turn on with full DLSS frame generation anyway, and then you get a blobby mess)
AMD is truly making excellent cards, and with a bit of luck UDNA is even better. But they're in the same situation as Nvidia: they could sell 200 GPUs, ship drivers, maintain them, deal with returns and make $100k... Or just sell a single MI300X to a trusted partner that won't make any waves and still make $100k.
Wafer availability unfortunately rules all, and as it stands, we're lucky neither of them have abandoned their gaming segments for massively profitable AI things.
Some models of 9070 use the well-proven old style PCI-E power connectors too, which is nice. As far as I'm aware none of the current AIB midrange or high end Nvidia cards do this.
I went from a 2080 Ti to a 5070 Ti. Yes it's faster, but for the games I play, not dramatically so. Certainly not what I'm used to from such a generational leap. The 5070 Ti is noticeably faster at local LLMs, and has a bit more memory, which is nice.
I went with the 5070 Ti since the 5080 didn't seem like a real step up, and the 5090 was just too expensive and wasn't in stock for ages.
If I had a bit more patience, I would have waited for the next node refresh, or for the 5090. I don't think any of the current 50-series cards besides the 5090 are worth it if you're coming from a 2080. And by worth it I mean will give you a big boost in performance.
I went from a 3070 to a 5070 Ti and it's fantastic. Just finished Cyberpunk maxed out at 4K with DLSS balanced, 2x frame gen, and Reflex 2. Amazing experience.