True... it's crazy, but what's interesting is that the 5090's core count is 2x that of the 5080 (these are rumored specs, of course). So with the 5090 you're getting two 5080s in almost every sense: 2x the RAM, 2x the cores, 2x the bandwidth, for 150W more draw (450 vs 600), i.e. only a third more wattage.
So if they don’t gimp it too much for AI/ML things it could be a beast.
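Quick back-of-the-envelope with only the rumored numbers above; the linear-scaling-with-cores assumption is mine and obviously optimistic:

    # Rumored specs only; linear scaling with core count is an assumption.
    cores_ratio = 2.0        # 5090 rumored at 2x the 5080's cores
    power_5080_w = 450
    power_5090_w = 600

    power_ratio = power_5090_w / power_5080_w
    print(f"{power_ratio:.2f}x the power for {cores_ratio:.0f}x the cores")
    print(f"implied perf/W gain if it scaled linearly: {cores_ratio / power_ratio:.2f}x")

So on paper that's ~1.33x the power budget for 2x the hardware, i.e. roughly 1.5x better perf/W, before any real-world scaling losses.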
I think it'll cost north of $3k, and given how supply usually goes, retail will likely be closer to $4k+.
10 years? Multiple vendors have 1600W PSUs on the market right now.
What's more, you'll need an AC to play PC games. My home office is ~6 m², and even the 500W machine I currently have there noticeably bumps the temperature under load.
The PSU also powers the motherboard (keep in mind all those power-hungry USB-C PD peripherals), the CPU, cooling (fans and AIO liquid-cooling pumps), storage, and RAM. On top of that, when building a PC you'd want to keep a buffer of a few hundred watts in case you later add another GPU or two to spare PCIe slots (particularly for ML work), or simply to future-proof the build for more power-hungry GPUs down the road.
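A rough sketch of that sizing logic; every wattage figure below is an illustrative assumption, not a measurement from any particular build:

    # Illustrative component draws; every figure here is an assumption.
    components_w = {
        "gpu": 600,                 # rumored 5090 board power
        "cpu": 250,                 # high-end desktop CPU under load
        "motherboard_and_ram": 80,
        "storage_fans_aio_pump": 50,
        "usb_c_pd_peripherals": 100,
    }

    steady_load = sum(components_w.values())
    upgrade_buffer = 300            # headroom for a second GPU / future cards
    transient_factor = 1.2          # GPUs can spike well above rated board power

    suggested_psu = (steady_load + upgrade_buffer) * transient_factor
    print(f"steady load ~{steady_load} W, suggested PSU ~{suggested_psu:.0f} W")

Which is how you end up eyeing 1500W+ units even though the GPU itself is "only" 600W.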
(USA) I guess I'd better call an electrician to install one of those huge 240V stove/clothes-dryer outlets in my PC room so I can run a bigger PSU for my dual 5090s.
As someone who had a 3090 because that was all I could get my hands on at the time, I would not recommend anyone consider the XX90 products. Leave those for the over-the-top, money-is-no-object, "I'm just flexing on you" builds. XX80 should be considered the limit for people building a reasonable gaming PC.
I ended up switching to a 4080. It certainly doesn't sip power and is also quite expensive, but the difference is large enough that I no longer feel like I need to find a way to send the heat output directly outside during the summer.
Nobody needs it. It's a luxury product. If you don't want a 600W GPU, just get a different one. I only recently upgraded from the 1070 that's served me well for almost a decade. Didn't really notice much of a difference even though I splurged a bit.
Buying top-end GPUs these days seems to me like a total waste of money unless you have very specific needs. For playing any mainstream game at 1440p, you'll be completely fine with a mid-tier previous-gen card.
Well, think of it this way: why should GPU manufacturers only build for the lowest common denominator? If someone wants a 600W chip, shouldn't they be able to get one?
On the other hand, game studios obviously can't design their games around such chips.
I expect it to be $2000+ MSRP. They don't have to worry about AMD competition at the high end. (They didn't last time either, but there were strong rumors they would, and they had the 4090 Ti ready for that eventuality.) They'll also be aware of its value for AI and what people are willing to pay for it.
Most North American homes have 15-amp circuits for wall outlets (really only capable of handling 12A safely under the 80% continuous-load rule), and the best PSUs have a power factor of 0.9, so we're looking at a total max safe load of 1296W (12A x 120V x 0.9).
At the rate these components demand more and more power, we're only years away from needing a dedicated circuit for our gaming computers. Pretty sad, really.
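Spelled out, the same math in a few lines (the 80% continuous-load rule and the 0.9 power factor are the assumptions stated above):

    breaker_amps = 15
    continuous_amps = breaker_amps * 0.8   # 80% rule for continuous loads -> 12 A
    voltage = 120
    power_factor = 0.9                     # assumed above

    max_safe_watts = continuous_amps * voltage * power_factor
    print(max_safe_watts)                  # 1296.0

A 600W GPU plus the rest of the system, PSU losses, and a monitor or two on the same circuit doesn't leave much margin under that ceiling.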
Each morning it's a struggle to get my 12-year-old up for school. This morning I told him I saw an article about an RTX 5090 and he popped straight up in bed, haha.
Every new generation of GPU seems larger and more power-hungry than its predecessor. Is there any effort to produce faster GPUs that are smaller and use less power?
Every new GPU generation delivers more performance per watt than the previous one (mostly due to semiconductor process improvements), but the competition to have the fastest chip means total power usage keeps growing.
Power-efficient chips just don't generate headlines the same way.
And it doesn't help that game companies have little incentive to obsess over squeezing the absolute most performance out of the hardware when customers care more about jaw-dropping visuals and ray tracing than about whether the game can pull 60fps on 10-year-old hardware.
We see larger and more power-hungry cards at the high end because those models are selected for maximum performance, at the expense of everything else.
But GPU power efficiency (GFLOPS/watt) has been increasing at an exponential pace as well: over the last few decades it has consistently doubled every 2-3 years. So over 10 years we've seen a 10-30x improvement in power efficiency. The same is true for cost efficiency (GFLOPS/dollar).
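For anyone who wants to check the compounding, the 10-30x range falls straight out of the doubling period (pure arithmetic, no GPU data involved):

    # Doubling every 2-3 years, compounded over 10 years.
    years = 10
    for doubling_period in (2, 3):
        improvement = 2 ** (years / doubling_period)
        print(f"doubling every {doubling_period} yrs -> ~{improvement:.0f}x over {years} yrs")
    # every 2 yrs -> ~32x, every 3 yrs -> ~10x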
That's a pretty insane power draw if true. What would you even do with all that power? The best graphics engine is still probably in a game that's three years old by now.