Hacker News
GeForce RTX 5090 (twitter.com/kopite7kimi)
27 points by doener 4 months ago | 29 comments



600W for a consumer product. When will this stop? 10 years from now we'll need a 1200W PC to play games...


True... it's crazy, but what's interesting is that the 5090's core count is 2x that of the 5080 (these are rumored specs, of course). So with the 5090 you're getting two 5080s in almost every sense: 2x the RAM, 2x the cores, 2x the bandwidth, for 150W more in draw (450 vs 600), or only 1/3 more wattage.
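A quick sanity check on that ratio, using the rumored wattages from the comment (both numbers are assumptions until official specs land):

    # Rumored TDPs from the comment above (not confirmed).
    watts_5080, watts_5090 = 450, 600
    extra_power = (watts_5090 - watts_5080) / watts_5080
    print(f"~2x cores/RAM/bandwidth for {extra_power:.0%} more power")
    # -> ~2x cores/RAM/bandwidth for 33% more power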

So if they don’t gimp it too much for AI/ML things it could be a beast.

I think it'll cost north of $3k, and given supply constraints, retail will likely be closer to $4k+.


10 years? Multiple vendors have 1600W PSUs on the market right now.

What's more, you'll need an AC to play PC games. My home office is ~6 m², and even the current 500W machine I have there noticeably bumps the temperature under load.


That could be a very good thing, depending on which part of the planet you live in.

For example, I'm in Scotland, and apart from one or two days, we've basically had a "year without a summer", with cooler than normal temps. ;)


> 10 years? Multiple vendors have 1600W PSUs on the market right now

We're going to need dedicated electrical circuits soon.


Why do you need a 1600W PSU for a 600W card?


The PSU also powers the motherboard (keep in mind all those power-hungry USB-C PD peripherals), CPU, cooling (fans and AIO liquid-cooling pumps), storage, and RAM. Furthermore, when building a PC you want a buffer of a few hundred watts in case you later add another GPU (or two, particularly for ML work) to spare PCIe slots, or simply to future-proof the machine against more power-hungry GPUs.
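A rough sketch of that kind of budgeting (every figure below is an illustrative guess, not a measured draw):

    # Hypothetical peak component draws, in watts (illustrative only).
    components = {
        "GPU (rumored 5090)": 600,
        "CPU": 250,
        "motherboard + RAM + storage": 100,
        "fans / AIO pump": 30,
        "USB-C PD peripherals": 60,
    }
    peak = sum(components.values())  # 1040 W
    # Keep the PSU at or below ~75% load for efficiency and headroom.
    print(f"peak draw {peak} W -> size PSU around {peak / 0.75:.0f} W")
    # -> peak draw 1040 W -> size PSU around 1387 W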


(USA) I guess I better call an electrician to install one of those huge 240V stove/clothes-dryer outlets in my PC room so I can run a bigger PSU for my dual 5090s.

\s


As someone who had a 3090 because that's all I could get my hands on at the time, I would not recommend anyone consider the XX90 products. Leave those for the over-the-top, money is no object, "I'm just flexing on you" builds. XX80 should be considered the limit for people who are building a reasonable gaming PC.

I ended up switching to a 4080. It certainly doesn't sip power and is also quite expensive, but the difference is large enough that I no longer feel like I need to find a way to send the heat output directly outside during the summer.


Are you similarly frustrated that microwaves and coffee makers can easily take 1,200 Watts or more?


My guy, I’m not running a microwave or coffee maker for 2 to 3 hours or more at a time.


Nobody needs it. It's a luxury product. If you don't want a 600W GPU, just get a different one. I only recently upgraded from the 1070 that served me well for almost a decade. I didn't really notice much of a difference even though I splurged a bit.

Buying top end gpus these days seems to me like a total waste of money unless you have very specific needs. For playing any mainstream game at 1440p you'll be completely fine with a mid-tier previous-gen card.


Well, think of it this way: why should GPU manufacturers only build for the lowest common denominator? If someone wants a 600W chip, shouldn't they be able to get one?

On the other hand, game studios obviously can't design their games around such chips.


I expect it to be $2000+ MSRP. They don't have to worry about AMD competition at the high end. (They didn't last time either, but there were strong rumors they would, and they had the 4090 Ti ready for that eventuality.) They'll also be aware of its value for AI and what people are willing to pay for it.


Most North American homes have 15-amp circuits for outlets (really only capable of handling 12A continuously), and the best PSUs have a power factor of 0.9, so we're looking at a total max safe load of 1296W (12A × 120V × 0.9).
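Spelling that arithmetic out (the 80% continuous-load rule and the 0.9 factor are the assumptions above):

    # 15 A breaker, 80% continuous-load rule -> 12 A usable.
    amps = 15 * 0.8   # 12.0 A
    volts = 120
    factor = 0.9      # assumed in the comment above
    print(amps * volts * factor)  # 1296.0 W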

At the rate these components' power demands are growing, we're only a few years away from needing a dedicated circuit for our gaming computers. Pretty sad, really.


Each morning it's a struggle to get my 12-year-old awake for school. This morning I told him I saw an article about an RTX 5090 and he popped straight up in bed, haha.


Who is this and why should I trust them?


A quite reliable leaker, maybe the most prominent of all. He probably works for a GPU manufacturer like Asus, MSI, Gigabyte, etc.


Every new generation of GPUs seems larger and more power-hungry than its predecessor. Is there any effort to produce faster GPUs that are smaller and use less power?


Yes.

Every new GPU generation delivers more performance/W than the previous one (mostly due to semiconductor process improvements), but the competition to have the fastest chip means that total power usage keeps growing.

Power-efficient chips just don't generate headlines the same way.


And it doesn't help that game companies have little incentive to obsess over squeezing the absolute most performance out of the hardware when customers care more about jaw-dropping visuals and ray tracing than about knowing the game can pull 60fps on 10-year-old hardware.


There hasn't been competition to have the fastest chip for 4 generations.


We observe larger and more power-hungry chips at the high end because those models are selected for maximum performance, at the expense of everything else.

But GPU power efficiency (GFLOPS/watt) has been increasing at an exponential pace as well: over the last decades it has consistently doubled every 2-3 years. So over 10 years we've had 10-30x improvements in power efficiency. The same is true for cost efficiency (GFLOPS/dollar).
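The compounding checks out, assuming a fixed doubling period (the comment's 2-3 year range):

    # Efficiency gain over 10 years for doubling time t: 2 ** (10 / t)
    for t in (2, 3):
        print(f"doubling every {t} years -> {2 ** (10 / t):.0f}x over a decade")
    # doubling every 2 years -> 32x; every 3 years -> 10x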


I think the limitation here is more with the process nodes than with the GPU architecture.


A GPU built from multiple graphics chiplets could be more efficient.


That’s a pretty insane power draw if true. What would you even do with that power? The best graphics engine is still probably on a game that’s 3 years old by now.


Craving an iMac 32" leak. Googling every week!


Don't expect it to come with a 5090.


Oh, but it will come with a built-in GPU that “beats competing products on performance per watt”, as shown in an unlabeled chart.



