It's complete garbage and not worth buying. It's so cut down it's nearly useless outside of web browsing and very light games. The price is also effectively a lie; it's going to be hard to get it for less than $300. Once we get some proper 3rd party test data in, I'd be shocked if it's even 5% better than a 4050 in raster without the use of fake frames.
Nvidia themselves have said that framegen shouldn't be used if the card isn't hitting 60 FPS to start with, because of the latency it introduces. If the card is cut down enough that it's struggling to hit 60 FPS in games, enabling framegen will do more harm than good.
You can feel additional latency easily in competitive FPS or high speed arcade racing games.
You can feel anything below 50-60 fps even in a management game where you only interact with the UI and move the camera around; it's not game-breaking, but it doesn't feel great. And I used to play Far Cry 3 and CS:GO at ~25 fps, so I'm used to a lack of performance.
Fake frames have a big latency penalty, because you can't generate a frame between X and Y until you have Y. And once you do have frame Y, every frame you insert before it adds that much more latency, on top of whatever your display adds.
I guess I can see some utility in situations where latency is not a major factor, but IMHO that rules out most gaming.
I believe DLSS frame gen predicts future frames (X_1, X_2, X_3) given (X_0, X_-1, X_-2, ...), without waiting for X_4. At least that's the impression I get from their marketing.
Yeah, but there's still a latency penalty, because X_1, X_2, X_3 won't respond to player input, so your effective latency is still that of your 'real' FPS, and that's lower than it would be without frame gen, because frame gen takes a good fraction of GPU resources.
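A rough back-of-the-envelope on what that means for input lag (a sketch with made-up numbers, not measurements of any particular GPU or game):

    # Illustrative sketch of the latency argument above; the 30 fps base and
    # 4x factor are assumptions, not measurements.

    def frame_time_ms(fps):
        return 1000.0 / fps

    real_fps = 30            # what the GPU actually renders
    gen_factor = 4           # 4x frame generation: 3 generated per real frame
    displayed_fps = real_fps * gen_factor

    # Generated frames don't sample player input, so input is only picked up
    # once per real frame. If the frames are interpolated (made between two
    # real frames), the real frame is also held back before being shown.
    input_interval = frame_time_ms(real_fps)     # ~33 ms between input samples
    interp_holdback = frame_time_ms(real_fps)    # ~33 ms extra if interpolating

    print(f"counter shows: {displayed_fps} fps")
    print(f"input sampled every ~{input_interval:.0f} ms ({real_fps} fps responsiveness)")
    print(f"extra hold-back if interpolating: ~{interp_holdback:.0f} ms")
    # The counter says 120 fps, but it still feels like ~30 fps or worse.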
Nvidia Reflex 2 does that with async frame warp (that has been used in VR for a while now), but it's separate from DLSS, and is not supported in many games.
You absolutely can tell the difference. DLSS upscaling looks massively different in some games: sometimes it works great, sometimes the result is very ugly. I've tested it with several of my favorites.
And generated frames are far worse than that. If you're running at a very high base framerate (100+) then they can look OK but the moment the frames get any further apart the visual quality starts to tank.
Because you can tell the difference: they have quite a few artifacts, and they make latency worse, which is especially problematic in the scenarios where you need the "performance" offered by fake frames. At this price point it's that last part that hurts most. You may get 60 fps on an FPS counter with DLSS 4, but it'll feel like 15-20 fps and not be very playable.
I can't believe that nobody has yet mentioned the Intel Arc Battlemage B580. Same $250 MSRP (which has inflated, but every other GPU is inflated too, and the 5050 will probably inflate as well), but it has 12 GB of VRAM and benchmarks just below a 4060 Ti 16 GB[0].
I have to assume things are better to some degree but last I looked at Intel's offering the support was still unacceptably bad. That said, I really hope they can get things to a good state because there needs to be more competition at this price point.
The support is still worse, but you're getting a big discount on the hardware by comparison. So it kinda evens out at this price point, where you're deciding between every game running badly or most (but not all) games running decently.
I've been pretty happy with my Arc A770 LE (16 GB). The drivers were rough at launch, but they've gotten much better, and at the time it was the best performance $350 could buy.
I had both an A580 (not an A770, but at least something from that generation) and then later a B580, at one point even both in the same computer, side by side, when I wanted to use one for games and the other for encoding:
When paired with a worse CPU like a Ryzen 5 4500, the experience won't always be good (despite no monitoring software actually showing that the CPU is a bottleneck).
When paired with a better CPU (I got a Ryzen 7 5800X to replace it, eventually with an AIO because the temperatures were too high under full load anyway), either of them is pretty okay.
In a single-GPU setup either of them runs most games okay, without that many compatibility or stability issues, even in older indie titles, though I've had some like STALCRAFT: X complain about running on an integrated GPU (the Intel card being detected as such). Most software also works, unless you want to run LLMs locally, where Nvidia will have more of an advantage and you'd be going off the beaten path. The most annoying things I've had were some stability issues near the launch of each card; for example, running the B580 with the Boost functionality enabled in their graphics software sometimes crashed in Delta Force, but that no longer seems to be an issue.
Temperature and power draw seem fine. Their XeSS upscaling is actually really good (I use it on top of native resolution in War Thunder as fancy AA). Their frame generation feels like it has more latency than FSR but also better quality (might be subjective), though it's not even supported in that many games in the first place. Their video encoders are pretty nice, but sometimes get overloaded in intensive games instead of prioritizing the encoding over game framerate (which is stupid). Video editing software like DaVinci Resolve also seems okay.
The games that run badly are typically Unreal Engine 5 titles, such as S.T.A.L.K.E.R. 2 and The Forever Winter, where they use expensive rendering techniques and to get at least 30 FPS you have to turn the graphics way down, to the point where the games still run like crap and end up looking worse than something from 5 years ago. Those were even worse on the A series cards, but with the B series ones become at least barely playable.
In a dual-GPU setup, nothing works that well, neither in Windows 11 nor Windows 10, neither with the A580 + B580 nor my old RX 580 + B580: system instability; some games ignoring the Intel GPU preference when an AMD one is available; low framerates when a video is playing on a secondary monitor (I have 4 in total); and the inability to play games on the B580 while encoding on the A580, due to either OBS or the hardware not properly supporting that (e.g. you can't pick which GPU to encode on, like you can with Nvidia cards; my attempts at patching OBS to do that failed, since I couldn't get a video frame from one GPU to the other). I moved back to running just the B580 in my PC.
At MSRP, I'd say the Intel Arc B580 is actually a good option, perhaps better than all of the A series cards. But the more expensive it gets, the more attractive the alternatives from AMD and Nvidia become. Personally, I wouldn't get an A770 unless I needed the VRAM or the price was really good.
Also, I'm not sure why the A580 needed two 8-pin connectors if it never drew that much power, or why the B580 comes in so many larger 3-fan versions when I could never really get high temps running FurMark on the 2-fan version.
The 5800X is a 105W part, so it should still be quite fine with air cooling. I just built a 9950X3D (170W) system with air cooling and it's plenty for that too; temperatures under load are mostly in the 70s, and a stress test gets it up to 85C.
Without the side panel the temps are like 10-15C lower: they go up to about 78C under full load, but with the panel on they hit 90C and the clock frequencies get dialed back.
That is already with a CO value of -10 across all cores.
I will probably need a different case altogether, or just get rid of the solid front panel (those vents on it are too small) and replace it with a custom mesh.
Thankfully, for now, in CPU-Z the scores are ~6500 without the side panel and ~6300 with the panel, so with the AIO and more powerful fans on it, it's pretty close to working optimally, even if not quite there yet.
I also tried it with 5x120mm case fans and an air cooler, it was slightly worse than the AIO. Also tried multiple different thermal pastes, didn't make much of a difference. Might also just be cursed and have ghosts in it, go figure.
Yep, I guess the case is the limiting factor then; no CPU cooler can do much if the case traps the hot air inside. Though 5 fans should already move quite a bit of air.
I had a fully new build, so I used one of the well-reviewed Fractal cases to get good airflow, with 5x140mm case fans.
> x50-class GeForce GPUs are among the most popular in the world, second only to the x60-class on Steam. Their price point and power profile are especially popular:
> For anyone upgrading an older x50-class system
> Each GeForce RTX 5050 graphics card is powered by a single PCIe 8-pin cable, drawing a maximum of 130 Watts at stock speeds, making it great for systems with power supplies delivering as little as 550 Watts.
The 1050, 2050 and 3050 were all bus-powered cards. I doubt 95% of these systems even have the cable coming from their power supply. Imagine all the poor saps that excitedly swap out their old card for this, and... nothing works.
I've got a 1650 Super; it's not bus-powered either. I think it's got a 6-pin, but often you can plug a 6-pin into an 8-pin board and it'll just run with a lower power limit (this might not be accurate; a lot of internet comments say 8-pin boards will detect a 6-pin connector and refuse to work). A whole lot of modern computing gets ~90% of the performance with 50% of the power, so if using a 6-pin lead drops power to 50%, you'd still get most of it.
I've got a ~2006 380W power supply hanging out near my desk and it's got a 6-pin PCIe cable; I really doubt people lack at least that, certainly not 95% of systems with a PCIe x16 slot.
To bolster this: after the 750 Ti, the x50 products have had pretty lame price-to-performance compared to the next step up, but have remained quite popular. Most people seem to argue that not needing additional power is their main advantage and why they are popular.
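For what it's worth, the nominal spec numbers line up like this (the 130 W figure is from the Nvidia quote above; the 75/150 W limits are the standard PCIe budgets):

    # Nominal PCIe power budgets; the 130 W board power is from the quoted
    # Nvidia text above. Whether a given 5050 board will actually run off a
    # 6-pin is a separate question -- this is just the arithmetic.

    slot_power = 75      # W, PCIe x16 slot
    six_pin = 75         # W, 6-pin PCIe connector
    eight_pin = 150      # W, 8-pin PCIe connector
    board_power = 130    # W, stated max for the RTX 5050

    print("slot + 6-pin:", slot_power + six_pin, "W")    # 150 W
    print("slot + 8-pin:", slot_power + eight_pin, "W")  # 225 W
    print("board power: ", board_power, "W")
    # On paper even slot + 6-pin covers 130 W, which is why the "does an 8-pin
    # board accept a 6-pin plug" question matters more than raw wattage here.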
I personally think people remember being happy with the 750 Ti and just keep buying those cards.
And it's 8GB of last-gen GDDR6 video memory, the exact same as the $249 RTX 3050 from three years ago (same number of CUDA cores too). Technically with inflation that's more per dollar, I guess, but that's not super appealing.
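Quick sanity check on the inflation point (the ~13% cumulative figure for 2022-2025 is a ballpark assumption, not an exact CPI lookup):

    # Rough illustration of the "with inflation" point; the cumulative
    # inflation figure is an assumption, not official CPI data.

    msrp_2022 = 249              # RTX 3050 launch MSRP, USD
    cumulative_inflation = 0.13  # assumed, Jan 2022 -> mid 2025

    equivalent_2025 = msrp_2022 * (1 + cumulative_inflation)
    print(f"$249 in 2022 is roughly ${equivalent_2025:.0f} in 2025 dollars")
    # So $249 today is cheaper in real terms than $249 was in 2022 -- but
    # you're still getting the same 8 GB of GDDR6 and the same CUDA core count.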
The charts are from the Verge, not exactly known for their integrity in regards to anything.
It's also with DLSS on, so you could just as easily have the framerate be 100 FPS, 1000 FPS, or 10000 FPS. The GPU doesn't actually have to render the frame in that case, it just has to have a pixel buffer ready to offload to whatever hardware sends it over the link to the display. Apparently some people actually really like this, but it isn't rendering by any reasonable definition.
This is creative marketing from nVidia. Notice the "With DLSS 4".
That's AI frame hallucination, which the 5050 has.
Without DLSS, the numbers from independent reviewers have basically been on par with the previous generation (roughly a 10% increase in performance).
That's Nvidia's marketing slide, and if you read the fine print they are tested at different settings: the RTX 5050 is using 4x frame gen, which the 3050 isn't. TechPowerUp has the RTX 5050 at about 20% faster than the 3050, give or take, which is certainly not enough to justify upgrading.
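To put rough numbers on why the slide is misleading (only the ~20% raster delta comes from the TechPowerUp estimate above; the 30 fps baseline is made up for illustration):

    # Why "with DLSS 4" bars are apples-to-oranges. The ~20% raster uplift is
    # the TechPowerUp estimate mentioned above; the 30 fps baseline is an
    # illustrative assumption.

    rtx_3050_fps = 30.0        # hypothetical native framerate
    raster_uplift = 1.20       # ~20% faster, per TechPowerUp
    mfg_factor = 4             # 4x multi frame generation, 5050 only

    rtx_5050_native = rtx_3050_fps * raster_uplift        # ~36 fps
    rtx_5050_marketed = rtx_5050_native * mfg_factor      # ~144 fps on the slide

    print(f"3050 native:         {rtx_3050_fps:.0f} fps")
    print(f"5050 native:         {rtx_5050_native:.0f} fps")
    print(f"5050 'with DLSS 4':  {rtx_5050_marketed:.0f} fps")
    # The slide compares 144 vs 30; the like-for-like comparison is 36 vs 30.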
If you're using less memory, it kinda stands to reason that you can get more mileage out of less bandwidth. I'd be really upset if this were a 16 GB or 24 GB card, but we've been using GDDR6 on 8 GB cards without issues for years now.
I agree that it's not super appealing, but Team Green has to hit the low price points somehow. This feels more like a hedged bet against Intel trying to muscle their way back into the budget market.
Undoubtedly a better system, but for the 395 variant with a full 128GB of (soldered-on) RAM you're looking at ~$2k for the system. Comparing that to a $250 dGPU (that arguably isn't even worth that) is a very apples-to-oranges comparison.
Coming from a 2060 Super, would this be a good upgrade? I don't really play newer, high-demand games, but I do enjoy my emulation. Currently on a 2060 Super and don't really have any issues with emulation. Ryzen 5 3600X / Linux (of course :))
It'd likely be a side-grade, unless you specifically care about the exact features introduced with the 30/40/50 series (such as increasingly elaborate upscaling and other AI-driven features).
Although we don't know how the 5050 will perform, the 50 series has roughly the same render performance as the 40 series at the same tier. The 40 series in turn is only a mild bump over the 30 series at the same tier. And the 30 series was a reasonable improvement over the 20, but mostly in perf/$ rather than raw performance. Extrapolating, the 5050 is likely not going to give much of a boost, if any, and spending money on an 8GB card in 2025 is just throwing money away at this point, as software increasingly expects to be able to work with >8GB of VRAM.
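As a rough way to see where that extrapolation lands (the per-generation factors here are my own ballpark assumptions, not benchmark data):

    # Ballpark compounding of the generational argument above; the per-tier
    # uplift factors are assumptions for illustration, not benchmark results.

    uplift = {
        "20 -> 30 series": 1.25,  # "reasonable improvement", mostly perf/$
        "30 -> 40 series": 1.10,  # "mild bump" at the same tier
        "40 -> 50 series": 1.05,  # "roughly the same perf in render"
    }

    total = 1.0
    for step, factor in uplift.items():
        total *= factor
        print(f"{step}: x{factor:.2f}")

    print(f"same tier, 20-series era to 50-series era: roughly x{total:.2f}")
    # And a 5050 sits a tier below a 2060 Super to begin with, so the real
    # jump for that upgrade would be smaller still -- all while staying at 8 GB.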
It will take Nvidia 10 years to release the firmware for the PMU, and then they will cancel it because it's "too old". Just like they did with Pascal, the P520 and other perfectly working hardware that is barely usable to this day.
12VHPWR has to be one of the weirdest industry decisions I've seen in a while. So far I thought I had managed to avoid it, but I recently bought a power supply that uses it on the modular cable connector.
But it isn't really that uncommon either; I had a Suzuki motorcycle that used a connector with 15-amp pins to handle 30 amps of current on one pin. I eventually concluded the only reason that connector was in the harness was to ease assembly, so I just cut it out entirely and soldered the junction together.
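For comparison, the 12VHPWR arithmetic that makes people nervous (the ~9.5 A per-pin figure is the commonly cited terminal rating, used here as an assumption):

    # 12VHPWR back-of-the-envelope; the per-pin rating is the commonly cited
    # figure for the connector terminals, treated as an assumption here.

    max_power = 600.0    # W, top 12VHPWR power class
    voltage = 12.0       # V
    power_pins = 6       # six 12 V pins (plus six grounds)
    pin_rating = 9.5     # A per pin, commonly cited

    total_current = max_power / voltage        # 50 A
    per_pin = total_current / power_pins       # ~8.3 A per pin

    print(f"total current: {total_current:.0f} A")
    print(f"per 12 V pin:  {per_pin:.1f} A (rated ~{pin_rating} A)")
    # That Suzuki connector ran its pins at ~2x their rating; 12VHPWR stays
    # within rating on paper, but only if the current actually shares evenly
    # across all six pins -- which is the part people worry about.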