
26% at 1080p, a resolution no one buying a $500+ CPU every generation is actually running. With the same GPU and a decent processor from the last 5-10 years, the benefits will be minimal at 2K and 4K.



While I agree with the sentiment here... this has been the argument Intel has used as the "last excuse to buy Intel over AMD": if you bought a fast enough video card but played your games at 1080p on a really high refresh rate monitor, gaming performance was better on Intel.

So AMD focuses on it here to say "look, your very last excuse for choosing Intel over AMD is no longer valid."

Of course I do more "non-gaming" than gaming, so it wasn't very important to me in the first place, and I don't spend enough on a graphics card for this to matter. But I want a lot of cores for fast compilation and great multi-tasking with containers and virtual machines.


The AMD software for controlling the CPU doesn't work with virtualisation enabled. They have been ignoring requests to fix it for years.


What does it control on the CPU? Overclocking? I've never needed to control a CPU via software...


Yes, of course, for non-gaming, dev, and media work these will be beasts. I was just confused about why they were so focused on the 1080p gaming performance benefits. For gaming alone, buying a whole new setup when you're already running a relatively modern CPU would be a waste of money. At 1080p you're probably already killing it in framerate, and at 2K+ the benefits just aren't there.


At 1080p with a 2080 Ti (AMD's testing rig), the GPU is not a bottleneck, so any performance difference is a CPU performance difference.

Going higher than 1080p, the GPU starts to become a bottleneck, so you're no longer comparing CPU to CPU, but (CPU+GPU) to (CPU+GPU).

Testing game performance at 1080p is a well-known and well-accepted way to compare CPU performance.


Because at 1080p you can actually compare the performance maybe?


What's the difference between 1080p and 2K?


"2K" has no official consumer definition and is often used colloquially to describe 1440p, even though DCI 2K (2048x1080) is much closer to 1080p.


The higher the resolution you run a game at, the more likely it is that the GPU becomes the bottleneck for the frame rate.


Generally, GPU load scales with resolution and graphical fidelity, while CPU load mostly just scales with framerate, irrespective of resolution or graphics settings. So you might be CPU-bottlenecked at max settings 1080p with an average CPU and a mid-to-high-end GPU, but even with the current highest-end GPUs and a mid-range CPU you're likely not bottlenecked by the CPU at 1440p or 4K, because the GPU isn't pushing out as many frames.
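
A toy model makes this concrete. It's only a sketch, not a real benchmark: the per-frame CPU cost, the GPU cost per megapixel, and the "slowest part wins" rule below are all made-up illustrative assumptions.

    # Toy bottleneck model: the frame rate is set by whichever of the CPU or
    # GPU takes longer per frame. All numbers are invented for illustration.
    CPU_MS_PER_FRAME = 5.0        # CPU cost per frame, roughly resolution-independent
    GPU_MS_PER_MEGAPIXEL = 2.0    # GPU cost grows with the number of pixels drawn

    RESOLUTIONS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

    for name, pixels in RESOLUTIONS.items():
        gpu_ms = GPU_MS_PER_MEGAPIXEL * pixels / 1e6
        frame_ms = max(CPU_MS_PER_FRAME, gpu_ms)  # the slower component sets the pace
        limiter = "CPU" if CPU_MS_PER_FRAME >= gpu_ms else "GPU"
        print(f"{name}: {1000 / frame_ms:.0f} fps, {limiter}-bound")

In this model, swapping in a faster CPU only moves the 1080p number; the 1440p and 4K numbers don't budge, which is exactly why reviewers benchmark CPUs at 1080p.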


Nearly the same thing, give or take a few percent. (https://en.wikipedia.org/wiki/2K_resolution)


2560x1440 has 78% more pixels than 1920x1080, and it's also the performance sweet spot for high-end GPUs.
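
For anyone checking the arithmetic (assuming "2k" here means 2560x1440):

    # Quick pixel-count comparison in plain Python
    qhd = 2560 * 1440   # 3,686,400 pixels
    fhd = 1920 * 1080   # 2,073,600 pixels
    print(qhd / fhd)    # ~1.78, i.e. about 78% more pixels than 1080p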


2560x1440 is often called "2k"


Nothing, AFAIK



