26% at 1080p, which no one buying a $500+ CPU every generation is actually running at. The benefits will be minimal at 2k and 4k with the same GPU and any decent processor from the last 5-10 years.
While I agree with the sentiment here... this has been the argument Intel has been using as the "last excuse to buy Intel over AMD" - if you bought a fast enough video card but played your games at 1080p on a really high refresh rate monitor... the gaming performance was better on Intel.
So AMD focuses on it here to say "look, your very last excuse for choosing Intel over AMD is no longer valid."
Of course I do more "non-gaming" than gaming, so it wasn't very important to me in the first place, and I don't spend enough on a graphics card for this to matter. But I want a lot of cores for fast compilation and great multi-tasking with containers and virtual machines.
Yes of course - for non-gaming, dev, and media work these will be beasts. I was just confused about why they were so focused on the 1080p gaming performance benefits. Buying a whole new setup for gaming would be a waste of money if you're already running any relatively modern CPU. At 1080p you're probably already killing it in framerate, and at 2k+ the benefits just aren't there.
Generally, GPU load scales with resolution and graphical fidelity, while CPU load mostly scales with framerate, irrespective of resolution or graphics settings. So you might be CPU-bottlenecked at max settings 1080p with an average CPU and a mid-to-high-end GPU, but even with the current highest-end GPUs and a mid-range CPU you're likely not bottlenecked by the CPU at 1440p or 4k, because the GPU isn't pushing out as many frames.
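A rough way to picture it: delivered framerate is roughly the minimum of what the CPU can simulate per second and what the GPU can render per second at a given resolution, and only the GPU side drops as resolution rises. Here's a toy Python sketch with made-up numbers (not benchmarks), assuming GPU throughput scales inversely with pixel count, which is a crude simplification:

    # Toy bottleneck model with hypothetical numbers (not benchmarks).
    # Assumption: CPU-limited FPS is roughly resolution-independent,
    # GPU-limited FPS falls with pixel count; delivered FPS is the min.

    PIXELS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4k": 3840 * 2160}

    cpu_fps = 160          # hypothetical mid-range CPU limit
    gpu_fps_1080p = 240    # hypothetical high-end GPU limit at 1080p

    for res, px in PIXELS.items():
        gpu_fps = gpu_fps_1080p * PIXELS["1080p"] / px  # crude inverse-pixel scaling
        fps = min(cpu_fps, gpu_fps)
        limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
        print(f"{res}: ~{fps:.0f} fps ({limiter}-bound)")

With those numbers you're CPU-bound at 1080p but GPU-bound at 1440p and 4k, which is the pattern the comment above describes.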