I was not gonna comment, but wow, they really are not backing down. From their Intel vs AMD value page:
> We don't
> Put lipstick on pigs for sponsorship fees, our users are our only sponsors.
> Care for brands: red, green or blue. PC hardware isn’t a fashion show, performance comes first.
> Test at 1440p or 4K: high resolutions are rarely optimal for gaming (refresh rate > size > resolution).
> Get fooled by the corporate army of anonymous forum and reddit influencers that prey on first time buyers.
Righto. So they don’t shill. And they know how to benchmark and measure the right things; except for the period when they accidentally showed AMD topping the charts and then had to adjust their "expert" benchmark scores. 4K gaming isn’t real; it’s a conspiracy invented by Big GPU, and no gamers want it because clearly the only thing that matters in gaming graphics is chasing higher FPS. And not only are they not shills, YOU are!
What a convincing argument. More reputable names in the benchmarking scene considered UserBenchmark poorly executed to begin with, but wow, they really do not know how to take an L, at all. And of course it's convenient that the cases where AMD processors succeed happen to be the irrelevant ones.
Now, I’m not playing games most of the time, so a high framerate in games is hardly important to me. But who would I rather get advice from: sour-grapes UserBenchmark, or literally any other reputable benchmarking site? They inch closer and closer to blatant SEO spam every year.
I know Intel is not good at PR, but they really ought to pay these people... to stop making them look bad.
> Test at 1440p or 4K: high resolutions are rarely optimal for gaming (refresh rate > size > resolution).
This point especially is such nonsense. If you're only gaming at 1080p you don't even need to be looking at benchmarks, just go buy literally any current midrange GPU and enjoy your capped 144fps in esports and like 80+ in AAA games.
Their obsession with "benchmarking with the most popular games" is silly for the same reason - there's no point in comparing a top end GPU or CPU on Fortnite or CS:GO performance, because stuff that costs half as much already caps out the max framerate on a 144hz monitor on those games. (Of course there are CS:GO pros who are trying to get absurd FPS for competitive reasons, but that's a tiny fraction of players, most people aren't going to care about the difference between 300fps and 400fps on a 144hz monitor).
I really hate the whole "I need to run super high FPS to lower my input latency even though my monitor can only display a third of that; vsync is bad" crowd.
They're not wrong that you need that level of GPU to get a particular level of input latency, but the reason you need to run at a framerate so much higher than your display's refresh rate is that those games have absolutely garbage frame pacing.
Take a page out of the desktop compositor world: if you (the application) know you're rendering at 3x the monitor's refresh rate, wait to sample inputs and render until 2/3 of the frame interval has passed. Save your GPU the effort (and the power consumption).
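To make the idea concrete, here's a minimal sketch of that compositor-style "late input sampling" loop. Everything here is hypothetical (the `sample_inputs`/`render` callbacks, the render-time estimate, the 0.5 ms safety cushion are all assumptions, not any real engine's API); it just shows the core trick of sleeping until late in the frame interval before sampling inputs:

```python
import time

REFRESH_HZ = 144
FRAME_INTERVAL = 1.0 / REFRESH_HZ  # ~6.94 ms per displayed frame

def run_frame(frame_deadline, render_time_estimate, sample_inputs, render):
    """Render one frame, sampling inputs as late as safely possible.

    frame_deadline: monotonic timestamp by which the frame must be done.
    render_time_estimate: how long we expect rendering to take (seconds).
    """
    # Instead of sampling inputs and rendering at the start of the
    # interval, sleep until just before the deadline, leaving enough
    # margin for the estimated render time plus a small safety cushion
    # (hypothetical 0.5 ms, to absorb scheduler jitter).
    safety = 0.5e-3
    wake_at = frame_deadline - render_time_estimate - safety
    now = time.monotonic()
    if wake_at > now:
        time.sleep(wake_at - now)
    # Inputs are sampled as late as possible, so the frame that reaches
    # the screen reflects the freshest input state -- the same latency
    # win as brute-forcing 3x the refresh rate, without burning the GPU
    # on frames the monitor will never show.
    inputs = sample_inputs()
    return render(inputs)
```

The design point is that latency comes from *when* inputs are sampled relative to scanout, not from raw FPS, so a well-paced loop at the display's refresh rate can match the latency of an uncapped one.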
https://cpu.userbenchmark.com/AMD-Ryzen-9-5900X/Rating/4087