I feel lucky that I decided to buy a Brother printer instead of an HP. Super easy to set up: press the WPS button once to connect to Wi-Fi, and I could start printing from my Ubuntu laptop without installing anything.
Last I heard, Fedora 37 will remove hardware acceleration for decoding video via VA-API. Not sure if there is a workaround yet, but I imagine it will be a deal-breaker for many people.
I have the same concern, and it's a major one for me. I don't like switching to a distro that destroys such basic QoL functionality as HW decode and turns your laptop into a loud, battery-draining stovetop when you watch a YouTube tutorial, without also providing a quick and easy workaround. And I have not found any quick and easy fix for reverting this (all my search results are just threads of people annoyed with the decision rather than solutions), which is why I'll be staying away from vanilla Fedora for now, despite it being a favorite distro of mine. Worse, my other favorite distro, OpenSUSE, will follow suit on this, despite being an EU company not affected by the US software patent issue.
The super ironic thing is that, after the Linux community spent so many years preaching "AMD works best with Linux, always buy AMD, f*ck Nvidia", only AMD users are affected by the axing of VA-API: Nvidia and Intel have alternative/proprietary APIs for hardware decoding to fall back on, which can easily be enabled out of the box, while the "just works™" AMD users are the ones currently screwed by this.
I hope G.E. from the Nobara Project[0] will fix this and give us a great Fedora 37 spin with batteries included.
This really depends on the size of the screen. For most laptops, I'd argue that 4k is useless; 1440p is high-res enough that you get perfect clarity anyway.
For 83" TVs you obviously need more pixels to get a good viewing experience because you don't position yourself several meters in front of the screen to make the pixels unnoticable.
As an aside, many 4k/1440p videos are not really 4k or 1440p in the sense we would classify a 4k/1440p monitor. The brightness (luma) information is mostly at full 4k/1440p resolution, but the color (chroma) information is stored at a lower resolution.
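For a concrete sense of the gap, here is the standard arithmetic for 4:2:0 chroma subsampling, which is what most web/streaming video uses (nothing here is measured from a real file):

    # A nominally "4K" 4:2:0 frame: brightness (luma) at full resolution,
    # color (chroma) at half resolution in both directions.
    luma_w, luma_h = 3840, 2160
    chroma_w, chroma_h = luma_w // 2, luma_h // 2

    print(f"luma samples:   {luma_w * luma_h:,}")                       # 8,294,400
    print(f"chroma samples: {chroma_w * chroma_h:,} per chroma plane")  # 2,073,600
    # Each color plane only carries 1080p-worth of samples, while a 4k
    # monitor displays full color at every single pixel.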
In addition, common video compression techniques (at least the simple ones I can still understand) basically reconstruct blocks of pixels, and that reconstruction is lossy at best.
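A toy version of that round trip (transform an 8x8 block, quantize the coefficients, reconstruct), just to show that the output is not the input; the quantizer step size is an arbitrary assumption and real codecs are far more sophisticated:

    import numpy as np
    from scipy.fft import dctn, idctn

    rng = np.random.default_rng(0)
    block = rng.integers(0, 256, size=(8, 8)).astype(float)  # one 8x8 block of pixels

    coeffs = dctn(block, norm="ortho")      # frequency-domain representation
    q = 16.0                                # coarse uniform quantizer (arbitrary)
    quantized = np.round(coeffs / q) * q    # information is thrown away here
    reconstructed = idctn(quantized, norm="ortho")

    print("max per-pixel error:", np.abs(block - reconstructed).max())  # nonzero => lossy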
If I had to guess, something similar could be done in games as part of their optimizations.
For me, there's only a difference in the text console. For GUIs I have to halve the resolution to make things usable, so it doesn't really help with anything.
That's great, one less plugin to install.