> Your display is likely only 720p or 1080p, but a 4K video on YouTube will still look a lot better, although technically it should make no visible difference.
Not at all, bitrate differences are audible.
> And just like you need 4K 4:2:0 mp4 video to get even close to the quality of uncompressed 1080p 4:4:4 video, you also need far higher sampling rate and depth of highly compressed audio to get a quality similar to 16bit/44.1kHz PCM.
Even with the same bitrate, you’ll need 4K video to get quality comparable to uncompressed 1080p.
For example, the codecs typically used in mp4 files (such as H.264 in its common profiles) store the brightness channel Y at full resolution, while the color channels Cb and Cr are stored at half resolution in each dimension. So in a 4K mp4 video, even at an effectively lossless bitrate, the actual colors are only stored at 1080p. This is called chroma subsampling.
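To make that concrete, here's a minimal numpy sketch of 4:2:0 storage (the array layout and the `subsample_420` helper are made up for illustration, not any codec's actual API):

```python
import numpy as np

def subsample_420(ycbcr: np.ndarray):
    """Keep luma (Y) at full resolution; keep one chroma (Cb, Cr)
    sample per 2x2 block of luma samples."""
    y  = ycbcr[:, :, 0]        # full resolution
    cb = ycbcr[::2, ::2, 1]    # half resolution in each dimension
    cr = ycbcr[::2, ::2, 2]    # half resolution in each dimension
    return y, cb, cr

# For a 4K frame, Y stays 2160x3840 while Cb and Cr shrink to
# 1080x1920 -- the color really is stored at 1080p.
frame = np.random.rand(2160, 3840, 3)
y, cb, cr = subsample_420(frame)
print(y.shape, cb.shape, cr.shape)   # (2160, 3840) (1080, 1920) (1080, 1920)
```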
Audio codecs do something very similar, with analogous results: they spend fewer bits on, or discard entirely, the high-frequency part of the spectrum.
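Very roughly, the audio analogue looks like this (heavily hedged: real codecs use filterbanks and psychoacoustic models rather than a bare FFT, but at low bitrates the net effect on the top of the spectrum is similar; `crude_lowpass` and the 16 kHz cutoff are illustrative assumptions):

```python
import numpy as np

def crude_lowpass(samples: np.ndarray, sample_rate: int, cutoff_hz: float) -> np.ndarray:
    """Drop everything above cutoff_hz by zeroing FFT bins."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    spectrum[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(samples))

# One second of 44.1 kHz noise: a 16 kHz cutoff, typical of low-bitrate
# MP3 encodes, discards the top ~6 kHz of the audible band.
audio = np.random.randn(44100)
filtered = crude_lowpass(audio, 44100, 16000.0)
```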
That’s again the same issue. Nice theory, and it's true in theory, but real-world algorithms also subsample the high frequencies and scale them back up with nearest-neighbor interpolation (see the sketch below).
The same issue happens with audio: nice theory, completely broken real-world implementations.
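Here's roughly what that nearest-neighbor chroma upscale looks like (a worst-case illustration; real decoders may interpolate instead, and `upsample_nearest` is an invented name):

```python
import numpy as np

def upsample_nearest(chroma: np.ndarray) -> np.ndarray:
    """Duplicate every chroma sample 2x horizontally and vertically,
    turning each stored sample into a blocky 2x2 patch."""
    return np.repeat(np.repeat(chroma, 2, axis=0), 2, axis=1)

cb_small = np.random.rand(1080, 1920)   # chroma as stored, 1080p
cb_full  = upsample_nearest(cb_small)   # blocky 2160x3840 plane
print(cb_full.shape)                    # (2160, 3840)
```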
It's true that nearest neighbor is bad, but the important part is the YUV->RGB conversion. It distributes the high-frequency detail of the full-resolution Y channel into all three RGB channels, which masks most of what the subsampled chroma is missing.
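A minimal sketch of what that conversion does (assuming BT.601 full-range coefficients and floats in [0, 1] with chroma centered at 0.5; `ycbcr_to_rgb` is an illustrative helper, not any particular library's API):

```python
import numpy as np

def ycbcr_to_rgb(y: np.ndarray, cb: np.ndarray, cr: np.ndarray) -> np.ndarray:
    """Convert YCbCr planes (floats in [0, 1]) to an RGB image."""
    cb_c, cr_c = cb - 0.5, cr - 0.5
    r = y + 1.402 * cr_c
    g = y - 0.344136 * cb_c - 0.714136 * cr_c
    b = y + 1.772 * cb_c
    # Y enters R, G and B with coefficient 1: its full-resolution high
    # frequencies survive in all three output channels, while the
    # upsampled chroma only adds low-frequency color on top.
    return np.clip(np.dstack([r, g, b]), 0.0, 1.0)

# With neutral chroma, every RGB channel is exactly the Y plane --
# all of the spatial detail comes from luma.
y = np.random.rand(4, 4)
rgb = ycbcr_to_rgb(y, np.full((4, 4), 0.5), np.full((4, 4), 0.5))
assert np.allclose(rgb[:, :, 0], y)
```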
> Not at all, bitrate differences are audible.
> And just like you need 4K 4:2:0 mp4 video to get even close to the quality of uncompressed 1080p 4:4:4 video, you also need far higher sampling rate and depth of highly compressed audio to get a quality similar to 16bit/44.1kHz PCM.
…how so?