So I had to use a calculator to help me here (https://toolstud.io/video/bitrate.php): the raw bitrate for 4K@25fps/24bit is 4.98Gbps, which then obviously gets compressed by whatever codec is in use.
Taking the above 4K@25fps/24bit and pumping it up to 60fps and 36bit colour (i.e. 12 bits per channel, or 68 billion colours, 4096x as many colours as 24bit, and 64x as many colours as 30bit) the resulting raw video bitrate is 17.92Gbps... so it's an increase of <checks notes> about 3.6x.
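If anyone wants to sanity-check that without the online calculator, the raw (uncompressed) bitrate is just width x height x fps x bits-per-pixel. A quick Python sketch, assuming "4K" here means the 3840x2160 UHD resolution (which is what produces the 4.98Gbps figure):

    def raw_bitrate_gbps(width, height, fps, bits_per_pixel):
        # Uncompressed bitrate in Gbps, ignoring chroma subsampling and blanking intervals.
        return width * height * fps * bits_per_pixel / 1e9

    base = raw_bitrate_gbps(3840, 2160, 25, 24)    # ~4.98 Gbps
    bigger = raw_bitrate_gbps(3840, 2160, 60, 36)  # ~17.92 Gbps
    print(base, bigger, bigger / base)             # ratio is ~3.6x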
It seems quite unlikely that we'd get every other aspect of 36bit/60fps video sorted out, yet somehow end up with codecs that perform worse than what's already available today.
My understanding is that today's HDR sensors and displays can do ~13 stops of dynamic range, while humans can see at least ~20, though I'm not sure how to translate that into how much additional bit depth ought to be needed (naively, I might guess at 48 bits being enough).
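For what it's worth, one naive way to translate stops into bits (this is a back-of-the-envelope assumption, not how real HDR formats work): each stop is a doubling of luminance, so a purely linear encoding needs roughly one extra bit per channel per stop. Real formats use nonlinear transfer functions (e.g. PQ) and get away with far fewer bits, but as a sketch:

    # Naive model: one bit of linear precision per stop of dynamic range, per channel.
    for stops in (13, 20):
        bits_per_channel = stops  # linear-encoding assumption
        print(stops, "stops ->", bits_per_channel, "bits/channel,",
              3 * bits_per_channel, "bits/pixel")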
I don't see why we'd stop at 60fps when 120 or even 240 Hz displays are already available, and 8K displays already exist too. The codecs also have tunable quality, and obviously no one is sending lossless video, so we can always increase the quality level when encoding.
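To put a rough number on that headroom, plugging a hypothetical 8K (7680x4320) panel at 240Hz and 36bit colour into the same raw-bitrate arithmetic as above:

    # Raw bitrate for a hypothetical 8K @ 240Hz, 36bit signal (same formula as before).
    print(7680 * 4320 * 240 * 36 / 1e9)  # ~286.7 Gbps, roughly 58x the 4K@25fps/24bit figure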
So it's true in 2023 (especially since no one will stream video at that quality to you), but one can easily imagine boring, incremental technology improvements that would demand more. There's plenty of room for video quality to increase before we reach the limits of human eyes.