Their idea is that you won't ever need your UI to be this bright, I guess? It's Apple, that's what they do — they build things that work optimally for most people. They aren't wrong about it in this particular case either. The MacBook Pro display does get bright enough for me as is to be readable in direct sunlight.
But if you do want to "use the full potential of your hardware", there was some third-party app that used private APIs to set the screen brightness above that limit. I don't remember its name.
This sounds incredibly stupid to me, giving some unspecified range of content providers access to things that users don't have access to. But that's Apple for you I guess.
The problem is that when this was first rolled out, no content was designed for it.
So if they just mapped everything to the new brightness range, everything, everywhere, would look wrong. People would complain that the iPhone was broken. Developers would have to redo all their websites/apps, and once they did, those would look wrong on every other device.
This is the only sane way: it has to be something people opt into. That's what Apple did.
The colors are meant to be within a calibrated (sRGB) color space. Images and videos can request a different color space that includes the HDR range. CSS can also request colors outside that space, but that is done with extended RGB triples (e.g. rgb(999, 999, 999) for ultra white).
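To make the "extended" part concrete, here's a rough Swift sketch of the native-API equivalent (the specific color space and values are my own illustration, not something from this thread): component values above 1.0 mean "brighter than SDR reference white".

    import CoreGraphics

    // Extended sRGB: 1.0 is ordinary sRGB reference white; anything above it
    // is "ultra white" that only renders on a display with EDR headroom.
    let extendedSRGB = CGColorSpace(name: CGColorSpace.extendedSRGB)!
    let sdrWhite   = CGColor(colorSpace: extendedSRGB, components: [1.0, 1.0, 1.0, 1.0])!
    let ultraWhite = CGColor(colorSpace: extendedSRGB, components: [2.0, 2.0, 2.0, 1.0])!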
The difference is that while most things support sRGB, those other color spaces may simply be outside what the display can handle. My UltraFine 5K, for instance, does not show a discernible difference between the two QR codes.
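If you want to check what your own screen reports, here's a quick macOS sketch using NSScreen's EDR headroom property (a value of 1.0 means the display can't currently show anything brighter than SDR white, which would explain the identical-looking QR codes):

    import AppKit

    // Print how far above SDR white (1.0) each attached display can go right now.
    for screen in NSScreen.screens {
        let headroom = screen.maximumExtendedDynamicRangeColorComponentValue
        print("\(screen.localizedName): EDR headroom = \(headroom)")
    }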
You also have the issue that static images displayed at higher brightness use more power and force burn-in mitigations to kick in sooner, so an 'ultra white' background may simply not be something that gets supported for a web page.
> The colors are meant to be within a calibrated (sRGB) color space.
If this were true, monitor brightness would be hard-set at 80 cd/m², which would be borderline unusable during the day and way too bright in the dark. But hey, true sRGB colours!
What do you mean by "some unspecified range of content providers"? You can edit your own HDR videos on it too, and Affinity Photo allows using the HDR mode for viewing raw photos. The APIs to sear users' retinas are there; they are public and available to all native apps. It's just that there's a very strict distinction between SDR and HDR content.
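For anyone curious what that opt-in looks like, here's a minimal sketch with a Metal-backed layer on macOS (assuming you render your own content; this is just the gist, not a complete render setup):

    import CoreGraphics
    import Metal
    import QuartzCore

    // HDR output stays off unless the layer explicitly asks for it;
    // that's the strict SDR/HDR split mentioned above.
    let layer = CAMetalLayer()
    layer.wantsExtendedDynamicRangeContent = true    // opt into EDR
    layer.pixelFormat = .rgba16Float                 // leaves room for values > 1.0
    layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)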
Maybe it makes sense inside the Apple reality distortion field, but in the rest of the world the monitor's job is to represent colours as well as it can, from the current black to the highest possible white, utilizing its complete dynamic range, and it's the tonemapper's job to convert HDR to monitor colours.
I imagine very few people, mostly graphics designers, want true sRGB colours. The rest (i.e. normal people) adjust the brightness to the ambient conditions, adjust their eyes to the current "white", and expect everything to follow suit.
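To put a concrete face on "the tonemapper's job": here's a toy Swift version of a Reinhard-style curve (purely illustrative on my part, not anything Apple or any compositor actually ships) that decides how scene luminance gets squeezed into whatever range the display has:

    // Toy Reinhard-style tonemapping: `whitePoint` is the smallest scene
    // luminance that should map to display white (1.0); anything brighter
    // gets clamped instead of blowing out the whole picture.
    func tonemap(_ luminance: Double, whitePoint: Double = 4.0) -> Double {
        let l = max(luminance, 0)
        let mapped = l * (1 + l / (whitePoint * whitePoint)) / (1 + l)
        return min(mapped, 1.0)
    }

    // e.g. tonemap(0.5) ≈ 0.34, tonemap(4.0) == 1.0, tonemap(40.0) clamps to 1.0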
> Maybe it makes sense inside the Apple reality distortion field
This is not Apple specific, and it's not what HDR is designed for. No implementation works the way you expect. Linux doesn't even support HDR at all; when I plug my non-Apple HDR monitor into my Linux desktop, the brightness remains capped at SDR since there is no hardware support. Even when Linux does eventually support the hardware, it is unlikely that any UI will use HDR by default. The UI would need to be redesigned for HDR specifically, and it is way too bright for sustained use on a UI, uncomfortably so, to the point where prolonged exposure may damage eyesight. HDR is intended for dynamic scenes that are occasionally bright in some parts of the image, as in TV and movies. I would never use it as a default global brightness level.