Maybe it makes sense inside the Apple reality distortion field, but in the rest of the world the monitor's job is to represent colours as best it can, from its current black level to its highest possible white, utilizing its complete dynamic range; it's the tonemapper's job to convert HDR into monitor colours.
I imagine very few people, e.g. graphic designers, want true sRGB colours. The rest (i.e. normal people) adjust the brightness to the ambient conditions, adapt their eyes to the current "white", and expect everything to follow suit.
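A minimal sketch of the tonemapper's job described above: the display covers a fixed output range, and a tone-mapping curve squeezes arbitrary HDR luminance into it. This uses the extended Reinhard operator purely as an illustrative example (the `l_white` parameter and clipping choice are assumptions, not what any particular OS does); real pipelines use more sophisticated curves.

```python
def reinhard_extended(l_in, l_white=4.0):
    """Map an HDR luminance value (>= 0) into the display's [0, 1] range.

    l_white is the HDR luminance that should map to the display's
    maximum white; anything brighter is clipped to 1.0. Both the
    operator and the default value here are illustrative assumptions.
    """
    return min(1.0, (l_in * (1.0 + l_in / (l_white * l_white))) / (1.0 + l_in))

# Dark values stay nearly linear, while very bright HDR values roll
# off smoothly toward the display's maximum white instead of clipping
# harshly at the top of its dynamic range.
for l in (0.0, 0.5, 1.0, 4.0, 16.0):
    print(round(reinhard_extended(l), 3))
```

The point of the curve is exactly the one made above: the monitor keeps using its full range from black to brightest white, and the mapping from scene luminance to that range is the software's responsibility.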
> Maybe it makes sense inside the Apple reality distortion field
This is not Apple specific, and it is not what HDR is designed for. No implementation works the way you expect. Linux doesn't even support HDR at all: when I plug my non-Apple HDR monitor into my Linux desktop, the brightness remains capped at SDR because there is no driver support. Even when Linux does eventually support the hardware, it is unlikely that any UI will use HDR by default; the UI would need to be redesigned specifically for HDR. Full HDR brightness is far too bright for sustained use on a UI, uncomfortably so, to the point where it may cause damage to eyesight with prolonged use. HDR is intended for dynamic scenes that are occasionally bright in some parts of the image, as in TV and movies. I would never use that as a default global brightness level.