Couldn't all of the protections on current delivery be implemented in the interface on the other end of the cable, though? That's the approach taken by basically every other standard for negotiated power delivery. Even things like 802.3af manage to deliver 48V at a few hundred milliamps, with the complicating case that many perfectly valid devices can't tolerate any current being delivered to them at all.
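To make that concrete, here's a minimal sketch of the source side of the 802.3af idea: the power sourcing equipment probes the port at a low voltage and only switches the 48V on if it sees roughly the 25 kΩ detection signature, so a device that can't tolerate the current never receives it. The thresholds are the commonly quoted spec values rounded off, and the function names are mine, not anything from the standard.

```python
# Illustrative PSE-side detection logic, assuming approximate 802.3af numbers.
VALID_SIGNATURE_OHMS = (19_000.0, 26_500.0)  # ~25 kOhm detection signature window
OPERATING_VOLTS = 48.0                       # only applied after detection passes

def looks_like_powered_device(load_ohms: float) -> bool:
    """True if the resistance seen during the low-voltage probe is a valid signature."""
    lo, hi = VALID_SIGNATURE_OHMS
    return lo <= load_ohms <= hi

def pse_port_voltage(load_ohms: float) -> float:
    """Voltage the source applies: 48V for a valid signature, otherwise the port stays unpowered."""
    return OPERATING_VOLTS if looks_like_powered_device(load_ohms) else 0.0

if __name__ == "__main__":
    print(pse_port_voltage(25_000.0))      # PoE device -> 48.0
    print(pse_port_voltage(float("inf")))  # plain NIC looks open-circuit -> 0.0
```

The point being: all of the "don't hurt devices that can't take power" logic lives in the supply end, not in the cable.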
The other end being USB-A power adapters? There’s a vast number of those of varying quality. Some have zero logic and just output 5V/1A. No way they could be trusted for this purpose.
Just as an example, imagine you end up putting a fuse in the cable. If the fuse blows, the cable can be easily replaced. A chip in the iPhone... not so much.
Though likely there's something in the phone _as well_ as the cable.
Those cheap power supplies are ubiquitous, so it's more about Apple building their cables to survive the environment they'll actually be plugged into than about dealing with the fallout of consumers busting up their expensive phones by charging them on whatever's lying around.
I guarantee there was a consumer study where they compared reactions between these scenarios.
The manufacturer knows that if it’s their product in contact when a user is shocked, they will get some of the blame even if it’s “really” the sketchy power supply’s fault. And since some cables already have these safety features, anyone making cables without them will look extra negligent when bad things happen.
Isn't the DC component ignored by default simply because Ethernet devices are galvanically isolated? I thought that even if you applied some DC voltage to Ethernet without proper PoE negotiation, nothing would happen at all.