Is there any reason why most TVs still don't support DisplayPort over USB-C and only support HDMI?
With the recent drama around HDMI vs. FOSS drivers on Linux, I'm curious why we haven't seen a bigger push from TV vendors to support USB-C DisplayPort.
Most of my monitors work with USB-C, even consoles like the Steam Deck work with it, and a lot of high-end phones seem to support USB-C output too.
So... are there any features of HDMI that USB-C DisplayPort doesn't support?
DisplayPort in native mode lacks some HDMI features, such as Consumer Electronics Control (CEC) commands. The CEC bus links multiple sources to a single display and lets any of those devices be controlled from any remote. HDMI has featured CEC since its very first version, because connecting multiple sources to a single display is the typical TV scenario; DisplayPort only added the possibility of transmitting CEC commands over its AUX channel in version 1.3.
Going the other way, DisplayPort's Multi-Stream Transport (MST) allows connecting multiple displays to a single computer source.
This reflects the fact that HDMI originated with consumer electronics companies, whereas DisplayPort is owned by VESA, which started as an organization for computer standards.
DisplayPort->HDMI was the only way I could get my PC to control the attached TV over CEC, since (apparently) the CEC pins aren't wired up on most consumer GPUs, but the DisplayPort AUX channel can be used as an alternative.
So the Linux DisplayPort driver at least supports the feature, whether or not it's fully standardized/required.
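To make the CEC discussion concrete, here's a small sketch of how a CEC message is laid out on the wire: a header byte packing two 4-bit logical addresses (initiator and destination), followed by an optional opcode and operands. The opcode values come from the HDMI-CEC spec; the `cec_frame` helper itself is just an illustration, not any real library's API.

```python
# Well-known CEC opcodes (values from the HDMI-CEC specification)
IMAGE_VIEW_ON = 0x04   # wake the TV and switch to this source
STANDBY       = 0x36   # put the target device into standby

def cec_frame(initiator: int, destination: int, opcode=None,
              operands: bytes = b"") -> bytes:
    """Build the byte sequence of a CEC message (hypothetical helper).

    Logical addresses are 4-bit values: 0 = TV, 4 = Playback Device 1,
    15 = broadcast.
    """
    if not (0 <= initiator <= 15 and 0 <= destination <= 15):
        raise ValueError("logical addresses are 4-bit values")
    header = bytes([(initiator << 4) | destination])
    if opcode is None:
        return header          # a header-only frame acts as a "ping"
    return header + bytes([opcode]) + operands

# Playback Device 1 (address 4) asks the TV (address 0) to wake up:
frame = cec_frame(4, 0, IMAGE_VIEW_ON)
# frame == b'\x40\x04'
```

On Linux, frames like this are what the kernel's CEC framework (and tools such as `cec-ctl` from v4l-utils) ultimately put on the bus, whether the transport is HDMI's dedicated CEC pin or the DisplayPort AUX channel.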
DRM is pretty much mandatory for HDMI in home entertainment setups, and is controlled by the media companies' cabal, which matters to vendors in that space (who are often members too).
Also, in a lowest-common-denominator TV-format HDMI implementation, the HDCP decoder is the only complex part; you can theoretically drive a dumb panel with a few fixed-function ICs that aren't even HDMI-specific. In the simplest, dumbest case you can advertise the bare minimum supported display mode in the EDID, plop down a few buffers and DACs, and drive an analog TV off it.
Meanwhile, even the simplest possible DisplayPort implementation requires that your display handle packetized data sent over a PCIe-like PHY layer, with more complex data structures and setup negotiation than "put an I2C ROM chip here for the EDID, a few DACs and buffer logic there, and you've got HDMI without HDCP out to an analog CRT".
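The "I2C ROM chip" part really is that simple: over DDC the sink just serves a 128-byte EDID block whose layout is fixed by the VESA E-EDID standard. A minimal sketch of validating such a block, assuming only the standard's fixed header and checksum rules (the sample bytes below are made up, not a real display's EDID):

```python
# A base EDID block is 128 bytes, begins with this fixed 8-byte header,
# and byte 127 is a checksum chosen so all 128 bytes sum to 0 mod 256.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_valid(block: bytes) -> bool:
    """Check the structural validity of a base EDID block."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

# Build a toy block: fixed header, zero padding, checksum in byte 127.
toy = bytearray(128)
toy[:8] = EDID_HEADER
toy[127] = (-sum(toy)) % 256
print(edid_block_valid(bytes(toy)))  # True
```

That's the entire "intelligence" a dumb HDMI/DVI sink needs to present to the source, which is why the comment above about fixed-function ICs holds up.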
Which is why, in my experience, computer hardware commonly has one or maybe two HDMI ports (among other reasons, to support things like connecting a console to your monitor) and multiple DisplayPort connectors. Sometimes it gets ridiculous: a maximalist count on my laptop as currently docked, counting ports implemented in the GPU rather than those physically available, gives me one HDMI and 15 DisplayPort outputs, with the HDMI being routable to a bunch of options. That's a silly comparison, of course, but depending on how the licensing is worded I would be entirely unsurprised if computer manufacturers optimized for a lower HDMI port count.
Also, since older HDMI versions were electrically compatible with single-link DVI (a passive adapter suffices), a lot of computers preferred to keep DVI ports...
>Is there any reason why most TVs still don't support DisplayPort over USB-C and only support HDMI?
Personally I’m convinced it’s mostly because the display manufacturers want to discourage the use of TVs as monitors, in order to protect their margins on monitors.
8K monitors should be sub-$1000 by now and standard for anyone working with screens. You can get that as a TV, but not as a monitor. :(