
Is there any reason why most TVs still don't support DisplayPort over USB-C and only support HDMI?

With the recent drama around HDMI and FOSS drivers on Linux, I'm curious why we haven't seen a bigger push from TV vendors to support DisplayPort over USB-C.

Most of my monitors work with USB-C, and even consoles like the Steam Deck and a lot of high-end phones seem to support it.

So... are there any features of HDMI that DisplayPort over USB-C doesn't support?



DisplayPort in native mode lacks some HDMI features, such as Consumer Electronics Control (CEC) commands. The CEC bus allows linking multiple sources to a single display and controlling any of those devices from any remote. HDMI has featured CEC from its very first version to support connecting multiple sources to a single display, as is typical for a TV screen; DisplayPort only added the possibility of transmitting CEC commands over the AUX channel in version 1.3.

Going the other way, DisplayPort's Multi-Stream Transport (MST) allows connecting multiple displays to a single computer source.

This reflects the fact that HDMI originated with consumer electronics companies, whereas DisplayPort is owned by VESA, which started as an organization for computer standards.
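For a sense of how simple CEC is at the wire level: a message is just a short byte frame, with a header byte packing 4-bit initiator and destination logical addresses, followed by an opcode and optional operands. A rough illustrative sketch (the addresses and opcodes below are from the public CEC tables; the helper function name is made up):

```python
# Illustrative sketch of CEC message framing (helper name is made up).
# A CEC frame is: header byte = (initiator << 4) | destination,
# then an opcode byte and optional operands.
# Logical address 0 is always the TV; 4 is the first playback device.

# A few well-known CEC opcodes:
IMAGE_VIEW_ON = 0x04   # "wake up and switch to my input"
STANDBY       = 0x36   # put the target into standby
ACTIVE_SOURCE = 0x82   # announce ourselves as the active source

def cec_frame(initiator: int, destination: int, opcode: int, params: bytes = b"") -> bytes:
    """Build the byte sequence for a single CEC message."""
    header = ((initiator & 0xF) << 4) | (destination & 0xF)
    return bytes([header, opcode]) + params

# Playback device (logical address 4) telling the TV (address 0) to wake up:
frame = cec_frame(4, 0, IMAGE_VIEW_ON)
print(frame.hex())  # 4004
```

Two bytes on a shared one-wire bus is all it takes for "turn on the TV", which is why every remote in an HDMI chain can poke every device.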


> From its very first version HDMI features CEC

That depends on whether or not you consider single-link DVI-D as the "very first version" of HDMI.


DP has a different management interface that's actually better because it's a standard unlike CEC.


I imagine CEC could be added to DisplayPort.


A DisplayPort->HDMI adapter was the only way I could get my PC to control the TV it was attached to over CEC, since (apparently) the CEC pins aren't enabled on most consumer GPUs, but the DisplayPort AUX channel can be used as an alternative.

So the Linux DisplayPort driver at least supports the feature, whether or not it's fully standardized/required.


I don't know how true it is, but I heard that

- TV manufacturers are often (always?) members of the HDMI consortium, meaning they financially profit from each device that has an HDMI port.

- Manufacturers of devices with HDMI ports are heavily discouraged from also including competing ports like DP.


DRM is pretty much mandatory for HDMI in home entertainment setups, and it's controlled by the media companies' cabal, which is important to vendors in that space (who are often members too).

Also, in a lowest-common-denominator HDMI TV implementation, the HDCP decoder is the only complex part; you can theoretically drive a dumb panel with a few fixed-function ICs that aren't even specific to HDMI. In the simplest, dumbest case you advertise the bare minimum supported display mode in the EDID, then plop in a few buffers and DACs and drive an analog TV off it.

Meanwhile, even the simplest possible DisplayPort implementation still requires that your display handle packetized data sent over a PCIe-like PHY layer, with more complex data structures and setup information than "plop an I2C ROM chip here, a few DACs and buffer logic there, and you've got HDMI without HDCP to an analog CRT".
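The "I2C ROM chip" mentioned above holds the EDID: a 128-byte block the source reads over DDC (I2C at address 0x50) to learn what the display supports. A toy sketch of that structure, with field offsets per the public VESA EDID 1.x layout (the vendor ID "ACM" and the helper names are arbitrary):

```python
# Toy sketch of the 128-byte EDID base block a minimal HDMI/DVI sink
# exposes over DDC. Offsets follow the VESA EDID 1.x layout; only the
# header, manufacturer ID, version, and checksum are filled in here.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def pack_manufacturer(pnp_id: str) -> bytes:
    """Pack a 3-letter PnP ID into the 2-byte field at offset 8 (5 bits per letter, 'A' = 1)."""
    a, b, c = (ord(ch) - ord('A') + 1 for ch in pnp_id)
    return (((a << 10) | (b << 5) | c)).to_bytes(2, "big")

def make_edid(pnp_id: str = "ACM") -> bytes:
    block = bytearray(128)
    block[0:8] = EDID_HEADER
    block[8:10] = pack_manufacturer(pnp_id)
    block[18] = 1   # EDID version
    block[19] = 3   # EDID revision (1.3)
    # Checksum: all 128 bytes must sum to 0 mod 256.
    block[127] = (256 - sum(block) % 256) % 256
    return bytes(block)

def decode_manufacturer(edid: bytes) -> str:
    value = int.from_bytes(edid[8:10], "big")
    return "".join(chr(((value >> s) & 0x1F) + ord('A') - 1) for s in (10, 5, 0))

edid = make_edid("ACM")
assert edid[:8] == EDID_HEADER and sum(edid) % 256 == 0
print(decode_manufacturer(edid), f"EDID {edid[18]}.{edid[19]}")  # ACM EDID 1.3
```

That really is the whole "brains" of a dumb sink: a static ROM the source polls, with everything else doable in fixed-function logic.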


The world doesn't care about FOSS drivers.

Anyway, HDMI is for TVs and DisplayPort is for monitors. They're both entrenched enough that it doesn't make much sense to try to cross over.


Ok, but why not just try to converge on one connector? As you say, the world doesn't care...


Fundamentally different goals.

HDMI world wants DRM because other than Kim Dotcom (who is highly problematic for other reasons) no one has dared to stand up to the MAFIAA goons.

Computing world wants DisplayPort and a hassle-free, high quality experience. DRM breaks that.


The manufacturers very much care - every HDMI port (physical or virtual) involves a tithe to the HDMI consortium, whereas DisplayPort does not.


I don't follow - surely if the manufacturers have to pay for every port, they'd like something that isn't costing them money?


Which is why, in my experience, it's common on computer hardware to see one, maybe two HDMI ports (among other reasons, to support things like connecting a console to your monitor) and multiple DisplayPort connectors. Sometimes it gets ridiculous: a maximalist count on my laptop, as currently docked, of "ports implemented in the GPU" (even if not all physically available) gives me one HDMI and 15 DisplayPort, with the HDMI being routable onto a bunch of options. Of course that's a silly comparison, but depending on how the licensing is worded, I would be really unsurprised if computer manufacturers optimized for a lower HDMI port count.

Also, since older HDMI was electrically compatible with single-link DVI-D, a lot of computers preferred to keep DVI ports...


>why most TVs still don't support USB C display port and just support HDMI

Cost. USB-C has much more overhead. For example, people expect they'll be able to charge with it and have USB passthrough.


>Is there any reason why most TVs still don't support USB C display port and just support HDMI?

Personally I’m convinced it’s mostly because the display manufacturers want to discourage the use of TVs as monitors, in order to protect their margins on monitors.

8K monitors should be sub-$1000 by now and standard for anyone working with screens. You can get that as a TV, but not as a monitor. :(


CEC is probably one of them, but I'm not sure that's a bad thing.


I don't think DisplayPort has an equivalent to eARC either.



