Displayport: A Better Video Interface (hackaday.com)
828 points by zdw on July 11, 2023 | 411 comments



This is a really well written article. If you haven't bothered reading about DisplayPort because you knew VGA, somewhat knew DVI, and thought HDMI was the culmination of it all, with DisplayPort just some further evolution, this article does a really good job of explaining how DP is very different and something new, something worth knowing about.

The core sentence that actually made me read the article was "DisplayPort sends its data in packets." The article does a good job of explaining what this means and how it differs from HDMI.


As a habitual comments skimmer, thanks for selling me on the article. :)


Do you know what sucks? DisplayLink. It's 2023, and the best we have for 3 monitors (MacBook Pro, and 2 external monitors) is basic screen scraping with software drivers if you want to use a single cable... otherwise you have to fall back to 2 cables (one for power + first monitor, and one HDMI).

The other reason why DisplayLink sucks is performance (or the lack thereof) - which they don't tell you on the box. Video is like 16 FPS on my third monitor!


This is only because MacBooks do not support MST. On all of my non-MacBook laptops I can use multiple monitors over USB-C.


MacBooks (at least the Intel ones) support MST with another OS like Linux.

The missing support is in macOS.


Ah damn. After your comment and reading around here, looks like it's going to be impossible to support this natively with a single cable.

... there goes my cable feng shui


If you have high-end monitors (4K 120Hz or 6K 60Hz), each monitor will use up all the bandwidth, so using one cable for multiple monitors is a nonstarter.

IIRC there's a way around this limitation with DSC and thunderbolt docks, but it's not worth the effort.


I've run into problems repeatedly when it comes to DP cables. The issue is always the same: lack of enforcement and no barrier to entry allows for too many entrants, whose motives are up to consumers to discern, wasting time and money in the process. What I originally saw as a benefit of an open standard turned out, in practice, to be a point in favor of the closed, high-priced standard—which ended up costing me less in the end.

Something the original article doesn't mention regarding multi-stream (MST) is that it could be used in situations where a defined standard for a certain resolution/refresh rate didn't even exist, to still make it available. For example, the earliest 4K/60Hz monitors relied on DP 1.4's (or 1.2's? I don't remember) ability to address multiple displays to send two signals—each covering one half of the total screen area, i.e. 1920x2160/60Hz—to the same display, which then used them internally to drive two virtual screens, stitched together seamlessly into 3840x2160/60Hz. At the time (around 2013 and for a while thereafter), the maximum supported in single-stream configuration (or by the existing HDMI standard at the time) was 3840x2160/30Hz.
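
Back-of-the-envelope numbers for why the tiled trick worked (my own rough Python check, not from the article; 24 bpp assumed, blanking ignored, and the usual quoted HBR2 / HDMI 1.4 link rates):

    # Back-of-the-envelope check of the tiled-MST trick: two 1920x2160/60
    # streams vs. link capacities. 24 bpp assumed, blanking overhead ignored.

    def video_payload_gbps(width, height, refresh_hz, bpp=24):
        """Active-pixel payload only, no blanking."""
        return width * height * refresh_hz * bpp / 1e9

    tile = video_payload_gbps(1920, 2160, 60)   # one MST tile, ~6.0 Gbps
    full = 2 * tile                             # both tiles, ~11.9 Gbps

    hbr2_payload = 4 * 5.4 * 0.8        # DP 1.2 HBR2, 4 lanes, 8b/10b -> ~17.3 Gbps
    hdmi14_payload = 0.340 * 3 * 8      # HDMI 1.4: 340 MHz TMDS x 3 ch x 8 bits -> ~8.2 Gbps

    print(f"two tiles: {full:.1f} Gbps vs HBR2 {hbr2_payload:.1f} Gbps -> fits")
    print(f"4K60 over HDMI 1.4: {full:.1f} > {hdmi14_payload:.1f} Gbps -> only 30Hz fits")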

You'd think this would be a point in favor of DP—which is certainly what I assumed, at the time. Unfortunately it soon became obvious that because there was no enforcement of DP compatibility—of claiming to support up to a certain version of DP fully, in other words—this meant that most cable manufacturers felt no compunction about lying shamelessly, claiming to support e.g. the 1.2 or 1.4 version at the time (which implied supporting its MST and bandwidth capabilities fully), while doing nothing of the sort.

The lies did not stop there, by the way. If I could, I would post here a photo (which I would happily take this very moment) of such a DP cable—which didn't come especially cheap at the time, by the way. In fact, it took three days' worth of effort, a lot of handwringing, and plain luck in the end (not to mention wasting money on several dud cables, each claimed to be fully compliant, on top of the additional money required) to finally purchase a cable that actually was compliant—one which claimed gold plating as one of its features. Some fine gold that was, with black spots on both sides of the yellowish anodized plug where the metal had oxidized! Why does this matter? Because, as I came to realize, the high barrier to entry created by the HDMI group's high licensing fee also acts to keep away a bunch of unscrupulous manufacturers, which is purely a benefit to consumers!

In addition, I've never had problems with regular-size HDMI plugs—in particular, with removing them. I can't say the same when it comes to DP (especially full-sized), which frequently (by design?) has a button that needs to be pushed in to release a lock that holds the plug in place. The problem is, too many times it's very difficult, if not impossible, to push down this button. Worse yet is the ambiguity this creates: is the button fully depressed? Is it stuck? Am I about to rip out, or at least damage, the underlying hardware, or even just the cable itself? These are thoughts I've had nearly every time while trying to unplug a DP cable, while HDMI (at least the standard size) slides out smoothly, as nothing is put in place to hinder this. In addition, this works well time after time, meaning there is no mechanical fatigue like with mini- and micro-USB.

I can appreciate the idea that DP, when used internally (e.g. in laptops, where any shortcoming would directly reflect on the manufacturer of that laptop), or embedded within another standard, is a great idea, where its low barrier to entry /can be/ (as long as the savings are passed on, that is) a benefit to consumers. However, to claim the same in all applications is simply not supported by the real life outcome of either implementation philosophy.


I hate the lock on DP cables, since I have managed to break 2 of them in situations where HDMI would just slip out. DVI also has a lock (2 screws), but for some reason those connectors never break; why the DP design degraded in this respect is a very big question for me.


You can buy cables without the lock tabs. They work well and still stay in just fine, making me wonder why the stupid infuriating tabs are even necessary. I replaced all of mine after busting my knuckle open wrestling with a locking cable in a tight space.


Thank you for writing this down. I’ve had very similar experiences with DP and been on the HDMI train since.


Can be used over USB-C apparently, which should alleviate your physical complaints.


> how DP is very different and something new

To be fair, if you think you know your video cables but don't know what display port is, you've been out of the loop for over a decade.


The amazing thing about DVI, and by extension HDMI, is that it's just VGA but digital, with all the timing and synchronisation complexity that implies. Recall that DVI can have both an analog (VGA/DVI-A) and digital (DVI-D) signal in the same cable. They aren't independent; they share some pins and have the same timing. You could have made a CRT monitor that used DVI-D just by adding a DAC, though I'm not sure if anyone ever did.

DisplayPort does away with all that legacy. I assume the hardware to implement it is much simpler and more reliable.


This also has side-channel implications, and is a reason why eDP is recommended in voting applications. https://www.eerstekamer.nl/nonav/overig/20160428/richtlijnen...

>DisplayPort uses a scrambler as part of its line encoding in order to flatten the Fourier spectrum of its emissions and suppress spectral peaks caused by particular image contents. This reduces the chances of any particular image content causing a problem spectral peak during SDIP-27 and EMI spectrum measurements. According to the standard, the scrambler reduces spectral peaks by about 7 dB. As a side effect, the scrambler also makes it far more difficult, probably even impractical, for an attacker to reconstruct any information about the displayed image from the DisplayPort emissions. [..] DisplayPort uses a small number of fixed bit rates, independent of the video mode used. Unlike with most other digital interfaces, video data is transmitted in data packets with header and padding bytes, and not continuously with a television-like timing. As a result, DisplayPort cables are not a common source of van-Eck-style video emanations and this again will make it very hard for an eavesdropper to synchronize to the transmitted data.
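
For intuition, here's a toy model of that kind of scrambler. Caveat: DP 1.x is commonly documented as using a 16-bit LFSR with polynomial x^16 + x^5 + x^4 + x^3 + 1 XORed into the data before 8b/10b coding, but treat the bit ordering and reset details below as illustrative rather than spec-accurate:

    # Toy model: XOR the data with a free-running LFSR keystream so that even
    # constant image content looks like pseudo-noise on the wire. In real DP
    # the scrambler is periodically reset by SR control symbols so that the
    # receiver's copy stays in sync with the transmitter's.

    def scramble(data: bytes, seed: int = 0xFFFF) -> bytes:
        state = seed
        out = bytearray()
        for b in data:
            ks = 0
            for _ in range(8):
                msb = (state >> 15) & 1
                # feedback taps for x^16 + x^5 + x^4 + x^3 + 1 (Fibonacci form)
                fb = msb ^ ((state >> 4) & 1) ^ ((state >> 3) & 1) ^ ((state >> 2) & 1)
                state = ((state << 1) | fb) & 0xFFFF
                ks = (ks << 1) | msb
            out.append(b ^ ks)
        return bytes(out)

    plain = bytes(64)                 # constant video data: worst case for EMI peaks
    wire = scramble(plain)            # comes out looking like pseudo-noise
    assert scramble(wire) == plain    # XOR with the same keystream undoes it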

Speaking of which: how does one force HDCP (say, in Linux)? That would transform this technology from a DRM nuisance (strippers easily available on AliExpress) into a Van Eck-countermeasure.


HDCP can't really be stripped, if it's on then something is an HDCP receiver. For v1.4 signals you can probably get something with the leaked key, but not for v2.2 as yet.

HDCP itself needs to be licensed in order to generate it, and that licence is only available to HDMI Adopter companies. However there is code to get some chipsets to turn it on: https://github.com/intel/hdcp

Grab it now before Intel deletes it completely.

HDCP is really a very ugly protocol designed just for anti-copying, I wouldn't build anything relying on it, and everything is harder with HDCP from an AV integration perspective. If you have long links that you want to secure, use something like SDVoE with encryption and authentication (bits are easily flipped in HDCP).


There are devices that will give you HDCP 2.3->1.4 conversion for compatibility reasons (or maybe wink-wink reasons), and after that you can use a HDCP 1.4 stripper. There are also devices that will strip 2.2 outright (they are marketed as HDMI splitters but, oops, one of the two outputs has unencrypted signal, an honest mistake); not sure about 2.3 though. Completely agree that HDCP makes everything harder, but in this scenario it would also make attacker's job of descrambling the eavesdropped signal harder. And this is after they have overcome packetization and spread spectrum hurdles, which the paper suggests is very challenging to do, so would be a (hacky) defense-in-depth for DisplayPort. Even more important for HDMI/DVI which lacks the aforementioned hurdles.


My understanding is that the downside of those downgrade devices is HDCP 1.4 will leave you stuck without 4k and HDR for content that uses HDCP.


Yes you can extract plaintext 1080p but not 4k HDR.


Voting computers are not recommended anyways.


It's still shocking to me that they're used widely, and it also pains me that my 15+ years of railing against digital voting computers is now looked upon the same way as the hard-right-wing's complaints (made up ones, but in a semi-related space to my complaints) about said computers.


There should be a national citizen identity logon system that a voting web site uses; everyone could vote from home, or from a laptop provided by an independent voting facility.


Sounds nice, but computer based voting systems, no matter how secure, clean or simple, are fundamentally opaque to the average citizen. That alone undermines the very democracy it tries to support. Paper based voting systems may be bloody tedious but they’re simple, and anyone can participate.

As a programmer myself, I have to say no to electronic voting systems. We need to keep it low tech.


Also, they may be opaque even to software engineers, it's not like we have magic powers to look at a physical computer and know what code is running.

Even if the code is released:

(1) Who can read and understand tens or hundreds of thousands of lines of probably not very comprehensible code? Not many people can and even fewer will.

(2) Is the code running on the machines the same as the published code?

I don't even know what code is running on my phone or laptop right now.


Even if you work in the space as a software developer, the sheer amount of variance in government voting systems, from registration to petition verification/validation to voting itself, is really wide. Some states have rules that vary from county to county. Some of these states have lots of counties. Election systems are some of the most byzantine you can imagine.

Disclosure: I used to work for an Election Services company.


The process needs to be understandable to non-programmers. Right now, even those who program it don't know every instruction/interaction/oddity from the processor running the code.

I can explain a voting process with pen and paper to a child.


When you vote from home I can show up and put a gun to your kid's head and tell you to vote for Pedro.

When you vote with some sort of association with your name I can look up how you voted after the fact and if you didn't vote for Pedro I can blow up your house.

The fact is that the secret, anonymous ballot is the only valid way of voting.


> When you vote from home I can show up and put a gun to your kid's head and tell you to vote for Pedro

If you could do that at a scale large enough to affect the result, I would think you could also do that at each ballot box and make the vote counters report the results you want them to report.

Also, if you could do that at scale, why would you bother with elections at all?

IMO, the main problem with digital voting is that it makes it possible to change the vote count without needing lots of people to do so.

With thousands of ballot boxes, open counting of votes for each ballot box and the publication of election results per ballot box that’s basically impossible.


In the 2020 election one of the seats for the House of Representatives had an initial margin of victory of six votes. After recounting it was something like twenty. And another had a final margin of victory around one hundred.


I doubt anybody could tell it was going to get that close. The uncertainty likely ranged into the thousands, if not tens of thousands, of votes.

I think few fraudsters will be happy with slightly increasing their odds by swinging the vote by a few hundred votes. Instead, they’ll want almost certainty, if not complete certainty.


Same reason I advocate for ranked choice voting instead of other even better methods of choosing a winning candidate. Some of them require matrix multiplication.

(I still support electronic voting though).


If a voter is too inept to properly operate cryptography, can it be said that their vote is legitimate? Seems to me that not everyone should vote.


Yeah, I've heard variations of that argument spoken non-ironically. It opens a whole can of worms. I'd rather assume, barring special circumstances, that anyone old enough is able to do politics (voting, being elected, being randomly drafted onto a commission or jury…). With the right mechanisms, groups are smarter than we give them credit for.


This "can of worms" is why democracy will fail and a Caesar will eventually ride into town and promise all kinds of things to the median voter. If it is considered immoral to hold voters to any standards, why would it be immoral for politicians to lie, cheat and steal?


No.


You mean election type voting?

Screen scraping is the least of the problems.

Voting machines are simply not trustworthy given these requirements for a free and fair election:

0. Freedom to vote or not

1. Anonymity in that no vote may be associated with the voter who cast it

(These two are most of why e-voting is not trustworthy and are what differentiates banking from voting.)

2. Transparency. The law, means, methods must all be known to the voting public. It is their election. In particular, seeing how a cast vote moves through the entire process and into the final tally is necessary.

3. Oversight. The first two of these foundational requirements, necessary for free and fair public elections, have an important ramification:

The record of votes cast must be human readable and said record must be used directly to realize the final tally of all votes cast.

There is a chain of trust between voter intent and the vote cast. When one casts a vote on physical media, that is a direct record of voter intent. The voter can be sure their intent adds to the tally because they can directly verify their votes cast.

Electronic votes are a vote by proxy. The direct expression of intent is not kept and exists as a bit of skin residue left on the machine interface. Put simply, we ask people to vote and then we discard their expression!

What gets used instead is whatever the machine thought the voter intent was!

Consider this: You have a machine in front of you with two buttons. One brings the world into harmony, the other casts it into war, grief and darkness.

You can:

A. Submit a paper form you fill out manually indicating your choice

, or

B. Press one of the buttons and get a receipt for your records.

Say this machine has an indicator showing your choice and a third "approval" button you use when you feel good about the machine getting your choice right.

How do you, the voter, know that the receipt and/or indicators match what the proxy machine says your choice was?

What happens when we have a dispute? We know the machine can't be trusted to match receipts. With the direct expression of intent lost, what do we bring to court?

We can, and have, brought paper ballots into court to resolve an election facing genuine ambiguity.

I submit to you we can't really be sure who wins an electronic election, unless we also associate your identity with the vote.

Say a bunch of people get to choose the fate of the world this way. What is your confidence level?

How do we know the machines tallied up the winning choices correctly?

What do the tests for that look like and how do they get around the forced trust problem we have with all electronic inputs today?

There is a lot more trouble to discuss. What I put here is the core trouble and computers have always had this problem too.

It just does not present any real difficulty, until elections are brought into the discussion.

Display trust issues are well above the core tech trust issues we face today.


I was guessing that the oddly beautiful 17” Apple CRT Studio Display [1] from 2000 might have been a CRT that used a digital signal cable because it had the short-lived Apple Display Connector, but apparently ADC carried analog too.

[1] https://everymac.com/monitors/apple/studio_cinema/specs/appl...


The ADC carried power, too, which is why Apple went with it.


One (extreeeemely pedantic) difference is that (I think) HDMI begins and ends its vsync pulses when the preceding hblank begins, whereas VGA begins and ends them slightly later when the preceding hsync pulse begins (I documented this at https://nyanpasu64.gitlab.io/blog/crt-modeline-cvt-interlaci...).

I also think that DP rouuuughly reflects CRT timings complete with active vs. blanking intervals, but doesn't actually have a fixed pixel clock (perhaps it does? I didn't quite figure out synchronous/asynchronous pixel transmission when reading the spec) and doesn't transmit hsync pulses once per scanline.
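
For anyone who hasn't stared at modelines: with the fixed-pixel-clock model these interfaces inherit, the line rate and refresh rate fall straight out of the timing totals. Quick sketch (the blanking values are roughly CVT-RB-ish for 1920x1080 and only illustrative):

    # With a fixed pixel clock (VGA/DVI/HDMI style), line rate and refresh
    # rate fall straight out of the modeline totals.

    def modeline_stats(pclk_mhz, hdisp, hblank, vdisp, vblank):
        htotal = hdisp + hblank          # active + front porch + sync + back porch
        vtotal = vdisp + vblank
        hsync_khz = pclk_mhz * 1e3 / htotal     # one hsync pulse per scanline
        vrefresh  = hsync_khz * 1e3 / vtotal    # one vsync per frame
        return htotal, vtotal, hsync_khz, vrefresh

    ht, vt, line_khz, frame_hz = modeline_stats(138.5, 1920, 160, 1080, 31)
    print(f"htotal={ht} vtotal={vt} hsync={line_khz:.1f} kHz refresh={frame_hz:.2f} Hz")
    # -> roughly 66.6 kHz line rate, ~59.9 Hz refresh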


> You could have made a CRT monitor that used DVI-D just by adding a DAC, though I'm not sure if anyone ever did.

From memory, IBM offered a CRT with DVI.


Plenty did but they were usually DVI-A or DVI-I


>DisplayPort does away with all that legacy.

You wish. DP sends exactly the same bytes DVI does (blanking and all), just broken up into packets.


The important difference is that (like VGA) DVI and HDMI still dedicate particular physical wires to separate red, green, and blue channels. DisplayPort does not shard individual pixels across multiple channels: all bits (all colours) of a pixel are sent on the same diff pair. If a DP link has multiple lanes, then the bits of successive pixels are sent on each lane.
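
A toy illustration of that contrast (the byte ordering here is made up for clarity, not the exact spec mapping):

    # TMDS (DVI/HDMI) dedicates one physical channel per colour component,
    # while DP flattens pixels into a byte stream and deals it out across
    # however many lanes are active.

    pixels = [(10, 20, 30), (40, 50, 60), (70, 80, 90), (11, 21, 31)]  # (R, G, B)

    # DVI/HDMI-style: one channel per colour component
    tmds_channels = {
        "ch0 (blue)":  [b for _, _, b in pixels],
        "ch1 (green)": [g for _, g, _ in pixels],
        "ch2 (red)":   [r for r, _, _ in pixels],
    }

    # DP-style: serialize whole pixels, then round-robin bytes across lanes
    lanes = 4
    stream = [component for px in pixels for component in px]
    dp_lanes = {f"lane{i}": stream[i::lanes] for i in range(lanes)}

    print(tmds_channels)
    print(dp_lanes)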


This is great, but then they drop the ball by forcing garbage (“interspersed stuffing symbols”) between the packets instead of letting you use that BW.


You are not giving the full picture of this trade-off. The people making the spec knew the trade-off well. You add a lot of complexity to a receiver by forcing it to generate its own timing information from whatever video signal is sent. That work has to be done somewhere, and it is most robust when done at the source.


That optional infrastructure on the receiving end is already provisioned in the standard in the form of eDP PSR (self-refresh).


Panel Self-Refresh is an optional feature, is it not?


Why isn't there an optional matching DP mode where you could just transfer screen data without timing? Why are you forced to send fake blanking to PSR displays?


I think that you can actually transmit audio, or other displays' signals (DisplayPort Multi-Stream Transport), in place of stuffing symbols.


Afaik it's a no-no zone. You can only use blanking periods.


You may be correct (and I was mistaken in my previous post) for audio. The leaked (DP isn't really an open standard) DP-1.2.pdf on glenwing's site says that "The dummy stuffing data symbols during the video blanking periods (both vertical and horizontal) may be substituted either with main stream attributes data or a secondary-data packet", and the public 1.1a PDF indicates similarly.

MST is quite complicated and appears to interleave bytes from different video streams within 64-byte packets (rather than solely during blanking):

> The MTP (Multi-stream Transport Packet) is 64 link-symbol (1 byte of video data or special symbols) cycles (that is, 64 time slots) long, starting with MTP Header in the first time slot (or Time Slot 0), and is constantly transported regardless of the presence/absence of streams.

> The Payload Bandwidth Manager of each uPacket TX in the path from a DP Source to a target Sink device allocates time slots within the MTP to a VC (Virtual Channel) Payload to establish the virtual channel for transporting a stream.
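
Here's a little sketch of how I read that MTP structure (slot counts are invented for illustration; slot 0 is the header and unallocated slots carry stuffing/idle symbols):

    # Sketch of the MTP layout as quoted: 64 time slots per packet, slot 0 is
    # the MTP header, and the Payload Bandwidth Manager statically allocates
    # the remaining slots to virtual channels.

    MTP_SLOTS = 64

    def build_mtp(allocations):
        """allocations: dict of stream name -> number of time slots."""
        slots = ["MTP_HDR"]                            # time slot 0
        for name, count in allocations.items():
            slots += [name] * count
        slots += ["IDLE"] * (MTP_SLOTS - len(slots))   # unallocated = stuffing
        assert len(slots) == MTP_SLOTS
        return slots

    mtp = build_mtp({"VC1 (4K60)": 40, "VC2 (1080p60)": 12})
    print(mtp[:6], "...", mtp.count("IDLE"), "idle slots")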


> However, I’d like to tell you that you probably should pay more attention to DisplayPort – it’s an interface powerful in a way that we haven’t seen before.

Lines like that really show just how long it can take for standards to get on the radar of mainstream tech culture. I remember hearing about and being excited about DisplayPort's move to packetized digital video in college in 2008, and seeing the first Macs with Mini DisplayPort later that year (or perhaps it was in 2009)!

I was actually under the impression that it has been well-known and commonplace for hobbyist and enthusiast PCs for well over 10 years, but I'm probably wrong about that!


> impression that it has been well-known and commonplace for hobbyist and enthusiast PCs for well over 10 years

It is. For a long time DP was the only standard that could do variable refresh rate. Even today all high end monitors have DP while the cheapest monitors only have HDMI.


Which is ironic considering the cost of the HDMI port is likely higher than the DP one due to licensing!


HDMI is table stakes for most people. DisplayPort is not.

So it's not that HDMI alone costs more; it's that HDMI+DP costs more than HDMI by itself.


I just wish TV manufacturers would start putting Displayport into their TVs.

GPUs nowadays use 3x Displayport and 1x HDMI, which is quite the bottleneck if you want to max out your ports.

As I understand, HDMI <-> Displayport converter cables often do not have the high end features you might want, such as 4k/120Hz HDR, and/or Variable Refresh Rate. Perhaps this has improved in recent months.


Same. I live in the Mac world. When I took a job where I had to use PCs at work I was surprised to see they were mostly using HDMI with some people using older DVI equipment.

I had just assumed everyone had transitioned years before, similar to Macs.

Nope. Lots of people use/want HDMI to this day.


> Lots of people use/want HDMI to this day.

I’d wager this is less about HDMI, and more about the fact that people want their old DP-less monitors to work.


Display port to HDMI cables exist.


Correct, and the article itself stated the conversion (DP source to HDMI sink) is very easy. Still, I can see laptops choosing the most popular port regardless of the possible conversion. Anecdotally when I go to work the cables I see are mostly HDMI. (This is where I’d love to have a Framework-like laptop, where I could just swap between HDMI and DP output with the relevant output module).


> Still, I can see laptops choosing the most popular port regardless of the possible conversion.

Yeah. It's silly that my old Lenovo ThinkPad X220 (bought in 2012) has a full-size DisplayPort port whereas my Lenovo ThinkPad X1 Carbon Gen 7 (bought in 2019) has a full-size HDMI port. It's like it's regressing.


That, and a similar line might've been said about FireWire, which didn't really make it.


Maybe I agree with that, but I also know that FireWire helped usher in the digital video era. It allowed the transition from tape-based acquisition at a time when media cards were prohibitively expensive. Audio/video/deck control all down one single cable, straight from the camera to the computer, was what really kicked the prosumer market into being able to lean closer to pro than consumer. Now that media cards are actually affordable, that does seem like ancient history. I could see how you might think of FireWire as a failure if you're looking at it as a USB-type transition, but for the camera/video professions it served a very good purpose, even if short-lived.


I once purchased a FireWire multi-channel audio interface with the express intent of reverse engineering the protocol in order to add Linux support. I purchased a specialized PCI card that was intentionally non-standard (not OHCI 1394) but could snoop and capture traffic between a device and a host. I chipped away at deciphering a register map for awhile, but eventually better USB audio devices came along that had more driver support.

Still trying to do the same sort of thing though with some audio related USB and Ethernet/IP devices so I guess I never really gave up the idea in my heart!


Do the USB devices have comparable round-trip latency? That is where USB really hurts and where the Thunderbolt audio interfaces seem to find their niche today.

I'm looking forward to audio interfaces that can do USB4 wrapped PCIe, but for now I live with the latency on Linux.


RTL latency doesn't seem that bad on chips that are built for the purpose, like RME's. Just the off the shelf stuff might be less than ideal.

If anything I think a problem is there's just overall too much slop in the whole chain, including the operating system, motherboard, whatever else. It's a little ridiculous that a $50 guitar pedal can easily get better RTL numbers than a $1500 computer system.


Linux has preempt_rt pretty conveniently accessible, and Pipewire has some useful tuning knobs, but my Steinberg UR44 can't seem to keep up.


Are you already aware of the PipeWire batch flags which add additional latency to USB audio devices (dependent on ALSA period rather than quantum)? https://pipewire.pages.freedesktop.org/wireplumber/configura...


Yes, I tried setting those lower but ended up with xruns and other issues. :(


Did you decrease the USB ALSA period to the lowest practical value (eg. 128 or so)? In theory this provides finer-grained position updates and better stability at any given latency, but I'm not sure how it interacts with USB's polling rate.


Yes, the latency increases, the audio quality worsens, and I see the following messages in Pipewire's logs:

[W][30498.719355] spa.alsa | [ alsa-pcm.c: 2478 spa_alsa_read()] steinberg_ur44_mono_in:UR44,0,2: follower delay:4042 target:4832 thr:256, resync (374 missed)


Ouch that's unfortunate :(


you are a glutton for punishment. a hacker after my own heart!

i have so many unfinished things where i just knew something could be figured out, only to not succeed. but it sure is fun trying on top of learning new things as well.


It wasn't that short-lived. All DV cameras had FireWire. Although Sony, per its mania for undermining industry standards, created a bastardized version (called I-Link, ugh) whose connector lacked power and required a physical adapter.

External FireWire drives (like LaCie's) were popular for quite a long time, since they required no extra power source.


I think the 4 pin version was also standard, it's just that Sony didn't own the trademark on FireWire, Apple did. Sony was actually one of the companies that developed the spec, according to Wikipedia.

I had a HP laptop with that connector, but it was labelled IEEE-<numbers>. Bought a 4 to 6 cable to connect it to an old iMac, but never bothered enough to use it.


It was a standard because Sony said "we're doing this" and gave the FireWire group an ultimatum: standardize it or we're not on board.


I think it was more that Sony couldn't call its bullshit plug FireWire because it lacked a required FireWire feature.


The feature was the power pin. You couldn’t power a device from that tiny plug (on either side).

I do understand why they did it. Sony is the master of miniaturization and wanted things to be small as possible. So the idea of making one of their little tiny handheld camcorders noticeably bigger just for a single connector was probably a non-starter for them.

So they made up their own and then used their size to force it to become a part of the standard.


Yes, I noted that: "whose connector lacked power and required a physical adapter."


Sony also gave us a professional studio DVCAM deck that was SDI-based and allowed you to connect two of them to get 4x dubbing speeds. Not once did it ever get used at the studio I was working at. Nothing else in the shop used that format, so it was always just realtime SDI work. Ahh, Sony.


Well, if we're going to review Sony's litany of standards-undermining horseshit, we'd be remiss to omit S/PDIF, which was a bastardization of the existing AES/EBU digital standard. Apparently it differs just enough to make interconnectivity unreliable... creating a pain in the ass for studios the world over.

Sony also briefly attempted to undermine Thunderbolt, by issuing a laptop that crammed Thunderbolt connections into a regular USB-A port.

I'm sure I'm forgetting (or unaware of) several others.

Then there was their ridiculous clinging to MemoryStick for what, a decade after the rest of the world had standardized on SD?


> laptop that crammed Thunderbolt connections into a regular USB-A port

wtf, link? I'm really curious how they did that in a backwards compatible way


Speaking of Sony video, they also have Gigabit Video Interface which uses some clever trickery to transmit clean signal in difficult environments like vehicles

https://web.archive.org/web/20210513082248/https://www.sony-...


i closed the tab before looking for a date from when that was released. however, we've been converting video signals to use over Cat5/6 for a really long time since it supports running over much longer distances. can't believe, well, yeah, actually with Sony I can, that they went and made a whole format


Yeah, FireWire was a necessity at the time for certain use cases. Even basic consumer digital camcorders required FW400 to pull onto a PC.


I may be misremembering, but was it possible to add FireWire to an existing PC with an expansion card? I know Thunderbolt was not possible from 3rd-party vendors; only the mobo manufacturers could offer a card. I bought one to make a Hackintosh, but then the mobo firmware disabled the card because they didn't want to support it. I seem to recall FireWire being the same way.


Yes it was. Still is. I almost bought a FireWire card so I could use an iPod with Windows before the USB model came out. I also remember an old Dell laptop I had having a FW port.

When FW400 came out we were on USB1 which was stuck at 12mbps. It wasn’t designed for hard drives and such. It was for floppies, keyboards, mice, printers, barcode scanners, etc. Low bandwidth. Not that that stopped manufacturers.

FireWire just wasn’t popular on the PC side outside of video editing (and perhaps other specialized uses).

USB got far more useful with USB2, which went to 480mbps, but IIRC you couldn’t achieve that and it had tons of CPU overhead. FW could max out its bandwidth and had low CPU overhead by design despite having a lower theoretical max.

But it was over. USB2 was cheaper and FAR FAR more common. FW800 existed for those who needed more bandwidth. But for most people USB2 was plenty and they already had it.

Thunderbolt took over for FW, and as of USB4 the two are basically the same (relative to FW vs USB’s differences).


Yes, but with limited support. Macs were the only practical option for home video editing back then. iMac DV was released in 1999 with FW400, and iMovie was released at the same time. Media editing seemed like the central use case of Macs through the 2000s.


yes, still is.


FireWire has multiple rates available in the FW400 cable. Consumer digital video camcorders are at 25Mbps for video + 1.5Mb for the audio and use the S100 100Mbit link speed.

Other HD formats like Panasonic DVCPro HD went up to 100Mb video running the same tapes faster.


Apparently 1394b is used in some military applications, the F-35 for example.


Are there any KVM switches that do Displayport well (i.e. where switching between inputs does not look like a display disconnect to the PC)?

I'm still using HDMI because I like to share my home multi-monitor setup between my personal machine and my work laptop, and the KVM switches are able to fool the PCs into thinking the monitors are always connected. Years ago I tried a DisplayPort switch, but it could not -- I assume because of the greater sophistication of the DisplayPort protocol.


The magic words you're looking for are "EDID emulation". The KVM will continue to send the EDID data from the monitor even after you've switched away, which will fix that issue.

It's relatively uncommon and not always implemented super well, but it's a requirement for any DP KVM to be not super annoying IMO.
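
For the curious, this is the 128-byte block that EDID emulation keeps presenting to the host. A minimal sanity check of an EDID dump (Linux DRM sysfs path shown; the connector name will differ per machine):

    # What EDID emulation keeps presenting to the host: the monitor's 128-byte
    # EDID block. Minimal sanity check of a dump; the sysfs connector name
    # (card0-DP-1 here) varies per machine.

    from pathlib import Path

    def check_edid(raw: bytes) -> str:
        assert raw[:8] == bytes.fromhex("00ffffffffffff00"), "bad EDID header"
        assert sum(raw[:128]) % 256 == 0, "bad block checksum"
        # Manufacturer ID: three 5-bit letters packed big-endian into bytes 8-9
        word = (raw[8] << 8) | raw[9]
        return "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

    edid_path = Path("/sys/class/drm/card0-DP-1/edid")
    if edid_path.exists():
        print("manufacturer:", check_edid(edid_path.read_bytes()))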

There was one particular KVM brand that was supposed to do it well whose name is escaping me now :/. I was looking at buying one in ~ May 2020 for obvious reasons, but they were on super-backorder (also for obvious reasons), so I never got around to it. IIRC they were about $500 for a 4 input/2 output version, so not cheap.



I'm 90% sure it was, but it looks like they did a major UI update to the site so it's not triggering the "that was it!" lightbulb in my brain.


I use one, so I can confirm they work extremely well, subject to some caveats:

* The total cable length is important, both between the host / KVM and the KVM / monitor, as well as any daisy chained displays you have. I had to use certified cables to get everything working reliably with my setup.

* There's a weird interaction with BIOS power on. The boot display drivers I have freak out if they aren't the active display and fail. I solve this by switching the KVM before I turn the computer on. After everything is booted into an OS, it works fine to switch.

* Power supply quality is important. I had some issues before I made sure the power supply was reliable.

KVM switches are just inherently difficult little devices. I haven't had issues since I got it working though.


>There's a weird interaction with BIOS power on. The boot display drivers I have freak out if they aren't the active display and fail. I solve this by switching the KVM before I turn the computer on. After everything is booted into an OS, it works fine to switch.

Do you have an AMD GPU by any chance? I have the level1tech 2-head DP 1.4 KVM, with an AMD RX 560 on a Linux host, and after updating to kernel 6.4 recently my computer now boots fine without a monitor attached.

I had a similar issue where a display had to be _on_ and _connected_ (i.e: active on the KVM) at boot time, or the GPU wouldn't work at all. I could get in via SSH, so I tried various amdgpu recovery options, poking the device to reset it, reloading the kernel modules, etc., and never had any luck. I just lived with the quirk. It was problematic because if you left home with the KVM selected on the Windows guest, and needed to reboot the Linux host remotely, you'd come home to a non-functional Linux desktop.


I have a similar issue with an Nvidia 1080 Ti on my old desktop. It's related to the UEFI deciding whether the iGPU or the Nvidia GPU should be primary, and whether to disable the iGPU or leave it enabled.


Curious how you went about determining that was the source of the issue.


I'd swap the cables around and could see video from the iGPU output.


If this is the problem, you can usually force the GPU choice one way or another in BIOS.


Nvidia on both, one consumer and one workstation. Neither CPU has built-in graphics iirc, nor do the motherboards expose the ports.


Alternatively, you may have been thinking about ConnectPro. I ordered a kvm from them around the same timeframe and it was delayed quite a bit from backorder. (Though, they also did a major UI change, so might not be able to tell either).


Two thumbs down for connect pro. I ordered their top of the line 4 computer, 2 monitor DisplayPort KVM and it took months to arrive. I could not cycle between inputs using the buttons. They were more like a suggestion to use that signal path; I would constantly need to power cycle the kvm, monitors, or both.

I ended up ditching it on eBay at a significant loss for a $30 usb switch and just switch monitor inputs manually. Far cheaper solution and way less fussy.


I had the same issue with this device. I ended up writing some code that you could run on a machine to operate the switching via the RS232 port: https://github.com/timgws/kvm-switch/

Bonus for adding 'glide and switch' functionality, so you can move the mouse to the edge of the screen and it would jump the input to the next display in your layout. It's like a hardware version of Synergy.

Very finicky device, but if you don't touch it - and you don't use any of the shortcuts - it works.
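
For anyone wanting to do something similar: the general shape of driving a KVM over RS232 is just writing the device's documented command bytes to the serial port. The command string below is a placeholder, not the real ConnectPro protocol, so check your device's manual (or the linked repo):

    # General shape of RS232 KVM control, using pyserial. The "SW n" command
    # below is a placeholder, NOT the real protocol; substitute whatever your
    # device's manual (or the repo above) documents.

    import serial  # pip install pyserial

    def switch_input(port: str, channel: int) -> bytes:
        cmd = f"SW {channel}\r".encode()        # hypothetical command string
        with serial.Serial(port, baudrate=9600, timeout=1) as ser:
            ser.write(cmd)
            return ser.readline()               # many devices echo an ack

    # switch_input("/dev/ttyUSB0", 2)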


Neat! I used to use ShareMouse (pay-ware, if you want more than two machines tied together) for this, because setting it up and keeping it working are so easy compared to whatever version of Synergy existed at the time.

Synergy made me manually configure my monitors by dragging little boxes around in a window, would frequently refuse to connect, would spontaneously disconnect, and repeatedly mangled my config such that I had to keep manually configuring it over again.

With ShareMouse, it was "open a copy of it on each machine, slide mouse in direction of next machine, then optionally enable encryption (to prevent other users' instances of ShareMouse from being able to attach)".


I should also add that their customer support was totally worthless. They promised me a firmware update, and stopped responding after I confirmed my firmware version - the process to get that value was already quite arcane, so ultimately I felt like they never really had any intention of helping and were simply stalling me out.


Adderview made some of the best KVMs, but I don’t know if they have a good DisplayPort model or not.


BliKVM PCIe I bought (based on PiKVM) came with an EDID emulator.


Too bad it doesn't support higher resolutions.


None of them are perfect, but I've heard good things about the DP switch from Level1techs. The thing is, all of them are a little tricky, but they mostly differ in how quirky they are, and I suspect the reason why people like the Level1techs DP switch is that they seem to at least try to alleviate some of the issues DP switches tend to get into.

https://store.level1techs.com/products/14-kvm-switch-dual-mo...

The startech one I have is alright... But Apple computers absolutely hate it and frequently refuse to display, and sometimes Windows gets stuck and USB devices stop working. Strangely enough... Linux doesn't ever have any problems with either display or input. A rare win, but fine by me.


The Level1Techs KVM switches are rebranded Startech switches with 'dumber' firmwares whose dumbness affords better compatibility with niche DisplayPort features.

I have a bunch of them and I like them pretty well, but getting a bunch of computers all plugged in turns out to be a bit of a nightmare, especially when you need some long-ish cable runs or you are daisy-chaining devices (e.g., multiple KVM switches, adding USB hubs or Thunderbolt docks, etc.).

The Level1Techs KVM switches don't meet GP's criterion for hotplugging behavior, unfortunately. Switching between devices is just an unplug and replug for them.

Like you, I've found that macOS and Windows don't handle hotplugging monitors well, but Linux desktops (in my case, KDE Plasma) consistently do the right thing and don't require a repeater to lie to them about monitors always being plugged in.

FWIW, I don't get the 'Apple computers just refuse to work' issue with any of my L1T KVMs.


> The Level1Techs KVM switches are rebranded Startech switches with 'dumber' firmwares whose dumbness affords better compatibility with niche DisplayPort features.

Rextron [1] is the actual ODM. They don't do any direct to consumer sales, though. That's why L1 / Startech / other "brands" sell them on amazon and the like.

Last I spoke with the L1 guy, they were still having some issues with the high speed USB-C switching chips on the 1x4 "all USB-C" model that he's got a wait-list for.

[1] https://www.rextron.com/KVM-Switches.html


Ah! Great info. Thanks :)


> FWIW, I don't get the 'Apple computers just refuse to work' issue with any of my L1T KVMs.

My work intel macbook worked great on it for like a year once I got a high quality usb-c -> displayport cable, but an os update borked it... though it's definitely the mac that's the problem, as it also sometimes has problems going straight to the monitor too (on the other hand 49" ultrawide is pushing the bandwidth near its limits)

My personal arm macbook has always worked great on it with even a crappy usb-c -> displayport

my windows desktop also always worked great on it.


> on the other hand 49" ultrawide is pushing the bandwidth near its limits

One of the things I've learned too late is that using DisplayPort (and maybe HDMI, idk) anywhere near its bandwidth limits is not worth it for me. Having to think about cable run lengths as well as cable quality and peripheral quirks and internal Thunderbolt dock bandwidth allocation and so on and so on just fucking sucks.

It'll probably take until the next/latest (2.1) generation of DisplayPort propagates before using multiple monitors with specs similar to my current ones (high refresh rate and HDR, but not even HiDPI) isn't painful, cumbersome, and finicky.

I probably won't be able to use them by then anyway. Ugh.


> The Level1Techs KVM switches are rebranded Startech switches.

Another commenter posted that the L1Techs KVM are Rextron devices. The Startech switches are rebranded ConnectPro KVMs.


Huh. I just matched them based on visual similarity ages ago and was apparently very wrong. I'm sorry, and I wish I could edit my earlier comment. :(


I had much more serious problems with a StarTech DP KVM and Macs. My Macbook would hang and crash-reboot. Both on the initial plug-in and on switching inputs.

Everything else seemed to handle it fine, with Linux being especially unfazed, as usual.


https://store.level1techs.com/?category=Hardware

This is what I use. It appears to disconnect, but that doesn't seem to be an issue. My machines re-organize instantly.


I got their 10gbps displayport switch to use with switching a single monitor between a Windows desktop PC and an M1 MacBook Pro. I have a 4k@144hz monitor and can get the full framerate and resolution with this setup. I've never had any problems, would highly recommend.


Nice, I'll check these out. I went with an HDMI KVM and am worried about updates to HDMI making it obsolete.


It's gotten better in the last year or so with Windows 10 but it'll still sometimes just fall apart when the display configuration changes, which is something that just never happened for any reason with HDMI/DVI.


I was looking at this as an upgrade pick and don't have any re-arrangement with my TESmart (TES-HDK0402A1U-USBK). What monitor(s) do you have?


The new Dell 6K has kvm (and PiP) functionality across its inputs, and it does appear from my modest use of this feature so far, that it works as you would want (ie it still thinks the display is connected, even when not showing that input)


I would prefer that with 1 display but I have 2 Dell 6Ks and it's kind of annoying if I want to have them each on a different PC. (I use a usb switch to switch my peripherals between displays)


Can you explain why this is beneficial? I have a mac laptop and pc desktop at home that i switch between depending on whatever I need to do. By triggering a disconnect, it means all my mac windows that are on the main monitor will zip back over to the laptop so they're still reachable if i need to access them with the trackpad and integrated keyboard. When I switch the kvm back to mac all those windows jump back to the main monitor.


Flaky drivers. KVM induced unresponsiveness is pretty much the only reason I ever have to hardboot my computers.

Also, even if the drivers are solid, they take longer to renegotiate with a monitor that was removed and plugged back in compared to one they think was always there, which matters if you switch back and forth frequently.

Lastly, sometimes the OS doesn't put things back the way they were when you plug a monitor back in. If you have a laptop with a lower-resolution display than the external monitor, you'll often return to find all the windows shrunk to fit the laptop display. Not an issue if you run everything full-screen, but annoying if you tile windows.


I had the startech one the siblings have mentioned but that wasn't very good and didn't do EDID emulation correctly. This CKL one [0] has been working really well, and supports USB 3 which is a nice bonus so I can share my webcam. Though sometimes after wake up my macbook forgets about my second monitor (I have an M1 connected to a cable matters thunderbolt dock), my windows machine which has direct DP connections doesn't have the same issue.

0: https://www.amazon.com/gp/product/B09STVW821/


Never mind KVM switches, I wish powering off my DP monitor while leaving wall AC power plugged in didn't appear as a display disconnect to the computer.


I don't think I've had a KVM switch work well since VGA+PS/2

They all try to be too smart.

As a matter of fact, I usually use a monitor switch of some sort, then use mechanical USB switches - one for keyboard, one for mouse. That seems to be the only way to get mouse and keyboard to work well (basically just a hardware connection, no smarts)


Belkin makes some, up to duplex 4k@60hz, but holy mother of god are they expensive.


Look at how expensive the chips they're using are.

High-speed, high-bandwidth, low-delay interfaces are apparently hard.


I'm annoyed at Nvidia for putting the current generation of HDMI on their recent GPUs but leaving them with an outdated version of DisplayPort.

For a long time, my advice to anyone was to always choose DisplayPort whenever it was an option. But now that has to have the caveat of "if you have a new high-end GPU and a fancy high refresh rate monitor, HDMI might actually be better for your situation"


> I'm annoyed nvidia for putting the current generation of HDMI on their recent GPUs, but leaving them with an outdated version of DisplayPort.

That was due to unfortunate timing where HDMI had the specifications ready before DisplayPort did.


AMD's RX 7000 cards support DisplayPort 2.0 and were released 2 months later than Nvidia's Ada. Afaik DP 2.0 was finished in 2019(!).


I thought it was because their G-Sync modules had not been updated to support the new DP? That was what I heard.


Yeah, I hate this too, because I just want to have more video outputs. My Valve Index doesn't like to be hot-plugged, requiring a reboot. With 1440p 144Hz monitors, I just barely cannot run 2 of them (2x 14Gbit) over a single DP 1.4 link (26Gbit) using MST. Windows will automatically drop colour down to 4:2:2 if I try it.
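
Quick check of those numbers (my own rough math: ~10% blanking overhead assumed, 8b/10b overhead on HBR3):

    # Two 2560x1440@144 streams vs. a DP 1.4 (HBR3) link.

    def stream_gbps(w, h, hz, bpp, blanking=1.10):
        return w * h * hz * bpp * blanking / 1e9

    hbr3_payload = 4 * 8.1 * 0.8                  # ~25.9 Gbps usable

    rgb = 2 * stream_gbps(2560, 1440, 144, 24)    # two monitors, RGB 8-bit
    sub = 2 * stream_gbps(2560, 1440, 144, 16)    # same, with 4:2:2 subsampling

    print(f"2x 1440p144 RGB  : {rgb:.1f} Gbps (> {hbr3_payload:.1f}, doesn't fit)")
    print(f"2x 1440p144 4:2:2: {sub:.1f} Gbps (fits)")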

Not that DP2.0 MST hubs exist yet afaik, but when they do I'd have to get a new GPU. Which I guess is Nvidia's goal.


I think it's all baked into NVIDIA's product strategy of trying to encourage enthusiasts to upgrade every generation. Yes, some enthusiasts will upgrade every generation no matter what strategy you implement, but I would say most enthusiasts' philosophy is to buy premium products and skip a generation or two. It used to be just raw computational performance that sold GPUs, but now it's all these additional layers and features, GSYNC, RTX, DLSS, Frame Interpolation, and even the ability to interface with a particular display (e.g. 8k/120Hz) are all part of the product, and so can be reserved to "boost" particular generations' desirability. I wouldn't be surprised if there aren't any "software" level enhancements to the 5000 series, just their standard performance uplift, an increase in VRAM capacity (5090 topping out at 32GB), and the new DisplayPort standard, all to promote and focus on 8K/120Hz and 4K/240Hz gaming. They set the stage for it with the motion interpolation tech in the 4000 series.


I also noticed that many boards are 3x HDMI + 1x DP now. My previous card was 3x DP, 1x HDMI.


What card is 3xHDMI + 1xDP? I just checked and at least the reference rtx 4080 is still 3xDP 1xHDMI


The Gigabyte 3070 I have certainly is.


And only 1 new-version HDMI port but 3 old-version DP ports. I use a dual-display setup, but HDR only works on my displays with HDMI.


The more expensive Gigabyte RTX 3080+ cards had 3x DP 1.4, 2x HDMI 2.1, 1x HDMI 2.0, which was great even if you could only use 4 of them at the same time due to Nvidia limitations. Unfortunately for the 40 series they stopped offering it and now they're the same as stock Nvidia with 3x DP 1.4, 1x HDMI 2.1.

Luckily ASUS still offers a bit more with their 4080+ cards - 2x HDMI 2.1, 3x DP 1.4. I personally depend on the 2x HDMI 2.1 to even be able to run my 4K 144Hz monitors at full speed.


Yeah, I only want a 4090 myself.


> Just like most digital interfaces nowadays, DisplayPort sends its data in packets. This might sound like a reasonable expectation, but none of the other popular video-carrying interfaces use packets in a traditional sense – VGA, DVI, HDMI and laptop panel LVDS all work with a stream of pixels at a certain clock rate.

Funny, a stream of data at a constant rate makes much more sense to me intuitively than packets, specifically for uncompressed video.

Are there any downsides to packetization, like increased latency or dropped frames or anything? Or not really, is it all upsides in being able to trivially combine multiple data streams or integrate easily into hubs?


> Funny, a stream of data at a constant rate makes much more sense to me intuitively than packets, specifically for uncompressed video.

Sure, until you start trying to design the transceivers and realize that supporting two or three fixed standard data rates is a lot simpler than supporting a continuously-variable clock speed. Every other high-speed digital interface operates at just a few discrete speeds: SATA/SAS, PCIe, Ethernet, USB.

The fact that DVI and HDMI were such a shallow digitization of VGA's racing-the-beam meant features like variable refresh rate (Gsync/Freesync) showed up far later than they should have. If we hadn't wasted a decade using CRT timings (and slight modifications thereof) to drive LCDs over digital links, it would have been more obvious that the link between GPU and display should be negotiated to the fastest data rate supported by both endpoints rather than the lowest data rate sufficient to deliver the pixels.
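
A tiny sketch of the negotiation model described above: pick the fastest rate both ends support, then fill whatever the video doesn't need with stuffing symbols (rates are the standard DP 1.x per-lane rates; the rest is illustrative):

    # Train the link at the fastest rate both ends support, independent of
    # the video mode; unused capacity is later filled with stuffing symbols.
    # Per-lane rates in Gbps: RBR / HBR / HBR2 / HBR3.

    SOURCE_RATES = [1.62, 2.7, 5.4, 8.1]
    SINK_RATES   = [1.62, 2.7, 5.4]       # e.g. an older DP 1.2 panel

    def negotiate(source_rates, sink_rates, lanes=4):
        rate = max(set(source_rates) & set(sink_rates))
        payload = rate * lanes * 0.8      # 8b/10b coding overhead on DP 1.x
        return rate, payload

    rate, payload = negotiate(SOURCE_RATES, SINK_RATES)
    print(f"trained at {rate} Gbps/lane, ~{payload:.1f} Gbps usable;"
          f" whatever the video doesn't need is filled with stuffing symbols")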


> ... If we hadn't wasted a decade using CRT timings (and slight modifications thereof) to drive LCDs over digital links ...

Don't forget the decade or so (late 1990's to early 2000's) where we were driving LCDs over analog links (VGA connectors).

I had purchased a pair of Silicon Graphics 1600SW monitors back in the day, which required a custom Number Nine graphics card with an OpenLDI display interface. It was many years after those were introduced to the market before DVI finally started becoming commonplace on PC graphics cards.

Using the mass-market LCD monitors in the late 1990's was a frustrating affair, where you had to manually adjust the timing synchronization of the analog signal.


> Funny, a stream of data at a constant rate makes much more sense to me intuitively than packets, specifically for uncompressed video.

There is a data-rate floor that the device doing the output must meet. We've surpassed this (due to optimization at the silicon level, designing hardware whose sole job is to send bursts of data), and we end up running out of data in the buffer periodically because it's just so fast. Because analog is real time, you can't squeeze much else into that data stream; but with digital we are afforded the luxury of packet switching, so instead of the line being idle, we can pump even more down the line.

> Are there any downsides to packetization, like increased latency or dropped frames or anything? Or not really, is it all upsides in being able to trivially combine multiple data streams or integrate easily into hubs?

If I recall correctly, the timing and data rates are all prearranged based on the reported capacity and abilities of the receiving device, and it won't even attempt to support multiple streams if it is incapable of doing so or if the established data channel cannot fully support the required bandwidth.


Latency is limited to the amount of buffering that's present in the interface logic (on both sides). In the case of regular DP, there are just a few FIFOs and pipeline stages, so the latency is measured in nanoseconds.

When using display stream compression (DSC), there's a buffer of 1 line, and a few FIFOs to handle rate control. At the resolution for which DSC is used (say, 4K/144Hz), the time to transmit a single line is around 3us. So that's the maximum additional latency you can expect.
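
Sanity-checking the ~3 us figure (active lines only, vertical blanking ignored, so this slightly overestimates the per-line time):

    # One frame period divided over the active lines.
    refresh_hz, active_lines = 144, 2160
    line_time_us = 1e6 / (refresh_hz * active_lines)
    print(f"{line_time_us:.2f} us per line at 4K/{refresh_hz}Hz")   # ~3.2 us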


If you didn't packetize then the receiver would never recover line coding state when a single symbol is missed. The rest are incidental details.


This post was informative and I didn't realize just how different DisplayPort is from HDMI. Recently, I got a desktop dock that uses DisplayPort instead of HDMI to connect to my monitor. My monitor has 2 HDMI ports, 1 Type-C, and 1 DisplayPort. So far things have been fine but I did notice that the audio is choppy no matter what I do. I thought it was the dock but audio going from my computer > dock > my webcam's speaker works fine (all over usb-c). So unfortunately, it leads me to believe that the DisplayPort is causing this jittery audio.


It might be your dock.

Check to see whether it's USB or Thunderbolt. Thunderbolt docks are more expensive, but considerably more efficient and faster than USB (assuming your laptop/device supports Thunderbolt).

Thunderbolt docks are basically PCIe extension devices, whereas USB docks attach everything as USB, with all the common challenges USB has on systems (like dropped audio when the CPU is busy).


Thanks for the info. The dock is the CalDigit TS3 Plus dock and I’m using it with an LG monitor. Their page says it’s a Thunderbolt dock so I wonder if there’s anything else about this particular dock that could be causing this issue. Btw when the monitor was connected over HDMI, it was fine playing audio.

https://www.caldigit.com/ts3-plus/


That is absolutely TB and was one of the recommended docks in the Mac world for a long time.

Have you talked to support? Or I wonder if it’s an issue on the LG side.


That's pretty interesting. Might be worth trying a new displayport cable, or checking whether there's a firmware update for either the dock or monitor.


No one would be able to answer this for you in your house with your cables with your speakers. Get another one and test it.


My screen only has HDMI and my desktop only has DP out, so I bought a $2 adapter from Temu. The audio surprisingly works fine, I thought that would be a total oversight.

Picture on the other hand is slightly janky, which seems to be a common issue for DP->HDMI convertors. If anyone knows a convertor that doesn't turd up the signal I'd love to know.


Same here with an lg monitor on dp


I didn’t realize Multi-Stream Transport (MST) requires OS support, and I was surprised to find out MacOS, with its great Thunderbolt support, does not support this. “Even” ChromeOS can do MST.


Technically macOS does support MST. But it only supports it to stitch together for a single display. It does not support daisy chaining two displays.

Thankfully, every Mac for the last 7 years has Thunderbolt3 at least, so getting dual-4K-display from a single port/cable is still very doable, you just need a TB3 to dual DisplayPort or HDMI adapter.


Well except for some of the Apple Silicon machines. The M1 (and maybe M2?) only have two video output blocks, of which one is already used for the internal display. It’s honestly the biggest complaint I have about my M1 MBP. Yes DisplayLink or whatever it’s called exists but the performance is bad.


But - and I can't believe I forgot this in my other reply - this is one thing that really grinds my gears about Apple's releases since 2018, on everything except the Mac Pro (both 2019 and 2023).

They hard-code one DisplayPort stream to *something* other than Thunderbolt.

On laptops, they hard-code one via eDP to the display, which is useless if it's in clamshell mode.

On Mac Mini, Mac Studio, and the MacBook Pro with HDMI port, one stream from the GPU is hard-coded to the HDMI port. If you want maximum displays, always has to be from HDMI.

But neither the 2019 or 2023 Mac Pro have this limitation. Even on the 2019 model where the HDMI ports were physically on the card - they could route all video streams via system TB3 ports.

I just checked, and the base M2 Mini and the M2 Pro MBP seem to finally allow using two video streams over the TB4 ports - but the M2 Ultra Studio, with the exact same SoC as the M2 Ultra Mac Pro, still has this stupid artificial limit.


I assume you got one of the original M1 MBPs. The more recent models have more display blocks. M1 Pro can drive two external displays, and I think the M1 Max can do three or four. I’m still slightly pissed Apple ever shipped the original M1 MBPs, they were horrible machines.


It is really my only complaint about it, and since I needed a new laptop at the time it still made sense. I would definitely rather one of the 14” ones now - but not enough to buy a new device.


Although I wish I had gotten the 15” and a bigger hard drive, I’m still very happy with my first-gen 13” MBP. Also, the fact that I can just plug in USB-C/Thunderbolt monitors and the screen instantly displays and windows reconfigure without flickering is amazing.


Ugh, right. Probably a freudian slip that I just mentally pretend those configs didn't come after half a decade of ubiquitous multi-display support on Macs.


Unfortunately, although it can work with certain Thunderbolt devices, many TB4 docks with the video path based on MST don’t work, causing havoc for mixed environments.


Doesn't MST merely chop up the bandwidth of the lanes you have? So why would you update different parts of a single display using MST (each stream only getting part of the bandwidth of the link), rather than using the whole link to update the display from top to bottom at the same pixel depth and clock?


From memory they use it on things like the Pro Display XDR (possibly some 5Ks too?) where the display supports DP 1.2.

My understanding is that the spec doesn’t allow for 5K/6K at 60Hz, but does allow for MST to send two streams at up to 4K/60, so it’s a creative use of what the spec supports to allow a higher resolution than envisaged when the spec was published.


You can daisy chain the Studio Display and Pro Display, and you could daisy chain the old Thunderbolt Display. Is that using some custom thing over Thunderbolt rather than MST?


Yes that’s Thunderbolt daisy chaining.


Yep. Super annoying. I have a Dell WD22TB4 which works great to drive 3 monitors for everything, except my Mac.


Can we please get monitors that can find their signal within 100ms?

Can we please get monitors that communicate their orientation (portrait/landscape etc.) to the computer?


This is truly disgusting. I bought a Samsung U28R55 and it takes over 15 seconds to switch between HDMI1 and HDMI2, HDMI2 and DP, or DP and HDMI1. When it loses signal it becomes unresponsive: the monitor menu disappears, any debug info disappears, and all you can do is push the joystick to switch to another video source. So if I need to switch between DP and HDMI2 it takes almost a minute, and the user experience is a joke.


The Apple Studio Display autodetects rotation and lets the connected computer know so it can adjust the picture accordingly, but it connects over Thunderbolt. It should be possible for monitors to do this over USB-C too, but I’m not sure about plain DisplayPort or HDMI.


sigh

Ok FINE I will move the goalpost

I want all that in a monitor that doesn't cost more than a typical modal monthly income in Europe


Not Europe but Latin America here, though even taking the US into account... man, the monitor market is so bad! Mass-market models are still stuck at low resolutions, 4K pricing goes way up, and there aren't many options.

I mean I get it's not the TV market, but it does feel more niche compared to 10 years ago.

The earliest used iMacs with 5K displays seem like consistent options on the affordable side though, and better than test-driving a possible lemon from DELL/HP/Spectre or a mid-range panel from Samsung/LG.


> it connects over Thunderbolt

Not exclusively—it supports regular DP Alt Mode, though no idea if those advanced features also work.


I2C (Aux) could do this.


> Can we please get monitors that can find their signal within 100ms?

TVs too—with various HDR standards, VRR etc becoming more popular these days I quite often find myself staring at a blank screen for 5-10s


With Smart TVs being the only option these days, I'm thinking about building my own TV from a computer monitor.


And don't go standby after 1s of signal loss.


I just like the little retaining clips, and the satisfying little click you hear/feel when plugging them in.


It’s a problem when the x16 slot is the bottom-most and the orientation is such that the clicky thing you must depress is on the bottom side of the DP connector when plugged in, and your case has a ledge after the PCI slots so you cannot realistically depress the latch.

Of course, no one would realistically encounter a case/GPU/cable combo like that. Right?


Your GPU should be in the x16 slot closest to the CPU because it has the most bandwidth.

tbh this sounds like a self-inflicted SFF build problem


If you have multiple GPUs in a box, you often have to use the bottom slot as well as the close one. (Passing through dedicated GPUs to different VMs is one case that I’ve experienced.)


> tbh this sounds like a self-inflicted SFF build problem

No, the error is in automatically assuming that the GPU should be the one to get the x16 port. This is a tower server; the x16 and x8 ports are for the SAS cards, network cards, and NVMe drives. The GPU takes whatever is left.


Hm, my sample for that situation clocks in at about 100% (n=1). Especially the first time I needed to unplug that cable I was thrown for a bit; I ended up taking the whole case off the floor and using a thin screwdriver to depress the release button from the exposed side, which did the trick.


I have a love/hate relationship with them for when the release tab is in a hard to reach area or doesn't release well, but that's usually the cable's fault. Love that tactile sensation of it being properly mated.


> A carefully shaped money-backed lever over the market is absolutely part of reason you never see DisplayPort inputs on consumer TVs, where HDMI group reigns supreme, even if the TV might use DisplayPort internally for the display panel connection.

Curious about this. Is the HDMI standards group engaging in anti-competitive behavior to prevent DisplayPort from taking over on TVs? I've always assumed it was just momentum.


> Is the HDMI standards group engaging in anti-competitive behavior

They don't need to, the whole industry is behind it:

> The HDMI founders were Hitachi, Panasonic, Philips, Silicon Image, Sony, Thomson, and Toshiba.

> HDMI has the support of motion picture producers Fox, Universal, Warner Bros. and Disney, along with system operators DirecTV, EchoStar (Dish Network) and CableLabs.


Another key is that HDMI carries more power than DisplayPort. There are dongles for your TV that can get all the power they need from an HDMI cable. Not so for DisplayPort.

When it comes to use cases, DisplayPort is better for connecting a fully independent device to a display, and HDMI is better for connecting a peripheral to a system.


> Another key is that HDMI carries more power than display port.

A quick look at Wikipedia's article for both standards shows that you're wrong:

  DisplayPort: 3.3 V ±10%, 500 mA
  HDMI: 5 V, 50 mA
Which means at least 1500 mW for DisplayPort, and only 250 mW for HDMI. So no, the DisplayPort cable carries more power than the HDMI cable.

And when you think about it a bit, it makes sense that DisplayPort carries more power than HDMI: one of the use cases for DisplayPort is an active converter to HDMI, which needs power for both its electronics and the HDMI cable.
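
For anyone who wants to redo that arithmetic, a quick sketch using the Wikipedia figures quoted above:

  # power available from the connector's power pin
  dp_w   = 3.3 * 0.500   # DP_PWR: 3.3 V at 500 mA -> ~1.65 W (about 1.5 W at the -10% tolerance)
  hdmi_w = 5.0 * 0.050   # HDMI +5V pin: 5 V at 50 mA -> 0.25 W
  print(dp_w, hdmi_w)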


I was just thinking about USB-C vs Lightning... What if the EU were to mandate HDMI over DisplayPort?


The EU generally chooses the more open and cheaper alternative. They did this with phone charging and car charging, and would probably do the same with display connectors.

USB-C is the cheaper and more capable connector.


I just wish there was only one connector. Either HDMI or DP. I don't care.


It'll be the USB-C port.


For laptops it's pretty clearly USB-C with DP alt mode, just way too convenient to use a display as a charging hub.

Desktop GPU manufacturers don't care and it doesn't really drive sales (Nvidia has completely removed it, and AMD's tends to be pretty buggy; it's really just DP over a USB-C port), so you need conversion add-in cards (usually from your mobo manufacturer).

Also, on the display side, standard connectivity seems to be 2x HDMI, 1x DP, and a USB-C if you're lucky (with a few cool oddballs providing 2x DP, 1x HDMI instead). It's pretty hard to go all-USB if that means you can't plug more than one machine into your display.


Many connectors in one. Is the video output the left port or the right one? How about the charging port?


Just make all ports fully capable?


You want the laptop to have 8 GPUs just in case someone plugs a display into every port?


That's (1) a strawman and (2) what do you care what other people want to do with their computers?


I think very few people will buy the laptop with 8 GPUs because it will be expensive. It's not a strawman - it's what you asked for. Or what did you mean instead?


GGGP did not suggest to add 8 GPUs. That was your invention. They wrote 'Just make all ports fully capable?' which could be accomplished with a single GPU and the limitation that you can only plug a display into one port (or maybe two if your GPU is dual display capable).


I recently switched the DisplayPort cable connecting my laptop to the monitor for USB-C and didn't see any difference, except now I get video (4K at 60Hz, the max for my monitor) and audio, can connect a mouse and keyboard directly to the monitor, and the cable is thinner and lighter. Am I missing something?


Don’t forget power delivery! Having a truly one-cable docking setup is a godsend if you need to switch between devices regularly.


DisplayPort and HDMI's max length is 15m, compared to USB 3.1's 3m.


That very much depends on how much bandwidth you're pushing through the cable, older versions of HDMI could tolerate long cable runs but HDMI 2.1 is also limited to about 3m unless you use an expensive active optical cable. DisplayPort 2.0/2.1 is taking forever to come to market but it's going to be similarly limited in its high bandwidth modes when it does arrive.


40G FRL5 (4K @ 144Hz) seems to work just fine for me with a 8m HDMI 2.1-certified cable (copper). But I had trouble finding longer ones from the same vendor which makes me suspect it is getting close to the limit.


They aren't that expensive... 100' is $72 on Amazon.

https://www.amazon.com/Highwings-48Gbps-Dynamic-Compatible-D...


I just wish the Google Pixel line supported display out over USB-C.


According to the recent leaks, Pixel 8 line will have DP out. So your wish might come true.


and USB-C allows the transmission of displayport over USB-C


It also allows the transmission of HDMI over USB-C, so you still have two competing standards


It allows it, but can you actually buy any products for it?

https://www.notebookcheck.net/The-demise-of-HDMI-over-USB-C-...

> True USB-C to HDMI adapters are no longer going to be a thing. The HDMI Alt Mode is more or less history, and DisplayPort has won. Notebookcheck spoke to HDMI LA and the USB-IF about it.

> HDMI LA said that it doesn't know of a single adapter that has ever been produced. Similarly, at the USB Implementers Forum (USB-IF), people who are familiar with the certification process have yet to see a true USB-C to HDMI adapter.


There is also an MHL alt mode. The HDMI alt mode was horrible because it used the whole USB-C cable as an HDMI cable, preventing any other use. MHL is HDMI but over a single pair.

My understanding is that all the USB-C to HDMI adapters are using DisplayPort because that is more widely supported by devices. And the conversion chips are just as cheap as MHL to HDMI.


There's an oscilloscope screen that does the conversion in software and was the bane of my existence last year.


> have yet to see a true USB-C to HDMI adapter

Not sure what this article means. I have a USB-C to HDMI adapter (made by Apple) and a USB-C to HDMI cable and both work.


The article talks about how adapters like yours use the DisplayPort Alt Mode protocol and then have a DP to HDMI adapter integrated. Rather than implementing the HDMI Alt Mode protocol which also works over USB-C.


There's also Thunderbolt over USB-C that does DisplayPort, but it doesn't support all the same versions as DisplayPort over USB-C. And there's also USB 4 but I don't know/understand if it changes anything.


I think the most applicable bit is that USB4 adds tunneling modes, aside from alternate mode.

I'm not entirely clear, but from my understanding alternate mode means the physical connection gets switched over, while tunneling means the tunneled data is sent over USB (so the communication on the wire is USB all along, you get nesting).


This is essentially correct, although the encapsulation format is really the Thunderbolt encapsulation format, which is only USB in that it is now officially defined in the USB4 specification.

USB4 hubs/docks for example, need to have the ability to translate from encapsulated DisplayPort to alternate mode for its downstream ports. USB4 hosts need to support both encapsulated and alternate modes in the downstream ports. The idea is that if a display device does not want to implement any other non USB 2.0 peripherals (2.0 has dedicated lines so it can support those), it can implement only alternate mode, (and not need to support the complexities of encapsulating USB4), plus all USB4 hosts needing to support DisplayPort means you know you can connect such a screen to any USB4 device and have it work (although supporting multiple screens like this is optional).

One thing to note, though, is that DisplayPort alternate mode has the option of reversing the device->host wires of the USB-C lanes and thus getting 4 lanes of host->device data, for 80Gbps at Gen 3 speeds if using both lanes.

USB4 V1.0 does not support lane reversal, and USB4 V2.0 can reverse only one bidirectional lane, since it still needs to support device->host data. I think this lane reversal is only possible when using the new Gen 4 speeds, which provide 80Gbps symmetric, or 120/40 Gbps asymmetric.


The problem is these two competing standards solve overlapping but distinct problem sets. DisplayPort was designed to be used for computer monitors, while HDMI was designed for home theaters.

This may not seem like a very meaningful distinction to most people, but it would become readily apparent to anyone trying to design a home theater around DisplayPort instead of HDMI. Off the top of my head, DisplayPort lacks any equivalents for ARC, Auto Lipsync, and CEC. Odds are your home theater makes use of at least two of these features, even if you don't realize it.


No vendors implemented it, so whenever you are currently doing HDMI over USB-C, it is actually HDMI over DP over USB-C.


Which is fine and should stay well-supported because HDMI is also here to stay. The trend for TVs is to include HDMI ports but no displayports.


The article mentions the death of HDMI altmode. Is it actually widely supported?


It seems not, I missed that HDMI altmode is still stuck at HDMI 1.4 and they're not making any attempt to bring it up to HDMI 2.0. DisplayPort indeed won in that case.


That's okay as well as long as it's consistent


I've always been surprised there aren't more receptacles out there that accept either a DisplayPort or HDMI plug. I've only seen it on one motherboard, but in retrospect, it seems obvious that the DisplayPort plug was designed to facilitate such a thing.


Are you sure you're not thinking of DP++ ? https://www.kensington.com/news/docking-connectivity-blog/di...

It's a DisplayPort connector, but the port itself can become an HDMI or DVI port purely with a passive adapter.


Do you happen to know what the model of the motherboard was? I've never heard of a receptacle that accepts both HDMI and DP


Unfortunately I don't, and it's a tricky thing to Google for, but I believe it was something industrial where space was at a premium (I work in robotics).


Oh, but there's a third. MiniDP. I have a pile of useless cables/adapters now, since Apple started using them but replaced them with USB-C back in 2015.



Oh, but that's not a different video interface.


Why don't you care if it's a worse one?


> in fact, the first time we mentioned eDP over on Hackaday, was when someone reused high-res iPad displays by making a passthrough breakout board with a desktop eDP socket. Yes, with DisplayPort, it’s this easy to reuse laptop and tablet displays – no converter chips, no bulky adapters, at most you will need a backlight driver.

I'm a bit confused by the linked project's PCB [1] - if all that's needed is a backlight driver, why all the differential pairs between the microcontroller and the eDP FFC?

[1] https://hackaday.io/project/369/gallery#9e4a0fef705befb8030b...


The schematics are available [0]. It looks to me like the MCU is doing some sideband control, but the DisplayPort signal itself is passed through unmodified.

[0] https://github.com/OSCARAdapter/OSCAR


I was very disappointed that HDMI transfers a constant bit-rate stream of pixels - VGA, but digital. I expected that, given most display devices can at least hold a full frame in memory, the stream could be limited to the parts of the image that change, allowing higher frame rates when less than the full frame changes.

That, and generating a signal could be much simpler than bit-banging on a DAC.


> Plus, It’s Not HDMI

This. HDMI and its cartel that profits from its patents are super annoying.


> However, if you don’t need the full bandwidth that four lanes can give you, a DP link can consist of only one or two main lanes resulting in two

This is how many USB C docks operate. The USB C connector also has four high speed lanes and there's a mode where two are assigned to carry DisplayPort data and two are assigned to be TX and RX respectively for USB. Until DP 1.4 appeared, this meant you were quite limited in display resolution if you didn't have Thunderbolt and wanted faster than 480mbps data speed. With DP 1.3, two HBR3 lanes can carry 12.96 Gbit/s which is almost exactly the requirement for 4k @ 60Hz at 12.54 Gbit/s. DP 1.4 adds compression on top of this. One more beauty of DisplayPort is that it's point-to-point, so it's entirely possible your USB C hub will carry the display data from the host to the hub as compressed HBR3 data over two lanes and then hand it out to two monitors over four uncompressed HBR2 lanes to each, so a modern USB C hub can, without Thunderbolt, drive two 4k @60Hz monitors and still provide multiple gigabit speed data. It's a very neat trick. This needs full DisplayPort 1.4 support including DSC in the host; for integrated GPUs in laptop CPUs this means AMD Ryzen 4000 and newer or Intel Tiger Lake and newer (older laptops with discrete GPUs might have had it, too).
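
Rough math behind those two numbers, as a sketch (the 8b/10b overhead and the reduced-blanking pixel clock are my assumptions, not quoted from the spec):

  # two HBR3 lanes vs. the 4K@60 requirement
  hbr3_lane = 8.1                     # raw Gbit/s per lane
  two_lanes = 2 * hbr3_lane * 8 / 10  # 8b/10b coding -> 12.96 Gbit/s of payload
  pixel_clock = 522.61e6              # assumed 3840x2160@60 CVT-R2 reduced-blanking timing
  uhd60 = pixel_clock * 24 / 1e9      # 24 bpp -> ~12.54 Gbit/s
  print(two_lanes, uhd60)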

Handy tip: if your hub is DP 1.4 and drives multiple monitors then it's most likely using a Synaptics MST hub to do this (almost all non-Thunderbolt ones do and many Thunderbolt ones as well) and Synaptics provides a very very little known diagnostics tool called vmmdptool (available in the Windows Store). It doesn't replace a full DP AUX protocol analyzer of course but it's free and for that price it's really handy.

This topic is dear to me because I have fixed the USB C specification related to this and allow me to be damn proud of that: it used to erroneously say the USB data speed in this mixed functionality was limited to 5gbps but it is not, the limit is 10gbps. https://superuser.com/a/1536688/41259

Ps.: When I say Thunderbolt, I am well aware of how Thunderbolt 4 is just USB4 with optional features made mandatory. It's not relevant to the discussion at hand.

Pps.: DisplayPort is the native video standard for USB C, C-DisplayPort adapters and cables are basically passive because they just need to tell the host how to configure the connector lanes. Meanwhile all USB C - HDMI cables and converters are active which constantly work on the DP signal to become HDMI. DisplayPort++ alas is not implemented with the USB C connector. For this reason if any compatibility issues arise it's always better to connect a USB C device to the DisplayPort input on a monitor. A HDMI alternate mode was defined in the past but it remained paper only and it has been declared dead this year at CES.


> Ps.: When I say Thunderbolt, I am well aware of how Thunderbolt 4 is just USB4 with optional features made mandatory. It's not relevant to the discuss at hand.

Dear God, I hope this situation settles down in the near future. As it is I have years of USB-C-looking cables that all do different things but are visually indistinguishable.


I wish the manufacturers would just adopt the USB IF marketing names and logos. https://i.imgur.com/H3unbD5.png would be a lot simpler.

I also wish the USB IF defined colors for high speed lanes absent vs 5/10/20 gbps capable high speed lanes and then 60W/100W/240W power. All it needed were two color bands on the plastic part of the plug. If colors are too gaudy then go Braille-style, have a 2x2 grid on top and bottom of the plug where bit 0 is a little hole and bit 1 is a little bump. That's 16 possible data speeds and 16 possible power levels and so far we have only needed 4 for data and 3 for power.

Intel could've added a separate row for Thunderbolt.


If you get a thunderbolt 4 cable it can do everything, at least until the 80gbps one comes out


Be nice if they weren’t 10x the price of normal USB-C cables.


You have to pay to play. Otherwise you get cables made on the cheap by Chinese manufacturers who cut corners


What is the benefit of adding compression here? Does that mean that the bandwidth might or might not be sufficient depending on which pictures are being shown on the screen?

If so, that doesn't seem very reliable and if not, what's the point of compression?


> DSC is most often used in a Constant Bit Rate (CBR) mode, where the number of coded bits generated for a slice is always a constant that is generally equal to the bit rate multiplied by the number of pixels within the slice

https://vesa.org/wp-content/uploads/2020/08/VESA-DSC-Source-...


Ah, so basically this is typically used as lossy compression with a constant bit rate, which makes the bandwidth usage predictable/constant but depending on the pictures being shown (and the noise level?), it can lead to some visual degradation (which they say is usually imperceptible).

That's interesting and surprising to me.


If you consider that ~1:18 is considered visually lossless for x264, a 1:3 compression can very, very likely be achieved visually losslessly indeed.
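
For scale, a sketch of where that 1:3 comes from (the 8 bpp figure is a commonly cited DSC target, treat it as an assumption rather than a spec quote):

  # DSC compression ratio at a typical constant-bit-rate target
  uncompressed_bpp = 24   # 8-bit RGB
  dsc_target_bpp   = 8    # typical DSC CBR target
  print(uncompressed_bpp / dsc_target_bpp)   # 3.0 -> the 1:3 mentioned above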


Is it not compressing an already compressed image though?


No, without DSC the host sends raw pixel data over DisplayPort.


I personally prefer HDMI or DVI. I don't like DisplayPort because, on the display that I have, when I turn off the display the computer detects it as if I had physically disconnected it, which then moves around all the windows that I had open and breaks my macros...


I /think/ this is related to the EDID support or configuration.

What I do know is that you need specific EDID support when using a DisplayPort KVM because that switches the computer away on a regular basis. If you have multiple screens and a single screen KVM (fairly common) it will do the re-arrangement that you've experienced. With EDID support it keeps both output alive resulting in no re-arrangement.

EDIT: It seems the correct term may be "EDID emulation".
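
For the curious: the EDID itself is just a small binary blob the monitor exposes, and "EDID emulation" means the KVM keeps serving a copy of it even while the real monitor is switched away. A minimal sketch of peeking at its vendor/product fields (the /sys path and connector name are assumptions and vary per machine):

  # read the base EDID block that a connected DP sink exposes (Linux example)
  def parse_edid_header(edid: bytes):
      assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID base block"
      mfg = int.from_bytes(edid[8:10], "big")
      vendor = "".join(chr(ord("A") - 1 + ((mfg >> s) & 0x1F)) for s in (10, 5, 0))
      product = int.from_bytes(edid[10:12], "little")
      checksum_ok = sum(edid[:128]) % 256 == 0
      return vendor, product, checksum_ok

  with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:   # hypothetical connector name
      print(parse_edid_header(f.read()))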


> If you have multiple screens and a single screen KVM (fairly common) it will do the re-arrangement that you've experienced. With EDID support it keeps both output alive resulting in no re-arrangement.

Can you tell me some KVM switches that have a working implementation of that feature? I'm looking for one.

I'm still using a DVI KVM, because when I tried to upgrade to Displayport (years ago), I ran into the problem you describe and gave up.


I have a TESmart (TES-HDK0402A1U-USBK) from https://www.amazon.com/gp/product/B08259QL5J/ref=ppx_yo_dt_b... which works fine in this regard. This was probably the 4th DisplayPort KVM I tried due to keyboard issues with the others.


I'm looking too, but that one linked has HDMI out only, with DP and HDMI in. Don't you lose the benefit of DP with this setup?


It has been OK for my needs, but I think you are best getting the KVM and returning if it doesn't meet your needs. I'd probably be more concerned if this was for my photography or gaming machine. For a work setup it has been fine for me.


Sounds like an OS problem. When I do that it moves all open windows to the remaining display, but when I turn the other display back on it just moves them back. No special software installed for this, just stock macOS Ventura.


But when I turn off my displays that are on HDMI or DVI or VGA, the computer still detects them like normal, like they're connected, because they are.

I don't understand in which case the 'act like it's physically disconnected' behaviour would be more desired than what we had with all the standards before.

I have read that some DisplayPort displays do have an option in the settings to disable this behaviour.


> But when I turn off my displays that are on HDMI or DVI or VGA, the computer still detects them like normal, like they're connected, because they are.

But since they’re off, that doesn’t make any sense: now you can’t reach any windows on those monitors. I think the macOS behavior makes the most sense: move them so you can access them, but move them back once the display is turned on.


Or, you know, don't do any of those silly dances and just let the windows stay where they are the whole time between turning the displays off and on?

Reminds me of that problem with video driver updates on Windows, when the screen is momentarily resized down to 1024x768 and then instantly goes back to 2560x1440: all the non-maximized windows get shrunk down and shifted to the upper-left corner (so they would be visible on a 1024x768 screen) and then just stay like that. It's totally useless and actually quite annoying.


This nonsense is solved in Windows 11


Sounds like a Windows problem once again...


> But since they’re off, that doesn’t make any sense, now you can’t reach any windows on those monitors.

Perhaps I want to turn off the displays to darken the room.

I don't want anything done with the programs/windows except not look at them.


It's not an OS problem, that's how DP is supposed to work and that's in fact stupid


It’s an OS problem for not handling monitors being disconnected and reconnected properly.

What is stupid is having half your windows unreachable because they are on a monitor that is turned off. How does that help anyone?

Imagine how annoying it would be on a laptop. I use my laptop in a meeting and have a few windows open on its screen, then I arrive at my desk and plug in my 2 external monitors and keep my laptop lid closed. Imagine if the windows on the laptop display stayed there: they would be unreachable, and I would have to manually move them to my other screens every time I moved from a meeting to my desk. Same for the other way around: I disconnect from my monitors and take my laptop to a meeting, and then I can’t access any of the dozens of windows I had open on my 2 external monitors.

That would be an absolutely brain dead way of working.


>Imagine how annoying it would be on a laptop?

>That would be an absolutely brain dead way of working.

For starters: the experience I want on my laptop is not the same as the experience I want on my desktop. I have two monitors at home, sometimes I like to turn off the wing-monitor when I am watching a movie on the center screen. (To avoid distraction, and also minimize glare, etc.) That doesn't mean I want those windows rearranged: that's probably going to interfere w/ the movie that is playing, fuck up my desktop wallpapers, icons, window sizes, etc. The whole point of turning off the monitor is to hide those distractions.

Also not all window managers suck as badly as Mac OS and Windows. By default I have odd-numbered virtual desktops go to the left montior, and even-numbered virtual desktops go to the right monitor. If I want to move a viewport to another monitor, I renumber it accordingly, and all the windows move to that monitor: complete w/ their proportional positions and sizes.

The idea that _a device hotplug event_ would change how my virtual desktops are laid out is so absurd to me that I switched operating systems to avoid the default Windows behavior. So maybe consider that paradigms and workflows other than "a docked laptop" exist before calling people braindead?


> that's probably going to interfere w/ the movie that is playing, fuck up my desktop wallpapers, icons, window sizes, etc

That sounds like a Windows problem.


You are talking about physical monitor unplugging, while others are talking about monitor power off, but still plugged in.

I agree that when monitor is physically removed, the windows should go back.

But if it is just powered off but still plugged in? I say windows should stay where they are, just like with HDMI. For example, I sometimes turn off monitors for fewer distractions during regular conversations, or to save power before leaving; in those cases, windows jumping all over the screen are the last thing I want.


Makes sense to move the windows if you close a laptop lid, but not if you turn off a monitor. You can't disconnect the laptop screen, but you can easily disconnect the monitor.

I have a cheap laptop that doesn't understand a turned-off monitor as unusable, and it's actually way nicer that way.


Why is it nicer? What is the use case for having unreachable windows?


But my windows aren't getting messed up, once I turn the monitor back on it restores the previous state.

And the use case is simple: I have 2 monitors hooked up to my machine but I don't always need 2 monitors. I have a 34" 5k2k ultrawide and a 27" 4k in portrait mode. When I'm coding or using my computer for an extended amount of time I turn both on, but when I just want to quickly write an e-mail I only turn on the main monitor. I mostly use the ultrawide and have the portrait monitor to the side to dump documentation and other materials I need for quick reference on. Right now it's 29ºC in my room and I don't want to turn on more equipment than needed.


This seems like a very specific situation, and you could disconnect the second monitor for it.

Not sure about the reliability of window restoration. It's at least better in the latest macOS than in older versions.


Dive under my desk and unplug the monitor from the Thunderbolt 4 hub or just press the power button.


So that I don't see them while they still stay where they are! For example, preparing for a slide-show is one example: move PowerPoint to the second screen where it'd stay even if there is actually no second screen.


Not having your windows get messed up when you turn the monitor on/off, which is especially relevant for TVs and projectors. On the flip side, what's the use case for connecting to a powered-off monitor and not using it? If I don't want to use a monitor, I won't connect it.


Is this specific to DP? I thought HDMI was the same.


That's a failure of the display or your GPU, not necessarily the port. I've got several monitors with DisplayPort on a few different GPUs that don't exhibit this behavior. It's not something inherent to DisplayPort.


While not specific to DisplayPort, I've been really impressed with the AV Access 8K KVM Switch [1]. 4K at 120Hz without chroma subsampling was a hard requirement.

I use it with macOS Ventura and a Windows 11 desktop. It works nicely in conjunction with a Dell Thunderbolt 3 dock to power an LG OLED and support additional accessories. And it has EDID emulation, which is crucial for maintaining consistent display arrangement.

I've tried other DisplayPort KVM switches with mixed success. This is the first one that's worked out of the box.

[1] https://www.avaccess.com/products/8ksw21-kvm/

[2] https://www.amazon.com/dp/B0BPRZPFM6?


The only downside with DP is just how few devices actually have it versus HDMI / Mini HDMI.

This post, though, has made me annoyed, as DP is clearly the better standard.


Agree with you there. I'd be much happier if my Raspberry Pis, cheap USB-C hubs, cheap monitors, etc. all had DP instead of HDMI.

It's extra annoying when you realize that you're paying more for royalties and sometimes additional hardware (e.g. in a USB-C hub that uses DP internally but converts to HDMI for the output) just to get an inferior interface.


Honestly, it's going to be one of those things I champion now whenever it comes up in conversation haha. Very solid points.


Don't forget, DisplayPort is worldwide royalty-free, unlike HDMI (a similar issue to MPEG/AV1, ARM|x86/RISC-V, etc).


The reason why I switched to DisplayPort is that it’s the one port that supports higher refresh rates on my Dell monitors. I didn’t realise it was more “friendly” than HDMI (which comes in various quality standards).


A friend of mine had problems with his monitor, it would go into standby randomly when running certain games.

I asked him for details about his setup and I was pretty confused when he said the connector at the back of his monitor had screws but I saw the Nvidia control panel reporting HDMI.

Turns out he was using an HDMI to VGA adapter! (The GPU was outputting HDMI and the screen consumed VGA.)

I just told him to go buy a display port cable (since his GPU and monitor both had it) and he had no problem since.

Does anyone have insight about why the adapter could cause monitor standby in certain games ? It is really a peculiar issue.


Huh, I have the same issue with pure HDMI. No clue what the reason is.


They specifically mention 3 distinct DP connectors - standard, mini, USB-C - plus a bunch of Apple versions of them. That's NOT a feature and is exactly why I prefer HDMI. I don't care about most DP capabilities, I just want to plug in my stuff and have it work. I use 55" 4k TV for a monitor and it works, even with the open source AMD driver. I saw nothing in TFA to make me think DP would benefit me, even if some technical details are better.


But, HDMI has multiple connectors too.

HDMI, Mini HDMI, Micro HDMI, USB HDMI Alt-mode and there are also some less popular ones.


> A carefully shaped money-backed lever over the market is absolutely part of reason you never see DisplayPort inputs on consumer TVs, where HDMI group reigns supreme

What does this actually mean?


HDMI requires a license, DisplayPort doesn't. I'm sure there's money changing hands in the TV sector of most display manufacturers.


If we lived in a just, and virtuous world we'd all be using cheap coaxial cables with BNC connectors via whatever version of SDI is the highest bit rate.


SDI can't do any communication between the source and sink. You can do a limited form if you use two SDI cables (one upstream, one downstream) which is what stuff like BlackMagic's bi-directional micro converter does, but IIRC that's only used for talkback to the camera operator and HDMI CEC messages which is how they implement things like their ATEM Mini consoles triggering recording - not for DDC-style format and framerate negotiations.

Also, device manufacturers don't do SDI on consumer devices because SDI is by definition unencrypted and uncompressed, so it's at odds with HDCP.

[1] https://www.blackmagicdesign.com/products/microconverters


BMD has done some innovative things through their SDI port, specifically enabling camera control with an Arduino-add-on board.

Someone finally did what I've wanted to see for years: built a follow-focus unit that controls the focusing motors in the lenses, so you don't have to bolt a ridiculous contraption (and janky focusing-ring adapters) onto a lens to turn its (non-mechanical) focusing ring manually.


What I want to see is someone dump the FPGA in these Micro Converters and hack it to add a USB interface. The Micro Converters are all basically the same inside: SerDes units and clock/redrivers on the I/O ports, and an FPGA that additionally has a USB connection which is used to power the converter but could also be used to do other things.

That FPGA should be powerful enough to do a lot of interesting things - anything from fooling around with commands (like with the Arduino board) to image manipulation (e.g. a watermark embed).


12G-SDI of 2015 is the first SDI revision to surpass DVI from the 90s in terms of bandwidth. 24G-SDI, which seems to only exist on paper so far, has about one quarter the bandwidth of a DisplayPort 2.0 link.

It's unsurprisingly annoying trying to compete with a bundle of cables using only one wire.
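
Putting rough numbers on that, as a sketch (these are raw line rates with payload overheads ignored, and the dual-link DVI figure is my own calculation):

  # raw link rates in Gbit/s
  sdi_12g  = 11.88
  sdi_24g  = 23.76                     # paper-only, per the comment above
  dvi_dual = 2 * 3 * 165e6 * 10 / 1e9  # 2 links x 3 TMDS channels x 165 MHz x 10-bit symbols = 9.9
  dp20     = 4 * 20.0                  # UHBR20, 4 lanes = 80
  print(sdi_12g, sdi_24g, dvi_dual, dp20)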


That would suck. I'd need at least 4 coax cables for each monitor.

Btw DP cables are often twinax and eDP cables are often microcoax.

In a just, and virtuous world we'd all be using cheap SMF cables with LC connectors


> If you see a DisplayPort interface on your board where you’d usually expect HDMI, that might just be because the company decided to avoid HDMI’s per-device royalties, and do firmly believe that this is a good thing.

Wikipedia says DP 1.3 requires "mandatory Dual-mode for DVI and HDMI adapters". Does this mean that all DP 1.3 ports must output HDMI (and presumably pay HDMI licensing fees) as well?


HDMI predates DP by 5 years, switching wouldn't enable any specific use case, and that 5 years head start means HDMI has market inertia.

Yes we should use it because it's better but the practicality? There's so much gear that's HDMI that would have to be replaced or require new active adaptors. It's so much industry and consumer effort for such a marginal gain.


My monitor is fairly old, but it only does 30Hz over the HDMI cable and 60Hz over the DisplayPort cable. I'm not sure of the exact versions; I think it's DisplayPort 1.2 and HDMI < 2. So for me, it was super important to get DisplayPort out.


> HDMI predates DP by 5 years...

- HDMI v1.0 initial release = 2002-12-09

- DisplayPort v1.0 initial release = 2006-05-01

To be sure, just under 3.5 years; HDMI rolled into v1.3 a month after DisplayPort v1.0 saw the light of day. Agreed on the impact of market inertia.


HDMI's marketshare has nothing to do with its headstart: HDMI is backed by the entire entertainment industry and was designed for televisions, while DisplayPort is backed by VESA and other computer hardware manufacturers and was designed for computer monitors.

For every computer monitor there are hundreds if not probably thousands of televisions.

HDMI's marketshare has everything to do with who the players involved are and just how much weight they have to throw around. Even computer hardware generally has more HDMI ports than DisplayPort ports.


Forget all the other pros/cons - MST makes it worth any industry & consumer effort alone.


How many people use multiple external displays? For the power users who do, there are already good enough solutions.


I don't have data on this but if my anecdata is anything to go by, power users aren't your typical multi-monitor users...


HDMI also supports a lot of home theater specific features that DisplayPort does not.

DisplayPort never could have replaced HDMI because DisplayPort never tried to solve the same problems HDMI did.


Praise displayport all day if you want, but the connector sucks and the protocol is too complicated (protip: when something in IT is called "powerful" you probably want to steer clear of it).

It is much more convenient and practical to have a simple data stream like VGA and DVI have.


I hate HDMI; I've had many problems with this interface. In my previous job my monitor would turn off when I got up from my chair, and the same was happening at home with a different PC, cables, and monitor. Such a thing has never happened to me since I changed to DisplayPort.


Good and interesting video to explain the version diffs in spec: https://www.youtube.com/watch?v=Vn2vdQZhs0w&ab_channel=Linus...

And even better, a calculator based on your monitor capabilities that helps you choose: https://linustechtips.com/topic/729232-guide-to-display-cabl...

I ended up with a $15 2.0 10 foot cable from Monoprice. Fit my ultra wide Samsung G9 pixel pushing needs just fine.


The article missed the most important advantage of DisplayPort over HDMI: you can get locking connectors.

I joke of course, but of the two standards bodies, VESA always feels like it was less interested in playing the DRM game and more interested in making a useful standard.


It’s mentioned towards the beginning:

> Physically, desktop/laptop DisplayPort outputs come in two flavors – full-size DisplayPort, known for its latches that keep the cable in no matter what, and miniDisplayPort

But yeah, the latches are great.


Sorry if that sounds stupid, but why isn't Ethernet used for this sort of high bw signaling?


Because (consumer) Ethernet is low bandwidth compared to what's needed for video. DisplayPort 2.0 provides an 80Gbps link. HDMI is 48Gbps. Both already use DSC, which is a mezzanine compression, as this bandwidth is already saturated for immediate needs.

Ethernet is absolutely used for media transport outside of the consumer space. Particularly where there's a need to distribute signals further than what DP or HDMI can provide. The challenge is this always comes with a trade off. Any compression will result in some mix of a loss of image quality, increase to latency, and cost. What that mix looks like differs across applications.

See https://sdvoe.org/, https://ipmx.io/, https://www.intopix.com/flinq, https://www.aspeedtech.com/pcav_ast1530/ for more in that space.

https://hdbaset.org/ is the other key tech. That uses the same PHY as ethernet, but is a proprietary signal.


The current Ethernet spec supports up to 1000 Gbps, but I think 40 Gbps is about as fast as you can find on the market right now. They could build multi Gbps wifi routers too, it's just not common in the consumer market.


Even 10Gb ethernet isn't suitable for consumers. 40Gb is practically only fiber which has never been successful in homes since it requires more careful handling.


It should be.

As soon as video data is in packets, it should be routable over a network like any other data.

Give up the custom connectors, custom cables, etc.

Need a wireless display? No worries, we can route the data over wifi too.

Need 5 displays? Thats what an ethernet switch is for. Screen mirroring? We have multicast.

Need a KVM? Well, you can probably write a few scripts to change which computer your screen gets its data feed from.

I believe this hasn't happened simply because audio/video people like going to conferences to design their own standards every year, keep their licensing royalties, keep their closed club - a software-only solution over any IP network isn't going to fly.


They write that DP uses packets, but it's still an isochronous connection with guaranteed BW allocation.

For video timings with high pixel clocks, the bandwidth used to transfer a signal is very close to what's theoretically available. There's no way you'd be able to do that reliably over something like an Ethernet cable.

There's nothing sinister about how video standards are designed. From DP 1.0 to DP 1.4, which spans more than 10 years, all changes have been incremental, and most were just a matter of increasing data rates while the coding method stayed the same.

They need a system that's very high and guaranteed bandwidth, high utilization, very low bit error rate, and cheap.

Even today, a 10Gbps Ethernet card will set you back $90. And that will carry less than half the data that can be transported by DP 1.4 which is old school by now.
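
The "less than half" is easy to sanity-check (a sketch; the 8b/10b overhead for HBR3 is my assumption):

  # DP 1.4 (4 lanes of HBR3) vs. 10 Gigabit Ethernet
  dp14_payload = 4 * 8.1 * 8 / 10   # ~25.9 Gbit/s after 8b/10b coding
  ten_gbe      = 10.0
  print(ten_gbe / dp14_payload)     # ~0.39, i.e. less than half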


Have you tried VNC over an Ethernet cable at all? What you are proposing is basically having a VNC-like protocol over IP over Ethernet. Now try it out and see how well it works.

To be even more reductive, a thought experiment is to consider the classic USB. Why did people even invent USB in the first place in the 1990s? The first USB had only 12Mbps, not much better than the first Ethernet at 10Mbps which had existed since 1980! Why didn't people who invented USB simply use Ethernet?


The main delay in VNC is the software stack encoding and decoding the data.

A video-over-ethernet solution would likely have the GPU directly packetize the data into IP packets and route them direct to the network hardware without the CPU touching anything. The network would have QoS to guarantee the necessary bandwidth and guarantee a fixed latency and no packet loss (ie. there is no chance of a packet being dropped due to a queue overflowing, because we have already arranged a fixed bandwidth allocation for the whole network route).


Naw it's because >=40G Ethernet never saw much consumer adoption. QSFP DACs and optics are also kinda bulky by consumer standards so a lot of people would want a new connector. Even LC connectors would be too large for laptops.


The bandwidth is much too low.


If this were true, then people would be running datacenter networking over HDMI.

Data is data - there is no benefit to designing two entirely different transport mechanisms for video vs other data.

You could argue that video data has a deadline to meet - but Ethernet already has lots of mechanisms for QoS and bandwidth reservations to make sure that someone's torrents don't interrupt something latency sensitive. Sure, they aren't widely supported over the internet, but between your PC and your display they can be engineered to work properly.


> If this were true, then people would be running datacenter networking over HDMI.

Why would they? If they need more bandwidth than what Ethernet provides, they'd probably just bite the bullet and go with fiber.

...which punts the previous question to why the industry hasn't gone all-in with fiber optic display connectors, like it has with TOSLINK for S/PDIF in the audio world.


TOSLINK/S/PDIF is way out of date and no one uses it any more. Ironically, because it doesn't have enough bandwidth for things like Atmos.


Not entirely true. My house was built with an A/V closet that's not in the living room. To put analog equipment (like a turntable) in the living room where it's used, I convert the analog signals into digital with a $100 A/D converter that sends them over CAT-6 under the house to a decoder that connects to the receiver with Toslink. It works great, and I bought this in the last three years.


Sure, for PCM stereo content it still works perfectly fine. Which is a lot of stuff! But it'd be a poor choice for 5.1 or better today.


I was curious about that; because I couldn't be bothered to do the math myself, I looked up the max bitrate of multichannel PCM. Found this at https://www.videohelp.com/hd :

"Linear PCM up to 9 channels (Max 27.648 Mbit/s)"

Cat5 or higher (especially only 20 feet) would easily handle that.
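
Right, the margin is huge; uncompressed LPCM is tiny by network standards. A quick sketch (the channel/rate combinations are just illustrative, and reading the quoted max as 6 channels at 192 kHz is my assumption):

  # uncompressed LPCM bitrate = channels * sample rate * bit depth
  def lpcm_mbps(channels, rate_hz, bits):
      return channels * rate_hz * bits / 1e6

  print(lpcm_mbps(2, 48_000, 24))    # ~2.3 Mbit/s, plain stereo
  print(lpcm_mbps(6, 192_000, 24))   # 27.648 Mbit/s, matching the quoted max
  # either fits comfortably even in a 100 Mbit link, let alone gigabit over Cat5e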


Ethernet over fiber is still ethernet.


TOSLINK was a gimmick. In reality, optical transceivers are expensive and are only used when copper can't do the job.


HDMI has been used for stacking ports for years... I think it's gone out of fashion, though. https://www.dell.com/support/kbdoc/en-us/000120108/how-to-st...


HDMI is severely length limited. Ethernet has no millisecond level latency guarantees. Data isn't always just data.


25 Gigabit Ethernet costs ~100x more than 32 Gigabit Displayport.


Sounds like the real question is why don't people use DP for p2p networking links...


There’s nothing magic about DP cables, they’re just typically a short-ish length, which is why they can be high bandwidth and relatively cheap. Once you get to a decent length, the cables get really expensive because you have to put optical transceivers on either side. I have a 25 foot DP cable which cost over $100, for example.

So you could do a short length, high bandwidth Ethernet cable, I’m sure. But the reason we don’t is probably because differentiation between connectors is desired - consumers are dumb, frankly. They’ll think any old Ethernet cable will work. Just look at what kind of a mess exists with USBC.


The commonality of connector is important. Ethernet has stopped using twisted pair cables for higher speeds. The SFP transceiver means you can use a short DAC copper cable or fiber optics for longer distances. The DAC cables that would compete with DisplayPort are pretty cheap.

Also, a big portion of the cost of networking is in the switch that can handle high bandwidths. My guess is that 25G DisplayPort switch would be just as expensive as 25G SFP one.


Yeah, my original comment was based on the "100x more expensive" assertion. a 5m DAC cable is $40 on Amazon, and 25GBe cards can be had for under $200. I think it's probably more than $2.40 for just a 5m DP cable, so 100x was a gross exaggeration.


It's unidirectional and short reach.


Better compression would surely help.

The cloud gaming people are telling us they can deliver 4K 60fps video over 35Mbps.
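
Though that implies a far heavier (and far from lossless) compression ratio than DSC's ~3:1. A rough sketch, ignoring blanking:

  # raw 4K60 RGB vs. a 35 Mbit/s stream
  raw_gbps = 3840 * 2160 * 60 * 24 / 1e9   # ~11.9 Gbit/s of active pixel data
  stream_gbps = 35 / 1000
  print(raw_gbps / stream_gbps)            # ~340:1, vs. ~3:1 for DSC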


Dante AV does this. I'm sure it's expensive, but it exists fro commercial applications. https://www.audinate.com/products/manufacturer-products/dant...


With HDMI, I have both video and audio traveling over a single cable. That is _extremely_ convenient. DP _theoretically_ supports audio too, but AFAIK that's not widely implemented.


> That is _extremely_ convenient.

I'm open to having my mind changed but I have not really experienced this extreme convenience in practice. The vast majority of things I plug a HDMI cable into either don't have speakers or if they do, the speakers are a bad quality afterthought.

In actual fact, what has been an extreme inconvenience is the OS thinking I want HDMI audio instead of whatever much higher-quality* alternative output I actually want to direct my audio to.

The only time I've ever gotten high-quality audio output over HDMI is via ARC, which says a lot about the need for HDMI audio...

* When I say "much higher-quality", I don't mean HDMI is a low-quality audio transport, I mean the HDMI output devices more often than not have inferior audio output to some other bluetooth / 3.5mm jack device I am using.


Audio over HDMI or DP has never been anything but a nuisance on my Windows workstations. Windows finds a way to set the monitor's speaker/mic as the system defaults at least once a month for me. After every GPU driver update I end up disabling the display's audio in Device Manager. Hrm... maybe I could write a script to do it at boot.

I think it could make sense in budget friendly setups, but... personally I'd pay extra to not have to deal with the nuisance and security risk of microphone input I can't physically disable.


It was very convenient for me to have my headphones plugged into the monitor. Made it so I could plug a single cable over to my laptop and have basically everything else already done.


Every monitor I've ever had has had terrible coil-whine from the headphone jack. I always try them for exactly this reason, and get super annoyed each time. It didn't have to be this way!


I have this problem on my gaming monitor, thankfully it also has USB-A ports so a cheap adapter has solved it.

I have no idea if there are downsides to audio over USB-A but for my fairly basic use case (“being able to hear things and not hear coil whine”) it works pretty well.


Depends what your usb device is, all told. I have a USB C amp for audio that really is quite nice.


> headphones plugged into the monitor

That sounds cool but I've never seen a monitor with a jack. Mine all have many USB ports, but not audio. How common are they?


An audio jack is very common. Some even come with speakers. They are usually not very good but they get the job done in a pinch.

I use 2 computers connected to the same monitor via hdmi and use barrier (and a script) to switch screens. The audio for the correct monitor gets activated automatically. It's very convenient.


All the dell midrange monitors I've bought (IE, $400-$1K) have audio jack for headphones.


Interesting you say Dell, as most of mine are Dell (3 Dells + 1 MSI), same price range, 2 bought very recently, no audio.

I wonder if it's a regional thing (I'm in the EU).


I have a 144Hz 1080p AOC monitor and a secondary generic LG one, both have an audio jack


I have a 3.5mm audio jack on a cheap 4k LG monitor I've been using for years


HDMI was designed to accommodate home theaters. That usually means a sound system with good speakers. Not every user needs this expanded feature set, but enough do that DisplayPort does not make a good alternative.

Here's where ARC/eARC really benefit me, and I don't see how this problem could be solved with DisplayPort.

I have a PC and a PS5 that support Variable Refresh Rate (VRR) over HDMI. I bought a new TV last year that also supports VRR, but I did not want to replace my perfectly adequate 3 year old receiver that does not support VRR. Even today, VRR support on receivers is questionable at best, and most users likely want their VRR devices plugged directly into their TV.

Thanks to eARC, I can plug my PS5 or PC directly into my TV and have it send the PCM 7.1 surround sound audio to the receiver over HDMI. I can still leave other devices like an Apple TV or Blu-Ray player plugged into the receiver, and everything just works.

Without eARC, I would have to fall back on TOSlink. That means extra cables, and dropping down to compressed Dolby Digital 5.1 if I want to keep surround sound. Using Dolby Digital on a game console incurs a pretty noticeable latency penalty, which is why they all default to PCM.


I have never once used HDMI audio for anything but an A/V-type experience. Monitor speakers are generally garbage, monitors generally don't support Bluetooth for audio connectivity, their headphone jacks are often in hard to reach places, and I have yet to find one that supports a microphone. Just easier to go directly to the machine for audio.


One extremely convenient situation for me is getting surround sound from the TV to the AV receiver in a single cable.


I have an audio extractor that conveniently passes the sound from HDMI to my desktop speakers. The extractor is connected between my HDMI input switcher and the [primary] monitor, so whichever device I'm currently using outputs the audio to the speakers.


I use my PC to drive a TV for gaming and all I have to do is switch the sound output in Windows from Speakers to TV and I'm good to go. No need for splitters and multiple cables to get sound to both devices. The only thing I would change is to have Windows support outputting audio to both devices regardless, but keeping the settings menu open to the right page isn't really an inconvenience.

All of my consoles work the same way: just run a single HDMI cable for each one.

The counter point might be that my TV would have "bad audio." But I don't think so (at least not at the volumes I prefer), and even if it did, it supports audio out to connect the TV audio to a separate hi-fi.


TV connection is a pretty big use case where audio matters.


I have 3 computers plugged into my 4K TV (monitor) via HDMI, and the TV goes to a DAC via optical. Works fine for audio+video at 4K 60Hz.


I plug my computer into my AV receiver all the time, and my speakers are worth more than my TV.


If your receiver doesn't support eARC, you are definitely not getting full resolution audio.


I don't use ARC at all (all audio sources go through the receiver), so that's a non-issue for me.


That's fine. Let me rephrase: if you're not using the latest and greatest HDMI protocol you're not getting full fidelity audio.


Besides the DTS passthrough (which works just fine), it supports 48kHz PCM from my media-center, so I don't know what more is needed.


Again, that is fine. But beyond an uncompressed 2-channel signal, anything above 2 channels is going to be compressed, with frequencies carved out of the additional channels by the encoding process; thus a loss of fidelity. You "may" get a multichannel signal, but it is not 1:1. There are plenty of papers written on the subject, since it's nearly a 50-year-old technology.


My laptop definitely supports up to 8 channels (7.1) of uncompressed PCM over HDMI. I've tested my receiver up to 5.1 (I don't have an 8-speaker setup) and it works.

[edit]

I have an Intel IGP; Intel has supported this since the G45 (Core 2 Duo era), AMD added it in the HD 4800 era, and nVidia in the GeForce 8300 era. Support for this is over a decade old at this point.

The "stereo only uncompressed" is a S/PDIF legacy, and while HDMI does support S/PDIF, it was already "the bad old way" when Bluray players came out (though it took a few years for discrete GPU makers to catch up).


Might I suggest reading up on how both DTS and Dolby encode audio then.


DTS and Dolby are unrelated to uncompressed LPCM. HDMI 1.0 supports up to 8 channels, up to 192kHz, and up to 24 bits of depth. In practice, my laptop to my receiver can definitely do 24-bit 48kHz PCM at 8 channels, and that's certainly "full fidelity" for most setups.
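
To put that ceiling in perspective, a quick illustrative calculation (the video figure assumes HDMI 1.0's 165 MHz single-link TMDS clock at 24 bpp; treat the numbers as rough):

    # Rough numbers (illustrative): the LPCM ceiling quoted for HDMI 1.0 audio
    # versus the video bandwidth the same link already carries.

    audio_bits_per_s = 8 * 192_000 * 24          # 8 channels, 192 kHz, 24-bit
    video_bits_per_s = 165_000_000 * 24          # 165 MHz pixel clock, 24 bpp

    print(f"Max LPCM audio: {audio_bits_per_s / 1e6:.1f} Mbit/s")    # ~36.9 Mbit/s
    print(f"Video payload:  {video_bits_per_s / 1e9:.2f} Gbit/s")    # ~3.96 Gbit/s
    print(f"Audio is ~{100 * audio_bits_per_s / video_bits_per_s:.1f}% of the video rate")

Even the worst-case uncompressed audio stream is around one percent of the video payload, which is why carrying it on the same cable was never a bandwidth problem.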


If you're sending audio and audio only then sure. Wouldn't be my solution from my professional opinion though.


Uncompressed multichannel LPCM works alongside video just fine. The only thing you have said in this thread that is true has been about eARC/ARC, yet you've doubled down on your statements even when I made it clear that I'm not using ARC.


My mistake was speaking with an audio consumer and not an engineer.


Do you not have a proper home A/V setup?

Almost everything I use for media sends audio and video to my receiver over HDMI; the exception is analog sources, which go into a mixer whose output is digitized and sent over CAT-6 cable to an optical converter sitting on the receiver.


I have been using DP+Audio for the last decade with a vast array of different monitors.


I have rather old NVIDIA GPUs and Dell monitors, and both the GPUs and the monitors support audio over DisplayPort.

I doubt that there exists any reasonably recent GPU that does not support audio over DisplayPort, so if there are problems they might be caused by the monitors. I have no idea how many monitors have speakers and DisplayPort without supporting audio over DP.


Unfortunately HDMI audio is also notorious for often having serious lag, both in absolute terms and relative to the video signal.


The issue is not with HDMI. The audio data is sent between each frame. The lag comes from the fact that TVs apply post-processing to video which causes it to lag relative to audio.
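
As a rough illustration of how little data that between-frame interleaving involves (assuming the standard CEA-861 1080p60 raster and an 8-channel, 48 kHz, 24-bit stream):

    # Illustration of "audio is sent between frames": how much audio has to ride
    # along with each video frame.

    fps = 60
    sample_rate = 48_000
    channels, bit_depth = 8, 24

    samples_per_frame = sample_rate / fps                        # 800 samples
    audio_bits_per_frame = samples_per_frame * channels * bit_depth
    print(f"{samples_per_frame:.0f} samples/frame, "
          f"{audio_bits_per_frame / 8 / 1024:.1f} KiB of audio per frame")

    # 1080p60 transmits a 2200 x 1125 raster for a 1920 x 1080 image, so roughly
    # 16% of every frame period is blanking -- plenty of room for ~19 KiB of audio.
    total, active = 2200 * 1125, 1920 * 1080
    print(f"Blanking share of the raster: {100 * (total - active) / total:.1f}%")
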


Are you sure that it's related to HDMI? TVs often do processing on the video signal which introduces lag, but my Samsung TV in "PC Mode" is virtually lag-free (and I use Logitech's receiver for my mouse because Bluetooth feels slightly too laggy for me).


Seems like a bug on the TV's part, then. They should delay the audio to match the lag.


That's not HDMI causing that pain. In fact, HDMI includes specific functionality for lip sync correction and enabling devices to expose both audio and video latencies they add to the signal.


Unless the standard has been substantially relaxed since HDMI v1.3a, it sounds like a downstream issue.

Citing § 7.5 normative language:

> An HDMI Source shall be capable of transmitting audio and video data streams with no more than ±2 msec of audio delay relative to the video.


Sounds like you need to upgrade to gold plated HDMI.


Better get that Monster Cable!


Isn't the whole idea of routing the audio through the display that it can compensate for any lag due to processing of the video signal? I haven't really noticed any problems with HDMI, and if I did, I think I would blame the display.


No, the point of supporting audio and video on the same cable is for people plugging their DVD player into their TV.


All modern GPUs and monitors (that have speakers) support audio-over-DP. They have for over a decade.

It's widely implemented.


Are you thinking of DVI or something?

I've been using audio over DisplayPort for years and years and years without issue. Every device I've ever had that supports DisplayPort supports audio over that connection.


I have two different LG monitors that both support audio over the DP cable.


The audio receiver industry has settled on HDMI being the only way to get uncompressed surround sound (5.1+) from an external device, like a PC. I wish they would do that over USB, too. It sucks having to configure an external display (even if duplicated) just to get good sound.


> DP _theoretically_ supports audio too, but AFAIK that's not widely implemented

I've been using my monitor's speakers through DP since the GTX 1070 (2016). Currently an RTX 4070 Ti, and that works too. And it's a pretty mediocre AOC monitor I have.


I have a portable external monitor that is powered and gets video via a single USB C port which I assume is using DP. It also does audio over the cable.


For some reason, outside of video, HDMI knows how to shoot itself in the foot almost as well as the USB-IF does.

HDMI CEC? Has anyone actually got that working correctly?

Anyone remember Ethernet over HDMI? Apparently that was a thing. (Not to be confused with HDMI over Ethernet, which actually has some uses.)

HDMI for soundbars? We got ARC ports (Audio Return Channel). But it was buggy, didn't support lossless audio, needed new DRM, and was bad at maintaining lip sync, so they introduced eARC (Enhanced Audio Return Channel). eARC works by... scrapping Ethernet over HDMI and reusing those wires. Better get a new TV that supports it; there are no adapters (you downgrade to regular ARC if any part of the chain doesn't support eARC).


CEC "just works" for me and has for as long as I've owned a flat panel TV (maybe 12 years or so?). Back in the day (before I had an HDMI receiver), I could use the TV remote to control the bluray player. These days I have a receiver, and the volume control on pretty much any device will control the audio on the receiver.

ARC was a bit more spotty, but I moved somewhere with no live TV reception, so my only use-case for it went away (everything else routes audio through the receiver).


> HDMI CEC? Has anyone actually got that working correctly

I stay in lots of hotels and take my Roku stick. Most of the time the Roku remote can control the TV volume and power using CEC


HDMI-CEC is great for TVs, but when I searched recently I found that neither Nvidia nor AMD supports HDMI-CEC. This was disappointing to find out when I built an HTPC for my TV.


Both CEC and eARC have worked without issue for me, for many years.


For a brief moment, being able to daisy-chain displays was so cool. Now it feels like we've regressed to wrangling HDMI cables out of a USB-C hub.


I strongly disagree with the basic thesis of this article.

HDMI is better than DisplayPort. DVI is better than HDMI.

USB-C is a complete disaster (particularly when trying to tunnel displayport).

I have an ongoing hypothesis that the general robustness and pleasantness of use of a computer interface which will be implemented by multiple separate vendors is inversely proportional to its complexity. As the interface protocol gets more complicated, it inevitably winds up with progressively more terrible implementations.

As you add more vendors implementing the same protocol, or more features to the same protocol, the likelihood of two vendors or two implementations by the same vendor finding a corner case where they fail to interoperate properly approaches 1.

I use Dell laptops at work. Dell cannot get their own laptops to tunnel DisplayPort over USB-C reliably to their own Dell-branded docks.

Additionally, if you have an nvidia GPU, certain displayport errors will BSOD the entire computer. *Yes, plugging in a slightly misbehaving display can crash your computer entirely*.

Displayport is a nightmare from the consumer's perspective. Will two displayport devices properly negotiate <feature>? Who knows?


I really, really hope that my LG has a DisplayPort input. They do offer a monitor that is the same size and, I believe, the same panel, but the price increase does not really justify it. So sad that I have to put up with the hell of DP/USB-C to HDMI converters...


You hope? You’re not able to look at the specs before buying?


Is there an affordable 4-port DisplayPort 1.4 switch? How are they almost 10x more expensive than the HDMI variants? Paying 500€ for a KVM box is not justifiable; I only want to avoid physically re-plugging the cables.


Our workplace went DP on all laptops years ago but then didn't add adapters to the projectors or TVs, so everyone complained, and now the newest laptops are back to HDMI and USB.

Be smart, people!


So if HDMI is so restricted with NDAs, how is DisplayPort allowed to have a documented compatibility mode for it?


"We’ve all gotten used to HDMI." Well that's it then, I don't need another less common way to do the same thing. There's always that theory that low-end devices will opt for DP to avoid royalties, but the cheapest laptops and monitors tend to be HDMI-only if anything because DP is more of a special feature.

Similar story with H.264/5 vs VP8/9.


> There's always that theory that low-end devices will opt for DP to avoid royalties, but the cheapest laptops and monitors tend to be HDMI-only if anything because DP is more of a special feature.

Yeah, when I needed a new laptop dock, one of my annoyances was having to pay extra for a DP one because our lab's IT refuses to stock adapters or cables "because people keep using them instead of returning them" and they "spent extra" to get monitors which have a single DP port.


1440p 240Hz is currently where it tops out. To be fair, this is a fantastic format.


Even the latest consoles, like the Xbox Series X and PS5, have HDMI-only outputs!


In my experience HDMI works better. I have had Dell 4K monitors both at home and at work, on three different PCs, and all have the exact same issue: with DisplayPort I can't get more than 30Hz, and the mouse is so delayed that it is practically unusable. I almost sent the first monitor back, but when I decided to try the supposedly inferior HDMI I had no trouble driving it at 60Hz, and the mouse movement feels buttery smooth (no latency). It's not a case of a weak video card: my card at work easily drives two of these 4K screens at 60Hz over HDMI, but for some reason can't drive a single 4K monitor at more than 30Hz on DisplayPort. What's up with that?
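
One plausible explanation (an assumption on my part, not a diagnosis) is that the DP link is training at an older, slower rate, e.g. a cable or monitor setting stuck at DP 1.1 / HBR, in which case 4K60 simply doesn't fit and the source quietly falls back to 30Hz. A rough sketch of the math, assuming 4 lanes, 24 bpp, 8b/10b coding, and ~10% blanking overhead:

    # Why a DP link stuck at HBR caps a 4K monitor at 30 Hz (rough illustration).

    def dp_payload_gbps(gbps_per_lane, lanes=4, coding_efficiency=0.8):
        """Usable payload after 8b/10b coding (DP 1.x)."""
        return gbps_per_lane * lanes * coding_efficiency

    def video_gbps(width, height, hz, bpp=24, blanking_overhead=1.1):
        """Approximate stream rate; the ~10% blanking overhead is an assumption."""
        return width * height * hz * bpp * blanking_overhead / 1e9

    need_4k60 = video_gbps(3840, 2160, 60)
    need_4k30 = video_gbps(3840, 2160, 30)
    hbr  = dp_payload_gbps(2.7)   # DP 1.1-era link rate: 8.64 Gbit/s payload
    hbr2 = dp_payload_gbps(5.4)   # DP 1.2+ link rate: 17.28 Gbit/s payload

    print(f"4K60 needs ~{need_4k60:.1f} Gbit/s -> fits HBR? {need_4k60 < hbr}, HBR2? {need_4k60 < hbr2}")
    print(f"4K30 needs ~{need_4k30:.1f} Gbit/s -> fits HBR? {need_4k30 < hbr}")

Some Dell 4K monitors of that era reportedly shipped with DP 1.2 disabled in the OSD, which produces exactly this symptom; enabling it (and using a certified DP 1.2+ cable) may be worth a try.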


Maybe it is, but I also want audio, so the point is moot.


Audio is covered as well, as explained in the article; I recommend reading it.


My argument: DisplayPort over USB-C is a better interface for flexibility and utility.



