Incorrect. HDMI is a standard for transmitting uncompressed video, same as DVI. Raster goes in one end and out the other. AirPlay, on the other hand, encodes frames that need to be decompressed by a renderer.



Compressed or uncompressed doesn't matter. The point is that the display device is buffering frames and then rendering them later (ideally imperceptibly later).

AirPlay definitely adds more potential bottlenecks, but the principle remains the same.


>Compressed or uncompressed doesn't matter

It most definitely does. With DVI/HDMI, each frame sent is just that: a frame. At time X the screen should show Y. If a frame is damaged or lost, you have only lost the information from that one frame (or just part of the frame).

With compressed data, however, each frame is intertwined with its neighbors. If you lose a frame or it's damaged, you have now lost X frames until the dependent frames have passed.
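
To put numbers on it, here's a toy sketch in Python (the GOP size of 30 is an assumption for illustration, not any particular codec's setting):

    # Toy model: uncompressed video loses exactly the damaged frame,
    # while predictive compression corrupts everything up to the next keyframe.
    GOP_SIZE = 30  # assumption: one keyframe (I-frame) every 30 frames

    def frames_affected_uncompressed(lost_frame):
        # Each frame stands alone, so only the lost frame is damaged.
        return [lost_frame]

    def frames_affected_compressed(lost_frame, gop_size=GOP_SIZE):
        # P/B frames reference earlier frames, so damage propagates
        # until the next keyframe resets the decoder's state.
        next_keyframe = (lost_frame // gop_size + 1) * gop_size
        return list(range(lost_frame, next_keyframe))

    print(len(frames_affected_uncompressed(42)))  # 1 frame damaged
    print(len(frames_affected_compressed(42)))    # 18 frames damaged (42 through 59)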

I'm not sure if you've ever had the experience of a damaged HDMI cable, but you can see the exact pixels that are affected, and they change frame to frame as you twist the cable, making it worse or better.

Compressed video is like Netflix or a damaged AVI: when the corrupted data is hit, the entire stream goes wonky for a short while, then suddenly snaps back into clarity when a keyframe is reached.

Uncompressed is as near real-time as you can get; the video is passed straight through. With compression you have a buffer, a decoder, etc., and there is more delay and processing.
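
As a rough illustration of those extra stages (every number below is a made-up assumption, not a measurement of any real device):

    FRAME_TIME_MS = 1000 / 60  # one frame period at 60 fps, ~16.7 ms

    # Uncompressed (DVI/HDMI): pixels stream straight through the link,
    # so the worst case is roughly one scanout period.
    passthrough_ms = FRAME_TIME_MS

    # Compressed (AirPlay-style): every stage adds its own delay.
    encode_buffer_ms = FRAME_TIME_MS   # encoder waits for a frame's worth of data
    network_ms = 10                    # assumed wifi transit time, no congestion
    decode_ms = 8                      # assumed decode time on the receiver
    display_buffer_ms = FRAME_TIME_MS  # receiver buffers a frame before scanout

    compressed_ms = encode_buffer_ms + network_ms + decode_ms + display_buffer_ms
    print(f"passthrough ~{passthrough_ms:.0f} ms, compressed path ~{compressed_ms:.0f} ms")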


In practice no digital TV actually works like this. The screen isn't instantly updated the moment the pixel data arrives on the line (which is exactly how analog CRTs work, and why you have to adjust vertical hold, etc.).

You can still get lag on CRTs (phosphors don't light up instantly), but digital TV adds all kinds of opportunities for screwing up timing, and ironically it's the fancier TVs that do more processing (e.g. interpolating frames, decoding 3D, etc.).


But the buffering is not required because of the HDMI link; a display could update each pixel as it receives it over HDMI, just like CRTs did. LCDs buffer at least one frame for many reasons, but "because HDMI requires it" is not among them.
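
For a sense of scale: with standard 1080p60 timing (2200x1125 total pixels at a 148.5 MHz pixel clock) you can work out when any given pixel arrives on the link, and what a full-frame buffer adds on top. A quick sketch, ignoring blanking offsets:

    # 1080p60 (CEA-861): 2200 x 1125 total pixels, 148.5 MHz pixel clock
    PIXEL_CLOCK_HZ = 148_500_000
    TOTAL_W, TOTAL_H = 2200, 1125

    def pixel_arrival_ms(x, y):
        # Time into the frame at which pixel (x, y) arrives on the link.
        return (y * TOTAL_W + x) / PIXEL_CLOCK_HZ * 1000

    # A "race the beam" display (like a CRT) could light a pixel as it arrives:
    print(f"center pixel arrives ~{pixel_arrival_ms(960, 540):.1f} ms into the frame")

    # A display that buffers the whole frame first adds one frame time on top:
    frame_ms = TOTAL_W * TOTAL_H / PIXEL_CLOCK_HZ * 1000
    print(f"full-frame buffering adds ~{frame_ms:.1f} ms")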

And compressed vs. uncompressed does matter, especially when you're talking about video compression, because the encoder needs some minimum amount of data before it can even begin to compress, let alone start sending the compressed bitstream over a physical link. Then you add more latency with the decode step. Not to mention the complete lack of latency guarantees a wifi link has...
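
And on that wifi point: with no latency guarantees, the receiver has to size a jitter buffer for the worst delay it's willing to tolerate, which is yet more latency. A toy sketch of that trade-off (the delay distribution is made up):

    import random

    random.seed(1)
    # Assumed per-frame network delays (ms) on a busy wifi link.
    delays = [max(1, random.gauss(15, 10)) for _ in range(300)]

    def late_frames(delays, jitter_buffer_ms):
        # A frame misses its playout deadline if it arrives after the
        # delay the buffer was sized to absorb.
        return sum(1 for d in delays if d > jitter_buffer_ms)

    for buf in (10, 20, 40):
        print(f"{buf} ms buffer: {late_frames(delays, buf)} late frames of {len(delays)}")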


Not to mention having to go through your congested wifi network.



