
It's ironic that "live streaming" has gotten worse since it was invented in the 1930s. Up until TV went digital, the delay on analog TV was just the speed-of-light transmission time plus a little bit for broadcasting equipment. It was so small it was imperceptible. If you had a portable TV at the live event, you just heard a slight echo.

Now the best we can do is over 1 second, and closer to 3 seconds for something like satellite TV, where everything is under the broadcaster's control from end to end.

I suppose this is the tradeoff we make for using more generalized equipment that has much broader worldwide access than analog TV.




Yes, and it's driven by consumers.

Unless your content operates in a very small niche, "real time" is far less important than continuity.

In rough order of preference for the consumer:

1) It starts fast
2) It never stops playing
3) It looks colourful
4) Good-quality sound
5) Good-quality picture
10) Latency

One of the main reasons why "live" broadcast over digital TV has a stock latency of >1 second is FEC (forward error correction). This allows a continuous, high-quality stream over a noisy transport mechanism. (Yes, there are also the local operating rules for indecent behaviour, and switching and effects delays, which account for ~10 seconds and >250 ms respectively.)
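
To make the FEC idea concrete, here's a toy erasure-code sketch in Python (a single XOR parity packet per block; real DVB uses Reed-Solomon/BCH/LDPC, which are far stronger, so this is illustrative only). The latency cost is visible in the structure: you have to buffer a whole block before a lost packet can be rebuilt.

    # Toy forward error correction: one XOR parity packet per block.
    # Real broadcast FEC is much stronger, but the latency cost is the
    # same in kind: you buffer a whole block in order to repair it.

    def add_parity(block):
        """Append an XOR parity packet to a block of equal-length packets."""
        parity = bytes(len(block[0]))
        for pkt in block:
            parity = bytes(a ^ b for a, b in zip(parity, pkt))
        return block + [parity]

    def recover(coded, lost_index):
        """Rebuild the single packet at lost_index by XORing the survivors."""
        survivors = [p for i, p in enumerate(coded) if i != lost_index]
        out = bytes(len(survivors[0]))
        for pkt in survivors:
            out = bytes(a ^ b for a, b in zip(out, pkt))
        return out

    block = [b"pktA", b"pktB", b"pktC"]
    coded = add_parity(block)
    assert recover(coded, 1) == b"pktB"  # packet 1 lost in transit, rebuilt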

For IPTV it's buffering. A stuttering stream will cause your consumers to switch off/go elsewhere. One of the reasons RealPlayer held on for so long was that it was the only system that could dynamically switch bitrates seamlessly and reliably.

There is a reason why Netflix et al. start off with a low-quality stream and then switch to HD 30 seconds in: people want to watch it now, with no interruptions. They have millions of data points to back that up.
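
As a sketch of that behaviour (hypothetical bitrate ladder and numbers, not any real service's algorithm): start at the bottom rung so playback begins immediately, then climb as the throughput estimate firms up.

    # Toy adaptive-bitrate choice: illustrative ladder, not Netflix's actual one.
    LADDER_KBPS = [235, 560, 1750, 4300, 5800]  # lowest rung starts instantly

    def pick_bitrate(measured_throughput_kbps, safety=0.8):
        """Highest rung that fits under a safety fraction of measured throughput."""
        budget = measured_throughput_kbps * safety
        fitting = [r for r in LADDER_KBPS if r <= budget]
        return fitting[-1] if fitting else LADDER_KBPS[0]

    # First chunk: no estimate yet, so start low; later chunks climb the ladder.
    print(pick_bitrate(300))    # 235  -> fast start
    print(pick_bitrate(8000))   # 5800 -> HD once the pipe looks wide enough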


Google seems to think they can implement video gaming over IP. And they probably can, my ping to them is only 9ms, less than a frame.

There is just a broad lack of interest in reducing latency past a certain point unless there is a business reason for it. People don't notice 1 second of latency.


Why didn't they use that capability for voice/video communication? Are games a better business?

I remember Hangouts being better than Skype, but that's not high praise. Every calling service I've used has been laggy and disconnects often.


They do; voice and video calls are intolerable at 1 second of latency.


> Google seems to think they can implement video gaming over IP.

No, they didn't. Early attempts at streaming video games were unplayable even with a server in another room or a direct DC connection.

And they want it to work over an average internet connection in America.

No, they definitely did not solve the issue.


And yet, I was able to game competitively from my apartment in Brisbane, using a server in Sydney, via Parsec, usually coming in at less than a frame of latency, sometimes just over a frame. This was two years ago, too. And Australia isn't known for its amazing internet connections (though mine was better than most).


Just because one group was incompetent doesn't mean another will be.

It has been possible for years to get a total encode+decode latency of less than one frame with x264.

Meanwhile many people are gaming on TVs that impose 3-8 frames of processing lag.

And you can beat most current tech by more than half a frame just by supporting HDMI 2.1 or variable refresh rate. (Instead of taking 1/60 of a second to send a frame, you send it as fast as the cable can support, which is 5-12x faster)
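
Back-of-envelope numbers for that claim (assuming an uncompressed 1080p frame and an illustrative usable link rate; the exact ratio depends on resolution and cable mode):

    # Time to deliver one frame: paced scanout vs. bursting at link speed.
    frame_bits = 1920 * 1080 * 24           # one uncompressed 1080p frame
    scanout_ms = 1000 / 60                  # paced to 60 Hz refresh: ~16.7 ms
    burst_ms   = frame_bits / 24e9 * 1000   # e.g. 24 Gbit/s usable: ~2.1 ms
    print(scanout_ms / burst_ms)            # ~8x faster, in the parent's range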


I played over 20 hours of Assassin's Creed through Chrome during the Stadia beta and I couldn't notice any latency. While it might not work for games like CS:GO, for AR, or on bad networks, they 100% have a working product today for Assassin's Creed.


> less than a frame

My 144 Hz screen would disagree.
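
The disagreement is just frame-time arithmetic, applied to the 9 ms ping upthread:

    # One frame lasts 1000/refresh_rate milliseconds.
    for hz in (60, 144):
        print(hz, 1000 / hz)   # 60 -> ~16.7 ms; 144 -> ~6.9 ms
    # A 9 ms ping is under a frame at 60 Hz, but more than a frame at 144 Hz.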


It's not surprising if you think about how our ability to store video has changed over the years. The delay on analog TV is so low because the picture data had to go straight from the camera to the screen with basically no buffering since it was infeasible to store that much data. (PAL televisions buffered the previous scanline in an analog delay line for colour decoding purposes, but that was pretty much cutting edge at the time.) Now that we can buffer multiple frames cheaply, that makes it feasible to compress video and transmit it without the kind of dedicated, high-bandwidth, low-latency links required in the analog days. Which in turn makes it possible to choose from more than a handful of channels.


We also lost the ability to broadcast things like rain, snow, fireworks, dark scenes and confetti with the loss of analog.


I missed something. What do you mean?


Compression artifacts ruin scenes with those things in them. Analog isn’t compressed so it has no artifacts.


I see.

That seems to be mostly solved with high-speed links and better encoding technology.


No, it got worse. Try H.265: compression artifacts are pretty bad in certain scenarios, even at high bitrates. Same with H.264, though there it can be solved with a high bitrate, but then your file size also gets much, much bigger, which means you will need very low-latency, high-speed internet.
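
Rough numbers for the file-size point (illustrative bitrates, not measurements):

    # Stream size scales linearly with bitrate: size = bitrate * duration.
    hours = 2
    for mbps in (5, 20):                             # modest vs. high-bitrate H.264
        gigabytes = mbps * 1e6 * hours * 3600 / 8 / 1e9
        print(mbps, "Mbit/s ->", gigabytes, "GB")    # 4.5 GB vs. 18 GB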

I think YouTube is the only streaming service that does it well, without issues for end users anywhere in the world, mostly because of their free peering program, which is extremely widespread. https://peering.google.com/#/


He meant better encoder technology at the same bitrate.


H.265 is the best encoding/compression tech in the world right now, unless I missed something.


How you encode something goes beyond the standard. You can encode the same source at the same bitrate with the same standard in different ways.

For example, there were noticeable quality differences between MP3 encoders.


I’m having trouble finding good examples of confetti or fireworks on Netflix. I’ve noticed fewer problems as time goes on, but anecdata and all that.

Different encoders and standards will have different problematic scenarios. It seems like the number of problematic scenarios is decreasing.

I can encode DVD video (MPEG-2) as H.264 at a huge bitrate decrease with no apparent quality loss.

Certainly streaming real time is harder with digital formats, but it’s generally good enough.


You can't find good examples because people actively avoid recording and uploading videos containing things that are hard to encode.


I find this hard to believe. Netflix is going to avoid certain titles because they aren’t going to look perfect?


No one's producing titles that can't look good on most platforms.


Some delay from many producers is almost certainly intentional. Live content providers want a second of margin to cut a stream if something unexpected (profanity, nudity, injury...) occurs on set.


Analog TV is also massively less spectrum-efficient. You can fit 4+ digital channels in the same spectrum as one analog TV channel.
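
Ballpark arithmetic behind the "4+ channels" figure (illustrative DVB-T numbers):

    # Illustrative: one 8 MHz slot carries one analog channel, or a DVB-T mux.
    mux_capacity_mbps = 24          # typical DVB-T multiplex payload
    sd_channel_mbps   = 4           # typical MPEG-2 SD service
    print(mux_capacity_mbps // sd_channel_mbps)   # ~6 digital channels per slot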

And don't forget how low and inconsistent the quality of analog TV was compared to what we can broadcast digitally.

The real story here is that latency isn't actually important to live TV, so it's a no-brainer trade-off to make. If you look at other transmission technologies where latency is more important, like cellular data transmission, latency has only decreased over the years.


> Now the best we can do is over 1 second

Mixer can do about 0.2 seconds.



