True P2P streaming is more trouble than it's worth. Take it from a guy who streams petabytes a month and has looked into it more times than most.
P2P has 2 main advantages, both of which are voided by this implementation:
1: Better latency. If Joe talks directly to Sue, that is the fastest connection possible, as opposed to Joe -> Server -> Sue
2: The host saves on bandwidth costs
Now, point #1 is moot because this needs servers to work for live broadcasts with lots of viewers. Also, even if servers were not in the mix, you are still going to need your data relayed several times depending on how far from the source you are. You cannot hype away the fact that if a person is streaming a 1080p video at 1 Mb/s and the average peer has less than 5 Mb/s outbound, the delay in getting that video to the thousandth viewer is going to be exponential unless you add in servers anyway.
Point #2 is mostly moot for 2 reasons. The first reason is that bandwidth is dirt cheap now. You can get an unmetered 1 Gb/s line for under $200/month.
The second reason is that when you go full P2P, you lose out on a lot of value adds that people like. Things like transcoding for different sizes / mobile, as well as DVR-style seeking/recording.
Another reason, which I am sure will be hotly contested by the developers, is that the quality will be more unreliable than a direct server. Last-mile peers are the worst for sending data. You typically want to be in a hub for direct peering to the various end points (comcast/tw/etc.).
In most cases, it's unfortunately true that P2P causes exponentially more headaches than it solves.
Now, that is not to say that multicast does not have its place in the world. It's just more of a feature to complement traditional relay streaming than a product on its own.
Please read the PPSP protocol more carefully; your statements are invalid.
#1: if Joe is streaming to Sue, there's no middle server here. PPSP doesn't mandate a "central server", and I hope I can make this use case clear on the swirl website too.
I'd love for us to be able to stream from our devices to friends etc. and not need to store everything on youtube or dropbox or some other indexed/sliced/diced/resold-for-profit system. With PPSP, however, you could have your home network be a peer, and pull down the content safe and sound with minimal impact to your phone. This type of sharing is something I would love to roll out for journalists in the field and protesters in Venezuela and Ukraine, where media suppression by the government and confiscation of phones is a sad reality.
#2: Spoken like a true American ;-). This is not the reality for 99% of internet users. I'm unable to get more than shoddy ADSL, even in a major European city. What about India, China, Africa, South America? Places where the monthly income is not even close to $200. Hopefully that will change over time, but it's not practical for most people today.
I concur that live P2P streaming is tricky, and I'm sure you have plenty of practical experience to back that up.
PPSP allows DVR-style seek/record; the way the protocol is laid out makes that almost trivial. Now, if your device at home doesn't have that content, you'll need to retrieve it from a peer, and I'd like to see that peer be close to you.
Today, with BitTorrent & friends, there's no standard implementation to refer to, so there's no way for an ISP to put in an intermediary cache to support the functionality you want (assuming you're paying a premium for that content), which is what is done with HTTP & friends on a regular basis.
By having a standard protocol in place, multiple implementations & vendors can share the same caching peer technology within a single telco's network. Those peers can be located at edges within the exchange, similar to how major CDNs and google etc do today.
Finally, PPSP is more than just streaming, it's a completely new transport layer protocol for the internet -- like HTTP, we're only just beginning to explore the use cases.
I feel like you are avoiding the glaring problems with P2P streaming for live video.
With that, for your first point, let me give you a scenario; please explain:
Joe is on his laptop and has a 5 Mb/s upstream connection. He wants to stream a 1080p video at 1 Mb/s to 10,000 people.
In the case above, without a relay server (as you suggest),
How delayed will the data be for the 10,000th viewer, assuming every peer has the same connection profile as Joe?
Hint: it's going to be minutes, not seconds.
Second question: What happens when several peers in that chain disconnect at once?
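For what it's worth, the arithmetic in that scenario can be sketched as a toy model (this is not PPSP behaviour: it assumes an ideal fan-out tree, uniform uplinks, no churn, and an invented per-hop buffering delay):

```python
def tree_depth(viewers: int, fanout: int) -> int:
    """Depth of an ideal fan-out tree needed to cover `viewers` peers;
    level 0 is the broadcaster, and each peer re-uploads to `fanout` others."""
    covered, depth = 0, 0
    while covered < viewers:
        depth += 1
        covered += fanout ** depth
    return depth

# Numbers from the scenario above: Joe streams 1 Mb/s video and each
# peer has 5 Mb/s upstream, so each peer can feed at most 5 others.
fanout = 5                # 5 Mb/s uplink / 1 Mb/s stream
hops = tree_depth(10_000, fanout)
per_hop_buffer_s = 2.0    # assumed buffering/relay delay per hop (a guess)
print(hops, hops * per_hop_buffer_s)
```

In this idealised tree the hop count grows only logarithmically with the audience size, so whether the farthest viewer is seconds or minutes behind depends almost entirely on the per-hop buffering assumption and on churn, which this sketch deliberately ignores.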
Now, as far as your second comment, that feels out of place. Bandwidth is bandwidth. If you are on a 100 kb/s line in the middle of Africa, your preference is going to be a user-to-server-to-user streaming environment, since it will be reliable and will not require other users, who also have little bandwidth available to them, to stream as well. In fact, using a P2P system will cost them more money, not less. It's going to be inconsistent, unreliable and slower than p2s2p streaming.
good points -- but they're not the scenarios I'm referring to; they're extremes around a system that is perfectly usable for 90% of use cases & users.
If Joe wants to stream to Barney and a handful of people, then his laptop is fine. If he wants to push 10k, then obviously he'll need some help; the maths doesn't stack up any other way. Your original example (and my response to it) mentioned a pair of users with a need for low latency. 10k is obviously going to be different, and we both know that. Let's not be disingenuous here.
PPSP-TP (the tracker protocol) doesn't force Joe's laptop to be the seeder for the entire 10k; it can be managed as discrete swarms, and the tracker protocol can segment or insulate the seeding peer from a larger community if required. But as you rightly point out, in this case help is needed. There's a limit on how much can be streamed based simply on packet size & transfer. Even though PPSP is much more efficient than existing protocols due to the way it handles hash management (Merkle trees and munro hashes), bandwidth is bandwidth (latency & asymmetry aside).
Again, PPSP is a transport protocol, not just a live video stream. There are applications even in areas where bandwidth is 100 kb/s or worse.
If you're really interested in this, please read up on the protocol and contribute some much-needed real world feedback to its development. The next IETF session is, as always, publicly accessible, https://datatracker.ietf.org/meeting/89/agenda/ppsp/ next Tuesday 16h10 GMT. It would be great to have you join us!
PPSP as such was designed to have shorter init cycles.
So, in principle, it is suitable for smaller assets (federated CDN, that sort of stuff).
By the way, I do not understand the "servers vs P2P" story. Why not use servers in a PPSP network?
The point is pulling the content from arbitrary sources (any sources available), not to get rid of datacenters.
For #2 I'm not sure you understood DanBlake's point. Datacenter bandwidth is cheap and reliable while last-mile bandwidth is expensive and unreliable; that's why centralized services almost completely replaced P2P over the last decade. I'm willing to use a slightly more expensive system in exchange for autonomy and privacy, but that argument hasn't been well articulated here.
> Things like transcoding for different sizes / mobile as well as DVR style seeking/recording.
Ogg Vorbis was designed with bitrate shedding in mind (drop bits from packets to get a lower bitrate), though I'm not aware of it being used. Are there any bitrate shedding video codecs? If not, one could be developed that allows low-bandwidth clients to request only a percentage of the blocks of a stream and decode a low-resolution/low-quality image therefrom.
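To make the client side of that idea concrete, here is a purely hypothetical sketch of what a low-bandwidth peer might request (the chunk-indexing scheme and the premise that an evenly spaced subset of chunks is decodable are inventions for illustration; no existing video codec works this way):

```python
def base_layer_chunks(total_chunks: int, fraction: float) -> list[int]:
    """Hypothetical: pick an evenly spaced subset of chunk indices for a
    low-bandwidth client, assuming the codec can decode a lower-quality
    image from partial data (an invented premise, per the text above)."""
    step = max(1, round(1 / fraction))
    return list(range(0, total_chunks, step))

# A client on a constrained link requests only a quarter of the chunks:
print(base_layer_chunks(20, 0.25))
```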
Hey, I'm the lead developer of the swirl project. I'd like to add that I wouldn't be working on this today without the support of NLNet, who have provided a grant that allows me to work on swirl full time atm: http://nlnet.nl/project/swirl/
Huge props.
With VLC, we've looked at various P2P video streaming technologies, like Tribler, Goalbit or Peercast. But so far, nothing actually worked correctly...
If something standard (or close enough) actually emerges, and is not insane to integrate, this would change a lot of things :)
It's a VLC branch adding BitTorrent support, which is a pretty slick addition. I've not been able to find their source code (the wiki is down), but it looks decent.
This is exciting -- the only similar thing I'm aware of is BitTorrent Live. Seems like the same idea, except I'm pretty sure they have no intention of making BitTorrent Live open source. (The protocol is patented[0], even.) Seems like they've stopped developing it, too.
PPSP is a transport protocol — it transfers a stream of opaque binary data from one location to another. It is unique amongst transport protocols as it is a many-to-many transfer protocol, that is, there is no single master server or endpoint that manages the data transfer.
A swarm is a set of peers that are sharing (receiving and/or transferring) the same data, as a set of small chunks, which is identified by a unique cryptographic recursive hash of the data, called the Root Hash.
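For illustration, a root hash like this can be sketched in a few lines of Python (a simplified toy: the real PPSPP spec uses fixed-size chunks, pads the tree to a power of two, and binds the chosen hash function into the swarm identifier, none of which is modelled here):

```python
import hashlib

def merkle_root(chunks: list[bytes]) -> bytes:
    """Toy Merkle root: hash each chunk, then pairwise-hash up the tree.
    An odd node at any level is carried up unchanged (a simplification)."""
    level = [hashlib.sha1(c).digest() for c in chunks]
    while len(level) > 1:
        nxt = [hashlib.sha1(level[i] + level[i + 1]).digest()
               for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:          # odd node promoted as-is
            nxt.append(level[-1])
        level = nxt
    return level[0]

# Any peer holding the same chunks derives the same root, so the root
# hash can act as a self-certifying identifier for the swarm's content.
data = [b"chunk-0", b"chunk-1", b"chunk-2", b"chunk-3"]
print(merkle_root(data).hex())
```

Changing any single chunk changes the root, which is what lets peers verify received data against the swarm identifier alone.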
This is great, and I'd love to use it. What's the connection (if any) with WebRTC? And PPSPP is a pretty unfortunate choice of acronym... A Google search only unearths something about emulating a PlayStation Portable.
We're going to be discussing WebRTC & PPSP together at the IETF meetup next week in London. I see WebRTC as a browser technology atm, so effectively a potential PPSP client and seeder. PPSP's aimed squarely at one-to-many sharing, so maybe there's some cross-over for video conferencing, but I'm not familiar enough with it to say for the moment.
This is neat, but I think if peercasting software that is "good enough" can be built on top of WebRTC data connections (mandatory encryption and forced TCP are at least two drawbacks), it will win in terms of public adoption even if it is inferior in other technical aspects, simply because people will end up using it without knowing it.
That doesn't include discovery or tracking; those would have to be developed on top. It also necessitates a pretty much full implementation of WebRTC client signaling, which is not trivial.
BTW, lately they seem to be switching to SCTP for data streaming. Encryption is still mandatory.
Hi, I'm the author of swirl, such as it is. In principle there's nothing stopping you from streaming anything with PPSPP. The current erlang implementation could run on an android tablet for example. And if you had a pcell nearby http://venturebeat.com/2014/02/20/steve-perlman-pcell-is-rea... then b/w really wouldn't be a constraint.
There's a fair bit of work remaining, in particular browser/client support is something I've not even considered, it's not my speciality.
I'm sure VLC and Firefox etc would be awesome clients, but one step at a time.
Please please please let this lead to a P2P version of Twitch.tv.
It would be amazing if gaming broadcasters could stream directly to their subscribers/followers, with very little latency (not the 30 second video lag Twitch adds). Broadcaster monetizing could be tricky, but definitely doable.
The 30 seconds is not because Twitch sucks; they do encoding/decoding/sampling.
And honestly, if you were streaming a 1080p video over P2P, I see no physical way it would be below a 30s delay if there were 1k+ people watching it.
You are limited by peers' outbound bandwidth, and for most in the US that is going to be sub-5 Mb/s. With that in mind, you can really only have a maximum of 5 streams per user. That means to get to the 1000th viewer, it's going to be a daisy chain of 50+ users in front of you before you get the data.
P2P does not and cannot work for distributed video until home connections catch up (unless you are OK with insane delays).
So, let's assume every viewer can broadcast to three others: the first reaches 3, those reach 9, etc. You'd be "behind" 7 others for "up to" 2187 viewers. This is assuming no "super hubs" in between. Or am I missing something? Even if every peer on average is only able to reach two "new" peers, that's still just 10 hops for 1024 users...?
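A quick sanity check of that arithmetic (assuming a perfect fan-out tree with no churn and no "super hubs"):

```python
def at_depth(fanout: int, hops: int) -> int:
    """Peers reachable in exactly `hops` hops from the broadcaster."""
    return fanout ** hops

def within_depth(fanout: int, hops: int) -> int:
    """All peers within `hops` hops: fanout + fanout^2 + ... + fanout^hops."""
    return (fanout ** (hops + 1) - fanout) // (fanout - 1)

print(at_depth(3, 7))      # viewers joining at the 7th hop with fan-out 3
print(at_depth(2, 10))     # users at the 10th hop with fan-out 2
print(within_depth(3, 7))  # everyone within 7 hops at fan-out 3
```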
TV usually has about a 5-6s delay for live events; add a second or two for HD content. A 1-minute delay might be due to legal issues (where broadcasters are bound by law to add a delay) or some very convoluted video signal distribution.
Delay matters most in the case of live betting and live scores. There is a whole industry around this; I am sure you've heard of bwin.com or bet365.com.
Streaming should be free, as should the protocol it's developed on top of, so anybody can use it, and trust it.
Revenue streams come from supporting the back-end telcos (billing, active traffic mgmt & shaping), and content generators (disney/pixar etc), and a couple of other things I'm not ready to talk about just yet ;-).
Well, if you had a website that cataloged and displayed the streams, much like Twitch.tv, then people could still find streams and subscribe to them in much the same way they do over at Twitch, while the site gets a cut.
Additionally, broadcasters can still do Paypal donation links, like they do over at Twitch.
As for advertisements playing before videos, I'm not sure how that could be done, but it could probably be an overlay before the stream starts, or just an HTML5 video that loads first and must finish before the P2P stream begins.
In this case, this is like asking how you would monetize TCP. This specification is a transport protocol and could apply to multiple use cases, not just video.
We've built a fully functional peering solution at Kontiki and have millions of users (all on corporate networks). If you're interested in that sort of stuff come join us - we're hiring.
Your tech sounds cool but I don't think it's open source ;-). If you are at IETF89 in London next week, I'd love to talk P2P streaming. I'm @dch__ on twitter.