True P2P streaming is more trouble than it's worth. Take it from someone who streams petabytes a month and has looked into it more times than most.
P2P has 2 main advantages, both of which are voided by this implementation:
1: Better latency. If Joe talks directly to Sue, that is the fastest connection possible, as opposed to Joe -> Server -> Sue
2: The host saves on bandwidth costs
Now, point #1 is moot because this needs servers to work for live broadcasts with lots of viewers. Also, even if servers were not in the mix, your data is still going to be relayed several times depending on how far from the source you are. You cannot hype away the fact that if a person is streaming a 1080p video at 1 Mb/s and the average peer has less than 5 Mb/s outbound, the delay in getting that video to the thousandth viewer is going to grow with every relay hop unless you add in servers anyway.
Point #2 is mostly moot for two reasons. The first is that bandwidth is dirt cheap now: you can get an unmetered 1 Gb/s line for under $200/month.
The second is that when you go full P2P, you lose a lot of the value-adds people like, such as transcoding for different sizes/mobile and DVR-style seeking/recording.
Another reason, which I am sure will be hotly contested by the developers, is that the quality will be less reliable than a direct server. Last-mile peers are the worst for sending data. You typically want to be in a hub with direct peering to the various endpoints (Comcast/TW/etc.).
In most cases, it's unfortunately true that P2P causes far more headaches than it solves.
Now, that is not to say that multicast does not have its place in the world. It's just more of a feature that complements traditional relay streaming than a product on its own.
Please read the PPSP protocol more carefully; your statements are invalid.
#1: If Joe is streaming to Sue, there's no middle server here. PPSP doesn't mandate a "central server", and I hope I can make this use case clear on the swirl website too.
I'd love us to be able to stream from our devices to friends and not need to store everything on YouTube or Dropbox or some other indexed/sliced/diced/resold-for-profit system. With PPSP, however, you could have your home network be a peer and pull down the content safe and sound with minimal impact to your phone. This type of sharing is something I would love to roll out for journalists in the field, and for protesters in Venezuela and Ukraine, where media suppression by the government and confiscation of phones is a sad reality.
#2: Spoken like a true American ;-). This is not the reality for 99% of internet users. I'm unable to get more than shoddy ADSL, even in a major European city. What about India, China, Africa, South America? Places where the monthly income is not even close to $200. Hopefully that will change over time, but it's not practical for most people today.
I concur that live P2P streaming is tricky, and I'm sure you have plenty of practical experience to back that up.
PPSP allows DVR-style seek/record; the way the protocol is laid out makes that almost trivial. If your device at home doesn't have that content, you'll need to retrieve it from a peer, and I'd like to see that peer be close to you.
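For illustration, here's a hypothetical helper (my own sketch, not anything from the PPSP drafts; the constants and function name are made up) showing why seek/record falls out almost for free once content is addressed as fixed-size chunks: a seek is just a chunk-range request to whichever peers hold those chunks.

```python
# Hypothetical sketch: mapping a DVR-style seek to a chunk range.
# CHUNK_BYTES and AVG_BITRATE_BPS are assumed values, not PPSP constants.
CHUNK_BYTES = 1024 * 1024        # assume 1 MiB chunks
AVG_BITRATE_BPS = 1_000_000      # assume a 1 Mb/s stream

def seek_to_chunk_range(start_s: float, duration_s: float):
    """Map a playback window in seconds to an inclusive chunk index range."""
    bytes_per_second = AVG_BITRATE_BPS / 8
    first = int(start_s * bytes_per_second) // CHUNK_BYTES
    last = int((start_s + duration_s) * bytes_per_second) // CHUNK_BYTES
    return first, last

# Seek to 10 minutes in, play 10 seconds: ask peers for chunks 71..72.
print(seek_to_chunk_range(600.0, 10.0))  # → (71, 72)
```

The client never needs the whole file locally; it just asks the swarm for the range, which is why recording and seeking are cheap at the protocol level.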
Today, with BitTorrent and friends, there's no standard implementation to refer to, so there's no way for an ISP to put in an intermediary cache to support the functionality you want (assuming you're paying a premium for that content), which is exactly what is done with HTTP and friends on a regular basis.
By having a standard protocol in place, multiple implementations and vendors can share the same caching-peer technology within a single telco's network. Those peers can be located at edges within the exchange, similar to what the major CDNs and Google do today.
Finally, PPSP is more than just streaming; it's a completely new transport-layer protocol for the internet. As with HTTP, we're only just beginning to explore the use cases.
I feel like you are avoiding the glaring problems with P2P streaming for live video.
With that said, for your first point, let me give you a scenario; please explain it:
Joe is on his laptop with a 5 Mb/s upstream connection. He wants to stream a 1080p video at 1 Mb/s to 10,000 people.
In the scenario above, without a relay server (as you suggest):
How delayed will the data be for the 10,000th viewer, assuming every peer has the same connection profile as Joe?
Hint: it's going to be minutes, not seconds.
Second question: What happens when several peers in that chain disconnect at once?
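For reference, here's a back-of-envelope sketch of that scenario. The bitrates and viewer count are from the comment above; the 2-second chunk size and the perfect relay tree are my assumptions.

```python
# Back-of-envelope model of the relay-tree delay in Joe's scenario.
# Assumptions (mine, not from the thread): 2 s chunks, a perfect fanout
# tree, and every peer dedicating its full upstream to relaying.
stream_mbps = 1.0      # 1080p stream bitrate (from the comment)
upstream_mbps = 5.0    # each peer's outbound capacity (from the comment)
viewers = 10_000
chunk_seconds = 2.0    # assumed buffering granularity per relay hop

# Each peer can feed this many downstream peers at full stream rate.
fanout = int(upstream_mbps // stream_mbps)  # 5

# How many relay layers until 10,000 viewers are reached?
depth, reached = 0, 0
while reached < viewers:
    depth += 1
    reached += fanout ** depth

# A hop must receive a full chunk before forwarding it, so the deepest
# layer waits at least one chunk duration per layer.
best_case_delay = depth * chunk_seconds
print(depth, best_case_delay)  # → 6 12.0
```

Even this idealized tree puts the deepest viewers about 12 seconds behind; whether that balloons into minutes depends on churn, jitter buffers, asymmetric last-mile links, and retransmissions, all of which only push the number up from this best case.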
Now, as far as your second comment, that feels out of place. Bandwidth is bandwidth. If you are on a 100 Kb/s line in the middle of Africa, your preference is going to be a user-to-server-to-user streaming environment, since it will be reliable and won't require other users with little bandwidth available to also stream. In fact, using a P2P system will cost them more money, not less. It's going to be inconsistent, unreliable, and slower than p2s2p streaming.
Good points, but they're not the scenarios I'm referring to, and they're extremes around a protocol that's perfectly usable for 90% of use cases and users.
If Joe wants to stream to Barney and a handful of people, then his laptop is fine. If he wants to push to 10k, then obviously he'll need some help; the maths doesn't stack up any other way. Your original example (and my response to it) mentioned a pair of users with a need for low latency. 10k is obviously going to be different, and we both know that. Let's not be disingenuous here.
PPSP-TP (the tracker protocol) doesn't force Joe's laptop to be the seeder for the entire 10k; the audience can be managed as discrete swarms, and the tracker protocol can segment or insulate the seeding peer from a larger community if required. But as you rightly point out, in this case help is needed. There's a limit on how much can be streamed based simply on packet size and transfer rates. Even though PPSP is much more efficient than existing protocols, thanks to the way it handles hash management (Merkle trees and munro hashes), bandwidth is bandwidth (latency and asymmetric links aside).
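To make the hash-management point concrete, here's a minimal sketch of Merkle-tree chunk verification. This is plain Python using SHA-1 for brevity; it is not the PPSPP wire format, the helper names are mine, and it assumes a power-of-two chunk count. The idea is that a peer can verify any single chunk against a trusted root hash using only the sibling ("uncle") hashes along its path, without downloading the rest of the stream.

```python
# Illustrative Merkle-tree verification (not the PPSPP wire encoding).
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha1(data).digest()

def build_tree(chunks):
    """Return the tree as a list of levels; level 0 holds the leaf hashes."""
    level = [h(c) for c in chunks]
    levels = [level]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def proof(levels, index):
    """Collect the sibling ("uncle") hashes needed to verify one chunk."""
    path = []
    for level in levels[:-1]:
        sibling = index ^ 1
        path.append((sibling < index, level[sibling]))
        index //= 2
    return path

def verify(chunk, path, root):
    """Recompute the root from one chunk plus its uncle hashes."""
    digest = h(chunk)
    for sibling_is_left, sibling in path:
        digest = h(sibling + digest) if sibling_is_left else h(digest + sibling)
    return digest == root

chunks = [bytes([i]) * 1024 for i in range(8)]   # eight dummy 1 KiB chunks
levels = build_tree(chunks)
root = levels[-1][0]                             # trusted root hash
p = proof(levels, 5)
print(verify(chunks[5], p, root))    # → True
print(verify(b"tampered", p, root))  # → False
```

This is why the protocol can verify data arriving from arbitrary, untrusted peers: the proof for one chunk is only log2(n) hashes, not the whole file.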
Again, PPSP is a transport protocol, not just a live video stream. There are applications even in areas where bandwidth is 100 Kb/s or worse.
If you're really interested in this, please read up on the protocol and contribute some much-needed real-world feedback to its development. The next IETF session is, as always, publicly accessible (https://datatracker.ietf.org/meeting/89/agenda/ppsp/), next Tuesday at 16:10 GMT. It would be great to have you join us!
PPSP as such was designed to have shorter init cycles.
So, in principle, it is suitable for smaller assets (federated CDN, that sort of stuff).
By the way I do not understand the "servers vs P2P" story. Why not use servers in a PPSP network?
The point is pulling the content from arbitrary sources (any sources available), not to get rid of datacenters.
For #2 I'm not sure you understood DanBlake's point. Datacenter bandwidth is cheap and reliable while last-mile bandwidth is expensive and unreliable; that's why centralized services almost completely replaced P2P over the last decade. I'm willing to use a slightly more expensive system in exchange for autonomy and privacy, but that argument hasn't been well articulated here.
Things like transcoding for different sizes / mobile as well as DVR style seeking/recording.
Ogg Vorbis was designed with bitrate shedding in mind (drop bits from packets to get a lower bitrate), though I'm not aware of it being used. Are there any bitrate-shedding video codecs? If not, one could be developed that allows low-bandwidth clients to request only a fraction of the blocks of a stream and decode a lower-resolution, lower-quality image from them.
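To sketch the idea (this is entirely hypothetical: made-up layer bitrates and function names, not any real codec or part of PPSP), a layered stream could let a client greedily pick only the blocks its link can afford:

```python
# Toy model of bitrate shedding via layered blocks (hypothetical numbers).
def layers_to_fetch(link_kbps, layer_kbps=(300, 400, 500)):
    """Greedily pick layers (base layer first) that fit the link capacity."""
    chosen, budget = [], link_kbps
    for i, rate in enumerate(layer_kbps):
        if rate <= budget:
            chosen.append(i)   # layer 0 = base; the rest are enhancements
            budget -= rate
        else:
            break              # layers depend on earlier ones, so stop here
    return chosen

print(layers_to_fetch(1300))  # → [0, 1, 2]  (full quality)
print(layers_to_fetch(350))   # → [0]        (base layer only)
```

A fast client fetches all layers for full quality; a 350 Kb/s client fetches only the base layer and still gets a watchable, lower-resolution picture, which is the "request only a fraction of the blocks" behavior described above.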