1) We set up something like the Internet Archive, spidering and caching the network, run using donated bandwidth. The hosting could be centralized like the Archive or p2p itself (donated by peers).
2) A system like FileCoin's where you pay a tiny amount of money to ensure other peers will always host your repos on the network.
3) A straight-up centralized, for-profit service where you pay a company a monthly fee to host your data on the network, as people already do with GitHub itself.
Thanks for the correction. I guess you get a probabilistic "guarantee" in that you could see how many peers are serving your content at any given time, make assumptions about their independence from each other, and estimate the likelihood that your content will stay available?
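That probabilistic estimate can be sketched with a toy calculation. This assumes every peer stays online independently and with the same probability, which is an illustrative simplification, not a property of any real network, and the function name and numbers are made up for the example:

```python
def availability(num_peers: int, p_peer_up: float) -> float:
    """Probability that at least one of num_peers independent peers
    is still serving the content: 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p_peer_up) ** num_peers

# Example: 5 peers, each 90% likely to stay up.
print(availability(5, 0.9))  # 0.99999
```

In practice peers are not independent (shared hosting, correlated churn), so a real estimate would discount this number, but it shows why even a handful of replicas makes total loss unlikely.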