First of all, not all p2p networks operate like Tor. For example, BitTorrent and IPFS only host content you have looked at, so hosts can largely self-select the content they replicate.
Secondly, there are several tiers of content: a) stuff that is illegal to host; b) stuff that is not illegal but that you find so objectionable that you don't even want to host it; c) stuff that you don't like but that doesn't bother you too much; d) stuff you actually want to look at.
I posit that a) and b) are fairly small fractions, and the self-selection mechanism of "things that I looked at" will reduce those fractions even further.
And even if you are on a network where you randomly host content you have never looked at, encryption can provide some peace of mind (of the obliviousness kind), because you cannot know, and cannot be expected to know, what content you are hosting. Add onion routing and the person who hosts something can't even be identified: if Viewer A requests something (blinded) through Relay B from Hoster C, then B cannot know what it is forwarding and C cannot know what it is hosting. If neither you nor others can know what flows through or is stored on your node, it is difficult to mount pressure on anyone to disconnect.
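The A-through-B-to-C blinding can be sketched with nested encryption layers. This is a toy illustration of the layering idea only, not Tor's actual protocol; the keys and keystream construction are invented for the example and are not secure:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream derived from the key; for illustration only, NOT secure.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def layer(data: bytes, key: bytes) -> bytes:
    # XOR with the keystream; applying the same key twice removes the layer.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Viewer A wraps the request in two layers: inner for Hoster C, outer for Relay B.
key_b, key_c = b"shared-with-relay-B", b"shared-with-hoster-C"
request = b"GET /blinded-content-id"
onion = layer(layer(request, key_c), key_b)

# Relay B peels its layer but only ever sees ciphertext destined for C.
at_relay = layer(onion, key_b)
assert at_relay != request      # B cannot read what it forwards

# Hoster C peels the final layer and serves the (still blinded) content id.
at_hoster = layer(at_relay, key_c)
assert at_hoster == request
```

The point of the sketch is the asymmetry of knowledge: each hop can remove exactly one layer, so the relay learns nothing about the payload and the hoster learns nothing about the requester.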
For illegal content, especially in oppressive environments, you could install a Voluntary Compliance(tm) government blocklist on public-facing nodes, and still opt to run an internal node on your network that uses encrypted connections to retrieve things hosted in other countries that you're not supposed to see.
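On a content-addressed network, such a voluntary blocklist could be as simple as refusing to serve anything whose hash appears on a published list. A minimal sketch, with an invented blocklist and content; real lists would presumably be signed and distributed out of band:

```python
import hashlib
from typing import Optional

# Hypothetical published blocklist of content hashes (hex SHA-256 digests).
BLOCKLIST = {
    hashlib.sha256(b"forbidden document").hexdigest(),
}

def serve(content: bytes) -> Optional[bytes]:
    """Public-facing nodes refuse blocklisted content.  An internal node
    could simply skip this check and fetch over encrypted links instead."""
    if hashlib.sha256(content).hexdigest() in BLOCKLIST:
        return None  # voluntarily comply on the public interface
    return content

assert serve(b"forbidden document") is None
assert serve(b"ordinary page") == b"ordinary page"
```

Because the check keys on content hashes rather than URLs or keywords, the authority only has to publish hashes, and the node never needs to inspect or understand what it is declining to serve.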
----
Anyway, back to filtering for decentralized content hosting. I think once you have a network, it is a matter of managing expectations. You can't make content magically disappear. Platforms like YouTube, Twitter, Facebook, etc. have raised the false expectation that you can actually make things go away by appealing to The Authority, and they will be gone forever. In reality, things continue to exist; they just move into some more remote corners of the net. Once expectations become aligned with reality again, and people know they can only avoid looking at content but not make it non-existent, things boil down to being able to filter things out at the local level.
> And even if you are on a network where you randomly host content you have never looked at, encryption can provide some peace of mind ... If neither you nor others can know what flows through or is stored on your node, it is difficult to mount pressure on anyone to disconnect.
I think you misunderstand the objection. Yes, encryption can mean you cannot be prosecuted for "hosting"/"transmitting" some objectionable stuff, since you can prove that you had no idea (at least that's the theory).
However, some want to be able to "vote with their wallets" (well, "vote with their bandwidth"). They don't want to assist in the transmission of some content; they want that content to be hard to find, slow, and unreliable. They have the right to freedom of association and don't want to associate with those groups. Encryption cannot guarantee that I won't help transmit $CONTENT.
> First of all, not all p2p networks operate like Tor. For example, BitTorrent and IPFS only host content you have looked at, so hosts can largely self-select the content they replicate.
I'm aware of that, but then you run into the problem of people wanting deniability.
> Secondly, there are several tiers of content: a) stuff that is illegal to host; b) stuff that is not illegal but that you find so objectionable that you don't even want to host it; c) stuff that you don't like but that doesn't bother you too much; d) stuff you actually want to look at. I posit that a) and b) are fairly small fractions, and the self-selection mechanism of "things that I looked at" will reduce those fractions even further.
That's true, but those sets pretty much only need to be non-empty to threaten people's willingness to use such a network.
Further, unless there is stuff in a), and stuff that falls into b) for other people, that you want to look at, such a network has little value to most of us, even though we might recognise that it is good for such a network to exist for the sake of others.
This creates very little incentive for most people to actively support such systems, unless those systems also deal with content that we are likely to worry about hosting or transmitting.
> For illegal content, especially in oppressive environments, you could install a Voluntary Compliance(tm) government blocklist on public-facing nodes, and still opt to run an internal node on your network that uses encrypted connections to retrieve things hosted in other countries that you're not supposed to see.
That's an interesting thought: turning the tables and saying "just tell us what to block". That's the type of idea I think it is necessary to explore. It needs to be extremely trouble-free to run these types of things, because to most people the tangible value of accessing censored content is small, and the value of supporting liberty is too intangible.
> Anyway, back to filtering for decentralized content hosting. I think once you have a network, it is a matter of managing expectations. You can't make content magically disappear. Platforms like YouTube, Twitter, Facebook, etc. have raised the false expectation that you can actually make things go away by appealing to The Authority, and they will be gone forever. In reality, things continue to exist; they just move into some more remote corners of the net. Once expectations become aligned with reality again, and people know they can only avoid looking at content but not make it non-existent, things boil down to being able to filter things out at the local level.
This, on the other hand, I fear is a generational thing. As in, I think it will take at least a generation or two, probably more. The web has been around for a generation now, and in many respects expectations have gone the other way: people have increasingly come to see censorship as something possible, and are largely unaware of the extent of the darker corners of the net.
Centralisation and monitoring appear to be of little concern to most regular people. People increasingly opt to rent access to content collections where there is no guarantee the content will stay around, instead of ensuring they own a copy, and so keep making themselves more vulnerable, because to most people censorship is something that happens to other people.
And this means that most people both see little reason to care about fixing this problem and have an attitude that gives them little reason to support a decentralised solution that suddenly raises new issues for them.
Note that I strongly believe we need to work on decentralised solutions. But I worry that no such solution will gain much traction unless we deal with the above issues in ways that remove the friction of worrying about legality and morality, and that provide benefits tangible enough to give people a reason to want it even if they don't perceive a strong need of their own.
E.g. BitTorrent gained the traction it has in two ways: through copyright infringement, and separately by promising a lower-cost way of distributing large legitimate content fast enough. We need that kind of thinking for other types of decentralised content: at least one major feature that is morally inoffensive and legal, and that attracts people who don't care if Facebook tracks them or YouTube bans a video or ten, to build the userbase where sufficient pools of people can form for various types of content to be maintained in a decentralised but "filtered" manner. Not least because a lot of moral concerns disappear when people feel they have a justification for ignoring them ("it's not that bad, and I need X").
I genuinely believe that getting this type of thing to succeed is more about hacking human psychology than about technical solutions.
Maybe it needs a two-pronged attack. E.g. part of the problem is that the net is very much hubs and spokes, so capacity strongly favours centralisation. Maybe what we need is to work on hardware/software that makes meshes more practical, at least on a local basis. Even if you explicitly throw "blind" connection sharing overboard, perhaps you could sell people on boxes that share their connections in ways that explicitly allow tracking (so they can reliably pass on the blame for abuse) to increase reliability and speed, coupled with content-addressed caching on a neighbourhood basis.
Imagine routers that establish VPNs to endpoints, bond your connection with your neighbours', and maintain a local content cache of whitelisted non-offensive sites (to prevent the risk of leaking site preferences in what would likely be tiny little pools).
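The neighbourhood cache amounts to content addressing plus a whitelist. A minimal sketch, with invented site names and a hypothetical `NeighbourhoodCache` class:

```python
import hashlib
from typing import Optional

# Hypothetical whitelist of non-offensive origins the pool agrees to cache.
WHITELIST = {"example.org", "kernel.org"}

class NeighbourhoodCache:
    """Content-addressed cache shared by a pool of neighbouring routers.
    Only whitelisted origins are cached, so a small pool cannot leak
    individual browsing preferences for anything sensitive."""

    def __init__(self) -> None:
        self.store: dict[str, bytes] = {}  # sha256 hex digest -> content

    def put(self, origin: str, content: bytes) -> Optional[str]:
        if origin not in WHITELIST:
            return None               # never cache non-whitelisted origins
        key = hashlib.sha256(content).hexdigest()
        self.store[key] = content     # address by content, not by URL
        return key

    def get(self, key: str) -> Optional[bytes]:
        return self.store.get(key)

cache = NeighbourhoodCache()
key = cache.put("example.org", b"<html>mirror of a popular page</html>")
assert cache.get(key) == b"<html>mirror of a popular page</html>"
assert cache.put("sketchy.example", b"anything") is None
```

Addressing by hash means any neighbour holding the same popular page deduplicates automatically, while the whitelist keeps the cache's contents boring by construction.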
Give people a reason to talk up solutions that flatten the hub-and-spoke topology, and use that as a springboard to start making decentralisation of the actual hosting more attractive.