So many communication platforms are proprietary walled gardens these days: Facebook, Twitter, Slack, Discord, and so on. Their incentives do not align with what's best for their users, or for society in general. We absolutely need open solutions to become the dominant medium for communication.
If Matrix and Mastodon become as easy to use as their proprietary counterparts, they could undo much of the harm done by our culturally divisive "social" networks.
> they could undo much of the harm done by our culturally divisive "social" networks.
Or it could be worse. Meta and Twitter spend a lot of resources to moderate content for a reason. I don't see how a decentralised model could solve the problem of content moderation.
They spend a lot moderating because their platforms largely allow anybody to spam or harass anybody else with no barriers. I suspect the lack of compartmentalization, of the kind Reddit or Mastodon have, comes down to money: fewer barriers mean greater engagement, but also more potential for abuse.
Email with default-deny for unknown addresses solves this problem entirely. There's no reason other protocols couldn't do the same. It's only a problem if you need to host content from any rando, then try to show it to as many people as possible to drive "engagement".
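To make the default-deny idea concrete, here's a minimal sketch (hypothetical names, not any real protocol's API): messages from senders the user hasn't approved get quarantined instead of delivered, and approving a sender releases anything already held from them.

```python
# Sketch of a default-deny inbox: messages from unknown senders are
# quarantined until the recipient explicitly allows the sender.
# All names here are hypothetical, not taken from any real protocol.

class Inbox:
    def __init__(self):
        self.allowed = set()   # senders the user has approved
        self.messages = []     # delivered messages
        self.quarantine = []   # held messages from unknown senders

    def allow(self, sender):
        self.allowed.add(sender)
        # release anything already held from this sender
        released = [m for m in self.quarantine if m[0] == sender]
        self.quarantine = [m for m in self.quarantine if m[0] != sender]
        self.messages.extend(released)

    def receive(self, sender, body):
        if sender in self.allowed:
            self.messages.append((sender, body))
        else:
            self.quarantine.append((sender, body))

inbox = Inbox()
inbox.receive("stranger@example.net", "hi, buy my coin")
inbox.receive("friend@example.org", "lunch?")
inbox.allow("friend@example.org")
inbox.receive("friend@example.org", "1pm?")
```

The stranger's message never reaches the main inbox unless the user opts in, which is exactly the barrier that kills drive-by spam and harassment.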
It's worth checking out the approach being considered by Matrix[0] which should make the solutions scalable and thus affordable. Empowering users and communities to distribute and delegate their moderation decisions seems like it will enable more innovation, rather than users being stuck with a "one size fits no one" regime.
No, not at all. If Reddit were to adopt this model, then anyone could act as a moderator for any subreddit, publishing their modding decisions and letting everyone else opt in to applying those decisions to the posts and comments that they see.
I suspect that if someone's modding decisions could be ignored at the click of a button, there would be less temptation for them to abuse their power, and the role of moderator would no longer attract people who want to force their view onto others.
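The opt-in moderation model described above can be sketched in a few lines (a hypothetical data model, not Reddit's or Matrix's actual one): moderators publish sets of removal decisions, and each reader picks which sets to apply to their own view.

```python
# Sketch of opt-in moderation: anyone can publish a list of removal
# decisions, and each reader chooses which moderators' decisions to
# apply to their own feed. Names and post IDs are made up.

mod_decisions = {
    "strict_mod":  {"post2", "post3"},   # posts this moderator would hide
    "lenient_mod": {"post3"},
}

def visible_posts(all_posts, subscribed_mods):
    hidden = set()
    for mod in subscribed_mods:
        hidden |= mod_decisions.get(mod, set())
    return [p for p in all_posts if p not in hidden]

posts = ["post1", "post2", "post3"]
# a reader who trusts only lenient_mod still sees post2;
# unsubscribing from a moderator instantly undoes their decisions
```

Because unsubscribing is a no-op on the underlying data, "ignoring a moderator at the click of a button" really is that cheap.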
The more "select few" there are, the better? With enough echo chambers you are just hearing yourself. Facebook is in the hands of Zuck.
The WWW exists and allows a "select few" to publish, and IRC/BBS/group chats are echo chambers, but these technologies have not done the damage that social media has. Social media is doing something wrong.
One important aspect of this system needs to be the ability for communities themselves, not other mods or admins, to vote to add or remove the moderators of their respective communities. Otherwise, power will concentrate amongst the moderators of popular communities, which will undoubtedly lead to abuse, as we have seen on Reddit.
1/ Moderators aren't always aligned with the community (see recent /r/antiwork scandal).
2/ Reddit does not pay moderators for the labor they provide. Moderation is basically a job that takes hours of time, and Reddit profits off the back of that free labor.
This is the Mastodon model! Your instance’s admin can enforce whatever rules she desires, and ban entire remote instances for breaking them.
For instance, if my instance has a rule that “nothing related to feet may be posted on No-Feets Friday”, I might decide to block foot.celebration for being a hotbed of constant footposting regardless of the day. If you disagree with my choice, there’s a ton of other instances out there you could move to, or go get foot.party and start your own. I might block these Friday-Foot-Friendly instances as well; if enough of my users decide they want to talk with people on those instances then I might suddenly find myself with everyone leaving. If sentiments are widely split on the subject of Feet On Friday then we might end up with two groups of instances that largely don’t federate with each other over this matter.
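The instance-blocking mechanic described above boils down to a per-instance blocklist of remote domains; here's a minimal sketch (domain names made up for illustration, not Mastodon's actual internals):

```python
# Sketch of Mastodon-style instance blocking: an admin blocks whole
# remote domains, and posts from accounts on those domains never
# reach local users. Domains here are invented for the example.

blocked_domains = {"foot.celebration", "foot.party"}

def accepts(author):
    # federated handles look like user@instance.domain
    domain = author.split("@")[-1]
    return domain not in blocked_domains

federated_timeline = [
    "alice@friendly.example",
    "bob@foot.celebration",
    "carol@foot.party",
]
local_view = [a for a in federated_timeline if accepts(a)]
```

Two clusters of instances that each block the other's domains end up exactly as described: federated internally, invisible to each other.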
My favorite subs that I keep going back to are very lightly moderated. I have rage quit many subs I used to really enjoy, purely because of absurd and unnecessarily heavy-handed moderation. In such subs, mod comments literally outnumber on-topic ones, most comments get removed, and most threads get closed. But what you see isn't the clean result of vigilance; it's all the artifacts of the mod carnage, like heads on pikes by the road. The subject and content of these subs is the mods.
Do you not expect people to self select into the same divisive social networks no matter the communication platform that they are using? How does offering yet another platform for them to do so undo any of the harm already caused?
Social networks' most important metric is "engagement": the number of minutes spent on the site. And it turns out that it is human nature to engage with content that is "dangerous", so social networks have optimized to show divisive content.
A free social network will work to improve the happiness of its users... which will include measures like limiting the number of hours you spend on the site itself.
(Like HN does by the way)
Or like I'm in NYC today, who wants to meet? Let's have a real interaction
This sounds like "people eat too much fast food, so we will invent vegetables." Surely better, but most people won't care. The hard part is gamifying what is good.
Problem #1 with modern social media is pervasive recommendations. You can't just follow a bunch of people you already know and be done with it. With a few exceptions, the platform itself will incessantly try its damnedest to expose you to content from outside of your network, because that's what drives metrics.
People can't be harmed by seeing updates from their friends and from communities they deliberately chose to follow. It's very important that every single post you see in your feed is a direct consequence of your informed decision to follow its author.
I don't think there's much hope for people that go out of their way to insulate themselves into echo chambers. The key is removing the economic incentive for corporations to deliberately build these echo chambers and set them against one another. It's not that open federated systems will solve all problems, it's that a number of these problems will cease to be exacerbated. This may sound like a small thing, but at scale, the impact is enormous.
> If Matrix and Mastodon become as easy to use as their proprietary counterparts
The problem is not primarily about usability, but about network effects and economic incentives. First, it's really hard to break the dominance of existing social media platforms, and they will never voluntarily adopt open standards. Second, I fear that if a platform becomes big enough, they have an incentive to close down their standards because, you guessed it, cash money.
Therefore this needs to be regulated. The EU is taking a good first step with the Digital Markets Act, and just a couple of days ago decided that big messaging services need to open up their APIs. Now, this is not the perfect solution, but a step in the right direction.
Slack and Discord are just hosted community platforms; I wouldn't put them in the same category as Facebook/Twitter. Slack/Discord/IRC communities have intentionally curated walls for specific purposes, just as we do in the real world, and I think that is fine.
Would love to hear thoughts to the contrary, however. It's worth discussing, as this is a big problem for an internet hitting critical mass in participation.
And they never will, for the same reason email, IRC, forums, and every other form of communication gets gobbled up and surpassed by private companies.
Open source software isn't driven by a dictator with a better vision; software projects run by good dictators will always win out against open source projects built and designed by committee.
On the contrary, Matrix and Mastodon give more freedom to their users. Which will mean, unfortunately, more fake news, more echo chambers, and in general more decoupling of online reality from real life.