Every time I see social media recreated in a decentralised fashion I realise why the successful ones are centralised. The loading times of this are dreadful and I don't see how you'll be able to solve that problem when you're relying on P2P.
In any case, it sounds like we'll be getting many new Reddit clones so I may as well plug what I've built recently: a Reddit alternative API that is free. Check it out if you're a dev that is being caught out by the Reddit pricing changes[1].
Basically the difference between connecting to datacenter-residing apps that internally share data with each other and connecting to 10 random guys' basement PCs.
Everyone who has played a P2P multiplayer shooter knows P2P sucks... on the other hand, dedicated-server-based shooters and MMOs (generally) work great. You don't have to be an IT professional to know this. Similar concepts (speed of light and the multiplicative effects of multiple slow connections) apply here.
P2P is only great for uses where latency doesn't matter. See: Torrents.
That's too broad of a generalization. p2p is AWESOME for latency, since every server added between peers adds latency.
What you are probably talking about is p2p networks with many users (based on your shooter reference), and the problem there is home connections and bandwidth, since a full mesh where every user connects to everyone else doesn't scale.
So for your example:
One-on-ones over p2p are optimal, but p2p matches with more than 8 players are suboptimal (I pulled that number out of my ass; it depends on the protocol and use case)
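To put rough numbers on that (a sketch, with made-up latencies):

```python
# Illustrative sketch, not from the thread: why full-mesh p2p stops
# scaling past a handful of players, while a one-on-one p2p link can
# beat a relayed (server-based) connection on latency.

def mesh_links(players: int) -> int:
    """Number of direct connections in a full-mesh p2p game."""
    return players * (players - 1) // 2

def relayed_latency(a_to_server_ms: float, server_to_b_ms: float) -> float:
    """One-way latency when traffic must bounce through a server."""
    return a_to_server_ms + server_to_b_ms

# A 1v1 p2p game is a single link...
assert mesh_links(2) == 1
# ...but 8 players already need 28 links, each eating home upstream bandwidth.
assert mesh_links(8) == 28

# Direct p2p between nearby peers (e.g. 20 ms) beats relaying through a
# distant server (e.g. 40 ms each leg); the latencies are invented.
direct_ms = 20.0
via_server_ms = relayed_latency(40.0, 40.0)
assert direct_ms < via_server_ms
```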
It is both decentralized and distributed. It's not P2P though, it's a hub-and-spokes architecture with several interconnected (federated) hubs.
Hubs can have decent uptime and thick internet connections. Hubs are relatively few so they replicate state quickly between themselves, and then more locally between spokes when they come online.
Most natural structures are more tree-form, like that.
That just depends on how you define "distributed". If it means "running on multiple machines", then even centralized protocols are distributed, since a part of the computation is running on your computer.
In context of protocols like Mastodon, if the end-user devices aren't primary holders of data, then I don't call it distributed. It's just decentralized. I guess that way, "distributed" necessarily implies peer-to-peer.
Mastodon page: requires unblocking JavaScript, takes one second just to load the basic layout (without post contents), then takes four to ELEVEN seconds to fully load
Weird. Anecdotal I think. Maybe something with your instance. Mastodon UI is quick and snappy for me. I migrated away from mastodon.social when the influx from Twitter made that server very slow (it has since scaled up significantly).
With an empty cache, five seconds for the full layout, one further second for images. With a populated cache, four seconds to load everything at once. And also requires unblocking JavaScript for no good reason. Sure, not as bad as twitter.com, but that’s damning with faint praise.
I thought Twitter was horribly slow just because of the insane amount of JS and HTML it uses for displaying some simple text. I've always avoided clicking on Twitter links because it's so ridiculously slow.
Why not a bittorrent style peering mechanism? Popular content will always have pools of people who are consuming it.
I've had an idea in the back of my head for what would make a decentralized video service work, and it plays with the idea of peer driven networks.
So, there are a few kinds of people on the network: advertisers (hang in there, I'll explain why), leechers (or consumers), content producers, and seeders.
The prime relationship is between a consumer and a content producer. Every other part of the network is designed to support and benefit their interaction.
First, a content producer creates a piece of content (or steals it, let's be realistic) and publishes it on the network. Attached to the content is a pay-per-view requirement; the producer has elected to require payment to view this content. This money can come from one of two places: first, an advertiser can bid on slots the producer has chosen; second, the consumer can pay the producer directly. The producer doesn't have a choice in whether the consumer pays or not, only that payment is required.
Requiring payment for the content is optional, but advisable. You see, the producer has a second order relationship with the seeders. The producer can't possibly feed a million viewers in the first hour. That would be a very difficult task. But what a producer can do is elect to provide a percentage of their required income to seeders. Now there's a financial benefit to seeding a popular video/song. The producer can set their payments to whatever they want. The market may equalize itself.
Finally, we have advertisers, which are the monetary fuel for the fire. Most viewers will tolerate ads and won't want to pay money; so instead they pay with time. An advertiser can bid on slots in as precise or general a way as they want. If they want to bid on a specific creator's video slots, they can. If they want to bid on a broad category of creators or videos, they can. Etc.
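To make the money flow concrete, here's a toy sketch of the split I have in mind (all names and numbers are made up; nothing here is a real protocol):

```python
# Hypothetical sketch of the payout split described above: a producer
# sets a per-view price and elects to share a cut with seeders, so
# seeding a popular video becomes financially worthwhile.

def split_payment(price: float, seeder_cut: float) -> tuple[float, float]:
    """Return (producer_share, seeder_pool) for one paid view."""
    assert 0.0 <= seeder_cut <= 1.0
    seeder_pool = price * seeder_cut
    return price - seeder_pool, seeder_pool

# Producer charges $2.00 per view and shares 30% with seeders.
producer, seeders = split_payment(2.00, 0.30)
assert abs(producer - 1.40) < 1e-9
assert abs(seeders - 0.60) < 1e-9
```

Whether the view was funded by the consumer directly or by an advertiser's winning bid, the split works the same way; the market sets the `seeder_cut` that attracts enough seeders.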
This won't ever be truly decentralized, and there is a whole ecosystem of services external to the network that would be needed for this to function well (payment processors, content collation and moderation -- kind of like how mastodon works, a bidding marketplace for advertisers to interface with the content network, a front end for seeders to select and re-seed, and so on).
I don't have the technical chutzpah to build out something like this. I wish I did. Is there someone out there who wants to be a technical mentor? I'd be game.
--
[edit] There should be varying granularity in how different parties engage. Some people want to pay 10 bucks a month (Amazon Prime) and forget about the rest. Some people want to pay per view. Advertisers might want to target specific creators' channels or whole segments of created content. Creators might want to never even think about ads and just check a box that says "put an ad in front of every video I upload" and then forget it. Or a creator might want to put ads only in videos longer than a certain length. And so on. What I've specified above is loosely conceived and would need a lot of thought to turn it into something useful.
Decentralization is a meme, the great majority of consumers don't care. People used Morpheus because it was an easy way to watch (free) movies, not because it was decentralized. Now they use Netflix because it's an easy (and relatively cheap) way to watch movies, not because it's centralized.
Besides, Matrix already has p2p chat, IRC has always been a thing, and so on, heck even BitTorrent is technically still alive. (De)centralization is not a feature, nor is it a bug, it's simply irrelevant.
Hopefully, for the consumer it would be no different than YouTube or Netflix. Unless you don't want ads, then you pay your dues.
I have no idea how the experience would vary for the advertisers.
For the producer it should offer more flexibility in how they get paid. This is the major selling point, I think. The ability to put out a high-effort video -- say you made a full-length movie -- charge theater prices ($17.50, or close to it), and make sure the ads hit at the appropriate times? It might be worth it for larger creators to swap over.
Wow, as a consumer I am really sold on this model! You tell me I will pay theater prices and get ad interruptions throughout the show? This is fantastic, sign me up. /s
I apologize. I guess I miscommunicated something. I mean that a creator can opt to charge what they want and also turn off ads if they choose. Or front load the ads, like theaters do. Really, it is up to the creator how they want to format their content.
How does your unofficial API work? Is it using the official API with caching, or fetching the website and transforming it into JSON? I'm just curious.
That doesn't sound like it's going to be a stable API? I'm not sure there is much you can do though.
Edit: yea after looking at this, it definitely appears to violate the TOS. Also this seems like a huge red flag just begging reddit to shut it down: "Unlike with the Reddit API you do not need to authenticate using OAuth."
IANAL but my understanding is that violating the ToS is only ever a problem if your crawler decides to "sign-in", as that constitutes agreeing to the terms of the ToS.
Further, old.reddit.com doesn't gate any content (even NSFW stuff) behind a sign-in page (at least for now)
> IANAL but my understanding is that violating the ToS is only ever a problem if your crawler decides to "sign-in"
Where on earth would you get that impression?
From the reddit ToS: "To use certain features of our Services, you may be required to create a Reddit account (an “Account”) and provide us with a username, password, and certain other information about yourself as set forth in the Privacy Policy."
NB: "certain features" -- that clearly means some features are not gated behind sign-in, and that the ToS also applies to them.
Also from the ToS: "
Except and solely to the extent such a restriction is impermissible under applicable law, you may not, without our written agreement:
license, sell, transfer, assign, distribute, host, or otherwise commercially exploit the Services or Content;
modify, prepare derivative works of, disassemble, decompile, or reverse engineer any part of the Services or Content; or
access the Services or Content in order to build a similar or competitive website, product, or service, except as permitted under the Reddit API Terms of Use."
And later: "Access, search, or collect data from the Services by any means (automated or otherwise) except as permitted in these Terms or in a separate agreement with Reddit (we conditionally grant permission to crawl the Services in accordance with the parameters set forth in our robots.txt file, but scraping the Services without Reddit’s prior consent is prohibited)"
My point wasn't that scraping reddit is not a violation of the ToS, it was that you're not able to legally enforce the terms of the ToS (that you have quoted in your reply) on people who haven't agreed/consented to them (which they do by logging in).
Even if I were to agree with your interpretation (and I absolutely do not), this is still plain jane mass copyright infringement. The submitters have given reddit a sublicense to publish their content, not random 3rd parties.
> The submitters have given reddit a sublicense to publish their content, not random 3rd parties.
This is tangential no? Third party reddit apps _already_ republish end user content.
I'm not saying third party apps powered by scraping are not illegal, I'm saying they're _no more_ illegal than those powered by the official reddit API.
Question: if someone, using the internet, gains access to your personal computer, but doesn't add anything or delete anything, have they committed a crime?
Interesting. I just did a quick read. The CFAA protects computer data and is limited to data in which the federal government has a legal interest; data of financial institutions; and perhaps some additional specifically enumerated parties. Its reach is further limited to the theft and subsequent use of data that causes specified types of harm.
So, while it almost certainly would not apply to our personal computers, I think it would probably apply to most commercial companies (provided they were engaged in interstate commerce and provided the data was used in a manner explicitly enumerated by the statute)
Impressive feat! Does reddit have rate limiting, or other hurdles in place similar to the hoops youtube-dl has to jump through? Curious what your thoughts are about maintaining a project like that.
As history has shown, you can only do so much to stop this. If you perfectly mimic the GoogleBot and use google IP ranges by hosting on google cloud, they either take an SEO hit or let you bot them at the end of the day. GoogleBot looks like a DDoS attack a lot of the time too
You can also go the route of looking like a pool of users; then it's just a game of cat and mouse, and one that providers don't really have time to play
Not at the minute. I might open source it in the future, especially if I get a cease and desist. Though hopefully even if it comes to that someone like the EFF will help me fight it.
When visiting the link, Chrome suggested I might have meant to visit "reddit"; I've never seen that before. I have Safe Browsing disabled, so I'm at a loss as to where this comes from.
https://i.imgur.com/l57osTi.png
For the GET /r/:subreddit/comments/:article endpoint, is there a limit to the number of comments returned? If so, how is it determined which ones get returned? Sorted by top? Best?
Yeah, currently the limit is 50 comments. I plan to implement the ability to choose a sort and limit, assuming the API actually gets some usage. Right now the sort is whatever the new.reddit.com default is.
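For illustration, a request URL for that endpoint would be built like this (the base URL below is a placeholder, since the API's real host isn't given in this thread):

```python
# Hypothetical sketch of calling the comments endpoint discussed above.
# BASE_URL is a placeholder, not the actual API host.

BASE_URL = "https://api.example.com"

def comments_url(subreddit: str, article: str) -> str:
    """Build the GET /r/:subreddit/comments/:article request URL."""
    return f"{BASE_URL}/r/{subreddit}/comments/{article}"

url = comments_url("programming", "abc123")
assert url == "https://api.example.com/r/programming/comments/abc123"
# Per the author's comment: at most 50 comments are returned, using the
# new.reddit.com default sort; sort/limit parameters are planned.
```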
Reddit may not be a public website for long. They push their app and ask for logging in so often, it wouldn’t surprise me if they shut off parts of the site for not logged in users.
They can still ban your service, e.g. block your IPs. Sure, you could then play a cat and mouse game with them, but your API clone going down every few days will make it unusable for anyone with a real purpose. The companies that reddit (officially) wants to target, e.g. OpenAI et al., have no problem scraping.
I don't think technology is the problem here. Rather, it is the evergreen gullibility of humans, showing their willingness to trust their communities and memories and volunteerism to corporations which then seize control of everything.
We had a P2P and open source Reddit alternative 40 years ago, called Usenet. But people fell for a centralized one run by a company once it came along, like they always do. May Reddit's fate end up looking something like Digg's and AOL's.
I agree with your main point: tech isn't the problem here.
But a minor quibble: USENET wasn't exactly peer-to-peer. I mostly got to USENET through my university or ISP's NNTP servers. If I wanted to start a new group and get you involved, I would've had to set up my own NNTP server for you to connect to, using either your NNTP client or your own NNTP server.
A lot of people seem to believe that centralization is the problem, and ignore Wikipedia, which has been true to its mission far longer than Reddit or Twitter have, and it never needed a decentralized model. The simple explanation is that it's the profit motive that is the problem. That can be fixed with the proper organizational structure.
I wish the Mozilla Foundation would stand up Reddit and Twitter clones. I bet they would be more usable on Day 1 than the Fediverse solutions.
I couldn't agree more - the profit motive brings a whole bundle of perverse incentives.
Well-managed centralization does allow for governance, where things like moderation and technology decisions are difficult to leave up to users. It's sad what happened to the public square of Twitter. For all its problems, there really isn't a viable alternative at the moment with the reputation of a neutral ground where people can exchange discourse with a relatively even chance of being heard.
I spent a lot of time on Wikipedia back in the day. It is monstrously screwed up too, less by the profit motive per se than by innate bureaucratic urges, and (in the Wikimedia Foundation's case) the desire to control a bunch of money and direct it to one's pet projects, which is the next best thing to getting one's own hands on the money. Besides money, the other corrupting substance is Google search rank, which leads to a single article repo that everyone wants to control. The search rank attractor stops the editing community from splitting off. The profit motive is a particular form of status seeking, and one of very potent influence, but there are other forms too, and Wikipedia has those a-plenty.
Wikipedia's original vision was 1) to free the content and the communities for forking as well as copying, and 2) to give everyone in the world a free encyclopedia in his or her own language. I have a big rant about that but the quick version is that Wikipedia.org (the web site, not the project) should have been designed like Github, so you would use it for editing and creating forks, but (mostly) not for reading articles. Instead, whenever you buy a new consumer PC, through deals set up by the WMF, Wikipedia in every language should already be on its hard drive (you can delete it if you want the space instead). You'd read articles locally instead of through a privacy invading web site and internet surveillance apparatus, you'd get updates through a nightly equivalent of git pull that got all the changes with no indication of what you had been reading, you could downstream your copy to your friends or followers, you could make a fork and downstream that if you don't like Wikipedia's policies, etc.
I agree with you that Wikipedia doesn't have the exact same problems as Reddit. It is arguably not as bad as the current version of Reddit (we're here right now because Reddit is about to get much worse), but it has its own problems, spreads propaganda that is taken more seriously than Reddit posts are, etc.
Come to think of it, we also had a perfectly good distributed, peer to peer, FOSS source control system called Git, but it got mostly supplanted by a centralized "hub" now run by Microsoft, the original closed source empire. It never stops.
And to get there first: please don't bother saying that forking Wikipedia is permitted under its licensing, since that is near irrelevant in practice. It could and should be made simple and easy. Wikipedia's main value these days seems to be AI training rather than informing humans, which was supposed to be the point of an encyclopedia.
I love the idea of built-in offline Wikipedia, huge plus.
Coming back to your first point... how often would you maintain a fork of your own encyclopedia? Even with Git, you would fork mostly in order to submit a PR to the original repo. I'm just trying to understand the use-case here since I cannot imagine a situation where I would fork Wikipedia and make my edits to be shared with my friends only and my friends would do the same.
Did you mean something like fandom.com where communities (not individuals) can start their own wikis? I'd also love that.
That would have been a good space for experimentation if the Wikipedia monoculture hadn't stopped it from happening. One idea is you could mark specific articles or categories as forked, so that your changes in those wouldn't get pushed upstream. Overall it might be something like the Linux kernel, where there are tons of forks out there for particular device families or whatever.
As a Wikipedia example, look up Joy Milne (the lady who can smell Parkinson's disease) with a web search, then look her up on Wikipedia. Wikipedia says nothing or almost nothing. That's not an oversight, that's Wikipedia bureaucracy in action. You might be able to get an edit through by battling the bureaucracy for long enough, but if you have an interest in this illness and wanted to gather and share info about it, it would be simpler to just fork the Parkinson's article. Et cetera.
Perhaps now is the right time to revamp Usenet to bring it up to par with the Reddit of today. I am also thinking that email / mail lists, forums, and even old protocols like finger and gopher are all worthy of upgrades, and these were all small parts of a true proven p2p, decentralized, self-hosted social networking environment and may even enjoy a new popularity, if developers work on modernizing them.
Agree tech isn't the problem per se, but "fell for" implies there wasn't a value proposition. The reason each successive platform became successful was because it did something better than its predecessors, and at the right time.
It's weird to use the name "plebbit" considering that's the derogatory term for Reddit used by 4chan folk. It's like begging for this to be a dumping ground for some of the worst human behaviors.
Does the decentralized nature mean that illegal content can't be easily be removed?
You'll also notice that the logo makes the face from the "NPC" memes that would have still been popular among the 4chan "politically incorrect" crowd right around the time this project was allegedly started. Sadly, for enterprising edgelords, Elon has usurped them all by turning Twitter itself into the ultimate "alt-tech" app. Like, seriously, unless you want to share illegal pornography, why not just share your spicey maymays on Twitter where they'll get more reach? Impressive amount of follow through regardless.
> It's weird to use the name "plebbit" considering that's the derogatory term for Reddit used by 4chan folk. It's like begging for this to be a dumping ground for some of the worst human behaviors.
I think that's the obvious intention. Just scroll down the page to see what the user base is.
I recommend the parent and all sibling commentators carefully re-evaluate their biases in light of this project as a singular topic. It will greatly benefit the understanding of context (somewhat unintuitively, the terminology is adopted as lingua franca from a reactionary anonymous amalgam, and is not necessarily tied to the original intent or event).
If the initial reaction from multiple people is “this is a name that has some pretty bad implications” then maybe, just maybe it’s the name that’s the problem, not everyone reacting to it.
This would have easily been avoided by… choosing a name that doesn’t have such a storied past.
The name will set a vibe that attracts a specific crowd. Since it's a new site, that initial user base will set the tone for what the website shapes into.
People are arguing about the contents, which is kind of shallow; that can always be managed in some way later. I'm more interested in the technicalities: what's the protocol, the architecture, how are files saved and served, are they encrypted? What's the minimum X to run the service -- is it P2P as in all clients serve, or only a select few nodes? Who manages and operates these nodes? Is it a honeypot? Among others.
Regardless, It’s always good to see new concepts or alternatives sites, just like old days with forums.
There is a disdain for federated protocols but I hardly see how this is any different except that there are no explicit federation features.
The "subplebbit" owner acts like the operator of a Mastodon or Matrix instance except as mentioned above, those were designed to aggregate data across instances.
The fundamental problem with these decentralised platforms is that they want to get rid of the human element. But if you just let anyone get away with anything and everyone rehosts everyone else's data unconditionally, then you get something like ZeroNet, which has its own Reddit clone. The problem with ZeroNet is that people advertise their illegal porn on your supposedly "family friendly" Reddit clone, and your users involuntarily download it and make themselves criminally liable. Whether something should be legal or not is a philosophical question; someone has to make a decision and then enforce it. Any platform where users are allowed to upload arbitrary information requires moderation. There is no way around this. The only benefit of a decentralized app is that you can have smaller self-hosted communities.
I spent a lot of time evaluating the technical viability of decentralised apps and honestly you should avoid it even if your use case requires it.
>The "subplebbit" owner acts like the operator of a Mastodon or Matrix instance
- Mastodon and Lemmy instances can delete your account/community data (they own your data); on plebbit you own your data, it's on your device and you seed it P2P, and community owners can't delete it
- Mastodon and Lemmy instances can block you from accessing and interacting with other instances through their client, which forces you to use multiple clients to interact with all the instances you want; on plebbit, the client accesses all content directly over p2p, and community owners can't block you from accessing any content. You only have to use a single client.
- Mastodon and Lemmy instances require a domain name, a static public HTTP endpoint, an SSL certificate, and DDoS protection to run. All of which are complicated to set up, cost money, sometimes require KYC, and sometimes require the command line and Linux. Your server, domain, SSL, and DDoS protection providers are intermediaries that can delete/block your account.
Whereas plebbit is a GUI executable like a bittorrent client: you download it, open it, and that's it, you're done. No payments, no KYC, no command line, no config, no intermediaries that can shut down your account.
Ultimately a young site lives and dies on community, so I understand why. The average person looking for a site to talk on isn't concerned about the tech stack underneath, just that the experience is smooth.
Are there any settings to make it not look like new reddit? If I were building a new site, I wouldn't copy the godawful empty-space design they forced on everyone.
Thanks for the info! I'm intrigued by the different interfaces for the same content. I'll definitely keep an eye on this when reddit dies in a couple weeks!
Well, the FAQ at [1] states that a different UX is not a goal:
> The goal is to recreate Reddit exactly, the exact same UX, except without servers, DNS, global admins, global moderators, lawyers, regulations, corporate greed, etc.
I don't think full P2P or Lemmy-style federation is the right way to make a decentralised Reddit alternative. P2P systems aren't great for mobile devices that can't/shouldn't be full peers, not to mention moderation issues. And ActivityPub-style federation where accounts belong to one instance and content is copied across instances has many problems, which a Reddit alternative needn't have.
Reddit content is already perfectly stratified into subreddits, so I think anything that's primarily a decentralised Reddit alternative should use the same principles. Instances should host subreddits with floating user identities, and content from one instance should always stay on that instance. This solves things like moderation (it basically stays the same as on Reddit, but you get to choose your instance) and legal content attribution. In a way, it's a continuation of the "independent web forum" model that Reddit supplanted.
What you're describing is actually very much like Lemmy. Communities are on a specific server, you just use federation to post from your account (instead of creating a local account every time) and as not much more than a frontend. Your local instance just provides your customized browsing interface and identity and talks to the server hosting the community over the API.
I'll admit I don't know much about the specifics of Lemmy, so I assumed it has the same issues as Mastodon. There's still the issue of defederation? And notifications (new post/reply) are sent to the home instance of every subscribed user and include content, right?
I think it would be better to send only small "something's new" push notifications, having the client reach out for any data or even metadata. Also, having user identities be cryptographic keys could solve the defederation problem, but I understand that the user experience would be very atypical. Restricting moderation drama to one particular instance is to me a very important feature of decentralised systems, and the possibility of being excluded from the larger community over such drama is a serious fault, even if that's probably less likely with Lemmy.
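A toy sketch of what I mean by thin notifications (the names and shapes are invented, not real Lemmy or ActivityPub types):

```python
# Illustrative sketch of the "something's new" idea: the push
# notification carries only an opaque ID, and the client pulls the
# content (and metadata) itself when and if it wants to.

from dataclasses import dataclass

@dataclass
class PushNotification:
    kind: str       # e.g. "new_post", "new_reply"
    object_id: str  # opaque reference; no content or metadata inside

# Stand-in for the instance hosting the community; a real client would
# fetch this over the instance's API instead.
CONTENT_STORE = {"post:42": {"title": "hello", "body": "world"}}

def on_notification(note: PushNotification) -> dict:
    """Client reacts to a thin notification by pulling the content."""
    return CONTENT_STORE[note.object_id]

post = on_notification(PushNotification("new_post", "post:42"))
assert post["title"] == "hello"
```

The point is that the home instance never has to receive or store the content itself, only a pointer to it.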
I don't know why but this somehow taxes my CPU more than the official Reddit site, which is kind of impressive in a way.
The theme itself is a good Reddit replica, but I don't think IPFS has the necessary performance to run apps like these. Maybe it can live on as a Lemmy frontend for the FOSS/fediverse crowd?
The number of third party domains this site needs to hit to even load any posts on the front page is insane. I'm looking at a list of domains like "etherscan", ipfs gateways, pokt... all kinds of shady sounding bullshit and it isn't at all clear to me what I would need to whitelist to get this site to work: http://0x0.st/Hbkj.jpg
Yes I know this is "my fault" for being a weirdo who doesn't blindly let his browser hit any server a website pleases. But I use this to judge the quality of a website, the care and professionalism that was put into it. This plebbit site is awful by this metric even compared to new reddit; with new reddit merely whitelisting *.reddit.com is sufficient to view a list of posts, and whitelisting 3 more domains (all clearly reddit-affiliated: redd.it, redditmedia.com and redditstatic.com) is enough to make the site behave as it should.
plebbit doesn't require you to sync an entire ledger, it uses content addressing (IPFS), which lets you only download and seed the content you use, not the entire network. A plebbit "full node" is just an IPFS node (similar to a bittorrent client), it takes no space and has no sync time.
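To illustrate what content addressing buys you (a simplified sketch using a raw SHA-256 digest; real IPFS CIDs use multihash/multibase encoding, which this deliberately skips):

```python
# Simplified illustration of content addressing: the identifier is
# derived from the data itself, so any single item can be fetched and
# verified independently, with no ledger and no full-network sync.

import hashlib

def content_address(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

store: dict[str, bytes] = {}

def publish(data: bytes) -> str:
    addr = content_address(data)
    store[addr] = data
    return addr

def fetch(addr: str) -> bytes:
    data = store[addr]
    # Anyone can verify the data matches its address; no trust needed.
    assert content_address(data) == addr
    return data

addr = publish(b"a single post, not the whole network")
assert fetch(addr) == b"a single post, not the whole network"
```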
It's just a bunch of IPFS and ETH gateways. I haven't tried it myself, but I think there's a desktop application you can install that connects to IPFS natively. I don't know if they're running their own ETH node, but probably not.
I use NoScript, and while normally I'd agree with you about the number of requests to third-party domains being an inverse indicator of quality, I think in this case it just means the client application is marshalling its own connections to a broader p2p system without a fancy experience API layer.
Musk claims the only solution to bots in 2023 is paid accounts.
I kinda agree with him. There is no captcha that's going to keep out determined spammers. Even reddit is filled with bots.
You either try to eliminate bots through paid accounts, real ID verification... Or you filter based on quality of content (ie. bots are allowed to post, as long as what they say and do is indistinguishable from a human - like reddit does).
It's the same reason Discord/Telegram kind of work, the same reason small niche subreddits (and Hacker News) are still OK, and the same reason dedicated game servers are so much nicer than public matchmaking in video games... They have a hierarchy of users who are motivated and empowered to police their own reasonably sized niche space.
Twitter can't do this, but maybe Twitter should not exist as it does now.
Reddit the company wouldn't really like this either, as they would rather have mainstream mega-subreddits than collectively smaller niche subs.
I think explicit and transitive trust is a strategy worth exploring. If you trust accounts made by people you know personally, and you transitively trust the people they trust (say... five hops), that's going to be a significant portion of humanity.
If you start getting a little spam, you explicitly revoke trust in the spammers. If you get a lot of spam, revoke trust in whoever you trusted that trusted the spammers, etc. A little bit of social graph hygiene ought to go a long way against spam, so long as we're trusting people and not platforms. We're practiced at one of those and not the other.
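A toy sketch of what that trust expansion and revocation could look like (the graph and names are invented):

```python
# Sketch of the transitive-trust idea above: start from the accounts
# you trust directly and expand outward up to a hop limit. Revoking an
# edge ("social graph hygiene") prunes the whole subtree behind it.

from collections import deque

def trusted_within(graph: dict[str, set[str]], me: str, max_hops: int) -> set[str]:
    """BFS over the trust graph, up to max_hops from `me`."""
    seen = {me}
    frontier = deque([(me, 0)])
    while frontier:
        node, hops = frontier.popleft()
        if hops == max_hops:
            continue
        for peer in graph.get(node, set()):
            if peer not in seen:
                seen.add(peer)
                frontier.append((peer, hops + 1))
    return seen - {me}

trust = {"me": {"alice"}, "alice": {"bob"}, "bob": {"spammer"}}
assert trusted_within(trust, "me", 2) == {"alice", "bob"}

# Revoke trust in the person who vouched for the spammer:
trust["alice"].discard("bob")
assert trusted_within(trust, "me", 5) == {"alice"}
```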
The community owner must configure some challenge exchange between his node and the users publishing to his node. The challenge is completely arbitrary: it can be a whitelist, a password, a super complex interactive game, sending a picture of your passport, account age, minimum karma, a combination of all these things, etc. Since it is arbitrary, it will evolve with time. Any method (already invented or invented in the future) that centralized platforms use to filter AI spam can also be used, but over P2P.
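For illustration, such an arbitrary, owner-configured challenge pipeline could be sketched like this (the callable-based design is my own invention, not plebbit's actual implementation):

```python
# Sketch of arbitrary, composable publication challenges: the owner
# plugs in any predicate over an incoming publication, and can combine
# them freely. All names and data shapes are invented for illustration.

from typing import Callable

Challenge = Callable[[dict], bool]  # takes a publication, says yes/no

def whitelist_challenge(allowed: set[str]) -> Challenge:
    return lambda pub: pub["author"] in allowed

def min_karma_challenge(threshold: int) -> Challenge:
    return lambda pub: pub.get("karma", 0) >= threshold

def all_of(*challenges: Challenge) -> Challenge:
    """Owners can combine challenges arbitrarily."""
    return lambda pub: all(c(pub) for c in challenges)

gate = all_of(whitelist_challenge({"alice", "bob"}), min_karma_challenge(10))

assert gate({"author": "alice", "karma": 50})
assert not gate({"author": "mallory", "karma": 999})
assert not gate({"author": "bob", "karma": 1})
```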
I think a solution to bot accounts could be a referral system. Everyone who signs up needs to be referred by another person. If the new user has a negative impact on the platform, all parent users in the referral chain lose some sort of trust.
Users can set a "trust" threshold and all content they see is filtered through that lens. It's like an automatic twitter block based on an algorithm.
Also people would ask the people they refer to the platform to please not do bad things so they can preserve their "trust" rating.
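A toy sketch of that referral-chain idea (the penalty and decay numbers are arbitrary assumptions, just to show the shape of it):

```python
class ReferralTree:
    def __init__(self):
        self.parent = {}   # user -> who referred them
        self.trust = {}    # user -> trust score

    def signup(self, user, referrer=None):
        self.parent[user] = referrer
        self.trust[user] = 1.0

    def penalize(self, user, penalty=0.5, decay=0.5):
        """Walk up the referral chain, hitting each ancestor
        with a smaller share of the penalty."""
        while user is not None:
            self.trust[user] -= penalty
            penalty *= decay
            user = self.parent[user]
```

Filtering is then just comparing `trust[author]` against each viewer's chosen threshold.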
> What's stopping a determined spammer paying for accounts?
Paid accounts add significant overhead to spamming.
Spammers don't spam just for the sake of it; they want to sell people something, legal or not.
It's one thing to buy millions of spam accounts for $100 on the dark web. It's another to have to spend $5 a month per account on millions of accounts...
When was it ever true that public discussion didn't require moderation? We live alongside a pile of mush brained, mouth breathing dullards whose single and only goal in life is for somebody to vote up their latest brain fart and be told what to think.
Largely we excluded these people from conversation with social cues and sold them back their prejudices via red-top newspapers, which mostly kept their brain farts and prejudices suppressed.
For better or worse, you can't do that with the internet. Facebook and twitter seem hell bent on not only encouraging their appalling behaviour but amplifying it.
The only way to stay away from them is to stay ahead of them and stay away from any medium these idiots think is "their place".
They do vote and they should vote. That to me is quite different from deliberately inflaming them to other actions.
edit: what I mean by that is that Facebook, Twitter, Reddit and 4chan should be held directly responsible for the violence they foster. "We're just a communication mechanism" doesn't cut it anymore. Their promotion algorithms make them very different from a telephone company.
4chan is a healthy community. No karma, no "flagged" posts, and none of the bullshit you see here. HN is objectively inferior in every measurable way, and I've been using both for >10 years.
I am not sure why you went for the new Reddit UX, which is both horrible and hated, instead of the more "simple" text-only UI. This also had the side effect that you managed to do it worse (probably by combining a bunch of React components), and the whole thing is slow and unresponsive.
> Alas, I don't think it's yet ready for primetime.

Is it fixable in a week?
We are 4 devs working full time on it. My estimate is that it should be pretty smooth in around 1 year. We should also have an old.reddit-style interface for people who hate the new one.
I think decentralizing and p2p is the wrong approach to solving the issue with commercialized social networks. What we really need is a non-profit & transparent & independent governance setup. Financing should happen through donations, maybe grants but certainly not ads. A bit like Wikipedia mixed with Mozilla, but even more transparent.
we need centralized communications even more than we need centralized file sharing. while we are at it, centralize the whole internet. heck, centralize and regulate every form of human interaction, digital or otherwise
The site is definitely slow, but not so much that it is intolerable (at least in my experience).
It's an interesting idea, but I think 4chan, as the internet watering hole, has the core audience captured already and has done so for the last 15ish years. I am curious what the marginal value is here.
I think to have any success, you need to let the p2p features take a backseat, and build a lot of centralised content moderation tooling. You can plug the p2p bit into the centralized bit with a config URL for "List of allowed moderators" and have it default to a bunch of trusted people, who effectively have whitelist/blacklist/supervote abilities.
Then you need to populate it with a bunch of good content to start off with.
Hate to be that guy, but why opt for the new reddit look? This is already coming out of the gate on the wrong foot. The UI is slow and has so much unused white space.
Old reddit, while nice and textual, is server-side rendered. If you are dealing with P2P, not fetching everything in one go (and thus suffering some async performance issues) is better than trying to render it all at once.
That's part of the "why of new.reddit"... which makes sense for some aspects of its infrastructure. Yes, I prefer old.reddit because it renders everything at once and the layout is simpler. But if you're waiting on the network more than the CPU, new.reddit style is probably better.
The UI being slow is because loading stuff from cloudflare-ipfs.com is slow. Trying to get it all at once to render 30 cards would be even slower if you're waiting on IPFS.
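The tradeoff being described, as a toy asyncio sketch (the fetch is simulated with a sleep, and all names here are invented):

```python
import asyncio

async def fetch_card(cid):
    # Stand-in for a slow IPFS gateway fetch of one post.
    await asyncio.sleep(0.01)
    return f"card-{cid}"

async def render_feed(cids):
    # new.reddit-style: paint each card as soon as its data arrives,
    # instead of blocking the whole page on the slowest fetch.
    tasks = [asyncio.create_task(fetch_card(c)) for c in cids]
    rendered = []
    for finished in asyncio.as_completed(tasks):
        rendered.append(await finished)
    return rendered
```

With server-side rendering you'd pay max(fetch times) before showing anything; with as-completed rendering the fastest cards appear immediately.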
Old reddit had a compact design that filled the page. New reddit and plebbit both have huge margins on the side, tons of white space between boxes, etc. It's got nothing to do with server-side rendering.
This might be a rather obvious and stupid suggestion, but why don't all the 3rd party apps just get together, agree on a "new reddit" and move to that. If their users are more loyal to the app than to reddit, problem solved.
It just can't be that hard to hack together a nested newsgroup-like system in this day and age.
What does it cost to stand up that server? To establish the core group needed to moderate the site so it doesn't turn into eternal spam or content they'd rather not serve? To pay the lawyers needed to make sure NSFW and copyright takedown requests are handled?
It's not impossible... but the "bunch of people start up a new site" misses a lot of the technical and social aspects of barn raising ( http://meatballwiki.org/wiki/BarnRaising ).
The advantage reddit has as the incumbent backend:
> BarnRaising is part of the difference between SlashDot and wiki. With SlashDot the barn is already raised - the OpeningStatement already written - before you start, and everyone just sits around bitchin' about it.
7 billion requests a month averages to about 2,700/sec, and most of that is reads. I also looked at the requests Apollo makes and it polls the inbox a lot. The only maybe-difficult part is paying for storage. But this isn't large scale at that size, and scaling is solved anyway.
The technical things being described here put a floor on the operational costs of standing up the servers ex nihilo.
Yes, you can stand up a server that can respond to 2,700 req/s and handle 200 GB/day (6 TB/month, which is another few hundred dollars)... and the storage for it.
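Spelling out the back-of-envelope math (the per-TB egress price is a placeholder; real prices vary a lot by host):

```python
requests_per_month = 7_000_000_000
seconds_per_month = 30 * 24 * 3600            # 2,592,000
avg_rps = requests_per_month / seconds_per_month  # ~2,700 req/s average

egress_tb_per_month = 200 * 30 / 1000         # 200 GB/day -> 6 TB/month
price_per_tb = 50                             # placeholder, USD
egress_cost = egress_tb_per_month * price_per_tb  # ~"a few hundred dollars"
```

Note the average hides the peak: a popular site's busiest hour can easily run several times the mean rate.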
The problem with standing this up is the "how do you pay for it in a way that isn't out of the goodness of your own heart" - because part of the reason people are using these is to avoid paying Reddit or see advertisements.
You will also need someone to answer take down requests for whatever legal reason exists. If you do not handle the takedown and leave it to be unmoderated, there are a number of other applications that serve as an example for the community that forms there.
The difficult part is not paying for compute, egress, or storage. The difficult part is figuring out how to get the customers to pay for it and to pay for the staff needed to maintain server and legal availability.
Writing an app that acts as a conduit for data is quite a different thing than being responsible for hosting the service that provides that data.
Fair enough, it's not a trivial undertaking and I suspect reddit knows that as part of their shakedown strategy.
I still feel like the various protests going on indicate the space is absolutely ripe for an alternative and being a bad alternative might just be good enough.
notabug.io was significantly better than this. The code is still on github for anyone who wants to try now that there's another wave of reddit exodus. https://github.com/notabugio/notabug
The fediverse is federated (it's in the name). This is distributed (P2P). They're fundamentally different concepts in the underlying protocol so compatibility is a bit odd to think about here.
There are certainly overlaps in the two concepts and certain parts of them can be compatible to a small extent: a bunch of people have compared Bluesky Social's AT protocol to SSB despite the former being federated & the latter being p2p. But similarities aside they can't really be compatible at protocol level.
Is it true that Lemmy censors casual profanity like "bitch"? I first heard of Lemmy a few days ago, in the context of somebody dismissing it because it allegedly performs such censorship.
It does. This doesn't make the concept doomed, though, being on the Fediverse it's already interoperable with Kbin, a hybrid microblogging and link aggregation platform.
A capable person with different views could fork Lemmy while keeping compatibility, and that has happened many times with other Fediverse software.
The loading times need to be sped up to under 2 seconds, ideally a quarter of a second. It's doable: one thing to do is serve a cached page first.
At the time of writing, the 1st page load is 14 seconds and subsequent page loads are 7 seconds.
This shows why, after years of searching for a good enough peer-to-peer social media protocol, I choose to believe in federation rather than full peer-to-peer decentralization. The content aside, the site is slow asf, and instead of allowing people to host their own "instances" with multiple communities, they host just the communities with IPFS and pay for ENS domains. This makes the barrier to entry for creating new communities way higher.
Each community owner must run a node 24/7 to seed his own community's content via IPFS, similar to how you need to seed a torrent with your client.
There's a GUI client so it's very easy, you just double click and leave it open.
It's also possible for centralized services to run communities for people, so it's technically possible to run a community without having to have any device run 24/7 yourself.
The challenge exchanges between the authors and community owners are arbitrary; they can be anything the community owner can code and that the author client understands, so POW is possible. But I'm not sure it would stop spam properly, since a server can do a lot of POW for cheap. Maybe combined with another challenge.
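For example, a hashcash-style POW challenge could look roughly like this (the difficulty scheme and names are made up for illustration, not plebbit's protocol):

```python
import hashlib
from itertools import count

def solve_pow(challenge: bytes, difficulty: int) -> int:
    # Author side: brute-force a nonce whose hash starts with
    # `difficulty` zero hex digits. Expected cost grows 16x per digit.
    for nonce in count():
        digest = hashlib.sha256(challenge + str(nonce).encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce

def verify_pow(challenge: bytes, nonce: int, difficulty: int) -> bool:
    # Community-owner side: a single hash checks the claimed work.
    digest = hashlib.sha256(challenge + str(nonce).encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

This is exactly where the "a server can do a lot of POW for cheap" objection bites: difficulty high enough to hurt a spam farm also hurts phones, which is why it probably needs combining with another challenge.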
Terrible. It's slow, UI is jaggy, everything feels so clunky. I didn't know it's possible to make it worse than current reddit. Apparently I was wrong.
While impressive, I don't think IPFS can scale to any kind of real load; even a fifth of reddit's users using this would be too much. It's a super heavy daemon, and if it's just gateway users, there is still a little bit of centralization going on. Like, who runs "pubsubprovider.xyz", which most of the data is being relayed through? Same with "plebpubsub.live".
The idea is that if there are eventually 100 providers, and only 1 of them needs to not censor you for your content to get through, then it's unlikely you will be censored.
We plan on adding IPFS WebRTC and WebTransport at some point, to reduce the need for and load on the IPFS gateways, plus whatever else comes out that can help the browser/mobile clients. For example, https://iroh.computer/ seems promising.
The desktop client is fully P2P: it bundles an IPFS node, and the only requests it makes are through the IPFS node. (Technically it makes one request to GitHub for default communities, but we will remove that at some point.) Also, if your community or username uses a human-readable name (optional), then you need to resolve that name somehow; we can technically support any name system anyone makes, and that might or might not be P2P.
That's no excuse for posting bad comments yourself.
A big internet forum like HN gets tons of crap comments. Most eventually get moderated, but not all, and even the ones that do get moderated stay visible for a while before that happens.
Decentralized does not mean there's no moderation.
4chan is centralized but the content can be...
We don't need big companies to help us do quality moderation. Quite the opposite.
1 - https://api.reddiw.com