It seems that the past 6 years or so saw most big ISPs dropping USENET support, mostly citing piracy concerns. Was it piracy, or the fact that it's tough for the government to control what people say on USENET?
Old usenet-head here (on it regularly from 1991, first met it in 1986) ...
First problem: there's no identity authentication mechanism in NNTP. So spam is a problem, forged moderation headers are a problem, general abuse is a problem. (A modern syndicated forum system with OAuth or some successor model would be a lot easier to ride herd on.)
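(To make that concrete: an article is just RFC 1036 headers over plain text, and NNTP verifies none of them. A hypothetical sketch of the moderation hole, with made-up addresses:)

```
From: somebody@forged.example         <- unverified; anyone can claim any address
Newsgroups: comp.lang.misc.moderated
Subject: totally legitimate post
Approved: moderator@real.example      <- also unverified; including this header
Message-ID: <fake123@forged.example>     is all it takes to bypass moderation
```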
Second problem: storage demands expand faster than the user base. Because it's a flood-fill store-and-forward system, each server node tries to replicate the entire feed. Consequently news admins tended to put a short expiry on posts in binary groups so they'd be deleted fairly promptly ... but if you do that, the lusers can't find what they're looking for so they ask their friends to repost the bloody things, ad nauseam.
Third problem: etiquette. Yeah, yeah, I am coming over all elitist here, but the original usenet mindset was exactly that. These days we're used to being overrun by everyone who can use a point-and-drool interface on their phone to look at Facebook, but back in September 1993 it was a real shock to the system when usenet was suddenly gatewayed onto AOL, I can tell you. Previously usenet more or less got along because the users were university staff and students (who could be held accountable to some extent) and computer industry folks. Thereafter, well, a lot of the worse aspects of 4chan and Reddit were pioneered on usenet. (Want to know why folks hero-worshipped Larry Wall before he wrote Perl? Because he wrote this thing called rn(1). Which had killfiles.) Anyway, a side-effect of this was that when web browsers began to show up, the response was to double down on the high-powered curses-based or pure command-line clients rather than to try to figure out how to put an easy-to-use interface on top of a news spool. Upshot: usenet clients remained rooted in the early 1990s at best.
These days much of the functionality of usenet (minus the binaries) is provided by Reddit. Usenet itself turned into a half-assed space-hogging brain dead file sharing network. And we know what ISPs think of space-hogging half-assed stuff that doesn't make them money and risks getting them sued.
> First problem: there's no identity authentication mechanism in NNTP.
Yup. Which is weird, since every message is sent by someone at a host; it should have been possible to simply use signatures to prove which site generated a message—and punish sites which didn't police their users. But crypto was hard (and illegal to export, once upon a time).
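(A minimal sketch of that site-signature idea in today's terms, using Ed25519 from the pyca/cryptography library; the X-Site-Signature header and the workflow are my invention, not anything NNTP ever specified:)

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

site_key = Ed25519PrivateKey.generate()   # held by the originating news server

article = b"From: user@example.edu\r\nNewsgroups: comp.misc\r\n\r\nHello, net."
signature = site_key.sign(article)        # attached as, say, X-Site-Signature

# Peers verify against the site's published public key; a site that won't
# police its users simply gets its key dropped from the trust list.
try:
    site_key.public_key().verify(signature, article)
except InvalidSignature:
    print("forged or tampered; refuse the article")
```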
> Because it's a flood-fill store-and-forward system, each server node tries to replicate the entire feed.
The upshot of this is that reading news was fast. So fast that folks these days can't believe what a user-friendly experience reading from a local news feed can be. Imagine reading a website where pages come up in milliseconds—that's Usenet on a local feed.
> Consequently news admins tended to put a short expiry on posts in binary groups
Frankly, binaries were Usenet's downfall. Had those been eliminated, I suspect Usenet would be a lot healthier today. I couldn't get numbers on the size of a daily text-only feed nowadays, but I imagine it's pretty manageable.
> Upshot: usenet clients remained rooted in the early 1990s at best.
Back in the 90s I used a pretty nice Mac GUI client.
But really, it doesn't get better than gnus…
Anyway, Usenet's not dead—it's still alive, people are still posting and some groups are doing pretty well.
The world could use a new Usenet, with the lessons learned from the first one: site-to-site; non-commercial; anti-spam measures built in.
Imagine reading a website where pages come up in milliseconds—that's Usenet on a local feed.
Now that you mention it, I don't think the web has ever matched the speed Usenet had ages ago.
I propose the following law:
"The size of web pages must expand so that, regardless of CPU and bandwidth improvements, a page will load slower than a Usenet post on a 9.6kbps modem in 1993."
Dunno if you've tried a gopher client recently, but it's shocking how much faster than the web it is... (though it still has nothing on a local mailbox or news spool).
I ran a Freenix-competitive Usenet server for a popular ISP in the mid-1990s (by way of bona fides: we were competitive because I hacked a history lookup cache into INN, a concept we apparently co-invented alongside Netcom). Usenet was by far our most expensive and most time-consuming infrastructure.
The reason for that was binaries. The amount of storage we were required to keep online for binaries was staggering. We ended up buying those ridiculous chrome-plated NetApp fileservers to handle the load. The hardware was expensive, but more expensive was the admin overhead: things went wrong with the INN filesystems regularly, and there was nothing you could do to recover from them quickly; simple filesystem errors that were really just an fsck away from repair could mean 4-6 hours of downtime. Which, by the way, tended to happen at night.
File-sharers sprayed multiple copies of huge files across several newsgroups, in little chunks. If any of those chunks went missing, our users screamed bloody murder. ISPs that tried to host no-binaries Usenet became the target of PR campaigns. Hosting discussion groups on Usenet was cheap and easy. Running a competitive full-feed server, on the other hand, required nearly full-time attention from an admin that could do light filesystem hacking.
The result was a death-spiral: as Usenet got more expensive to host, fewer ISPs hosted it; many outsourced to other providers. They could easily have hosted just the discussion groups! Usenet could have been kept alive and decentralized, and maybe even evolved alongside the web. Instead, software and pornography pirates coerced the network into a few centralized providers, who eventually decided not to waste huge amounts of money hosting infrastructure for those kinds of users.
At least in Germany the death had other causes. And I suspect these causes were also major causes in the US.
Here, almost no servers carried binaries to begin with. Usenet was seen as a text-only medium, because binaries were so huge and mostly illegal.
And it thrived for a time. There were certain niches that I still miss. Legal discussions were fantastic, with many lawyers, prosecutors and judges participating.
What killed it was that there simply wasn't fresh blood. No mindshare, no articles in computer magazines about Usenet, strictly (and sometimes viciously) defended netiquette.
People liked web forums, and I can see why. Threaded discussions seem tedious. No special software.
I even found out that my Usenet-influenced quoting style, answering point by point and paragraph by paragraph, is seen by some people who were never socialized in that atmosphere as "hostile". "You're trying to attack every one of my sentences".
What killed it was that there simply wasn't fresh blood.
Looking back from today, there were also no "analytics" and no invasive pingback tracking every time you open a message (and no monetization). Why bother running things if you can't spy on people, analyze their aggregate behavior, then monetize it?
Plus, the rise of HTML-as-email meant half of a group's traffic could be people saying "PLEASE TURN OFF HTML MESSAGES" every time a new person tries to say something. It ended up being counterproductive (as counterproductive as trying to have a meaningful discussion on any front page Reddit article).
Plus, the rise of Outlook defaulting to top-reply for everybody meant entire generations of new Internet and email users had no idea how to properly reply to messages. Now we can't get rid of morons who think top replies are acceptable (sure, let's have a 6 month long email thread with N^2 copies of the message in my email box because every new reply contains every previous reply for no sane reason).
For better or worse (worse), Reddit is the new usenet but with a much narrower reach and full centralization of all control mechanisms, which deteriorates into tiny, but oh-so-loud, drama every few months.
Every important Usenet server that people used in the US carried binaries. By way of example: you had to, to be ranked. I believe that Germany may have had different dynamics from those of the US, but I do not believe that there was a more important cause of the death of Usenet in the US than binaries.
I think I like web message boards more than Usenet too, but they're only now coming into their own.
Do you have any recommendations for message boards / forums that have a strong community and quality discourse? (The efforts of dang alone disqualify HN from being a representative message board.)
I think I like web message boards more than Usenet too, but they're only now coming into their own.
In what way? From my perspective, web message boards have been steadily eroding. The signal to noise ratio of most discussions has gotten out of control.
For certain topics, mailing lists seem to be where the high S/N ratio currently is, although in some cases that's been true since the '90s. Almost all free-software projects that once used a newsgroup as their primary public discussion venue now have a mailing list, for example. And some professional specialties and academic disciplines have a widely subscribed mailing list for people in the field. The downside there is that it can sometimes be hard to figure out where exactly people in a field talk, since there's no central, browsable index. Sometimes it's a professional society like one of the ACM SIGs running the list, sometimes it's a university, sometimes a nonprofit or a big conference, and sometimes just some random listserv that accreted enough subscribers to become a de facto hub.
> (Want to know why folks hero-worshipped Larry Wall before he wrote Perl? Because he wrote this thing called rn(1). Which had killfiles.)
Perl came into being in 1987, which is well before September 1993, so his creation of rn must have had traction/worship before the "Eternal September." I'd hardly say that said event was the reason that people worshipped him for rn, especially since Perl was out for ~6 years at that point.
We needed killfiles well before September began. (Go read the net.legends FAQ.) And Perl wasn't a great language before 4, in 1991, and didn't attain anywhere near peak popularity until 5.
There were a few web interfaces, but apart from the hatred the old Usenet users had for web users, there were technical difficulties that were never overcome.
The biggest one was probably charsets. Back then it wasn't really possible to detect the proper charset for what the user entered into your text field, as far as I was told.
But it was only so bad because users were unreasonable:
de.*, for example, really frowned upon UTF-8 and demanded Latin-1 or Latin-15 (i.e. ISO-8859-15). Minimal coding, of course, so if you used the Euro sign you had to use Latin-15; if there was no Euro sign in your posting, you had to use Latin-1.
And there were even extremists who declared everything apart from 7 bit ASCII illegal...
> There were a few web interfaces, but apart from the hatred the old Usenet users had for web users, there were technical difficulties that were never overcome.
To expand upon this a bit: it wasn't all just knee-jerk elitism. Some web clients were terrible and did terrible things.
WebTV in particular would add animated gif backgrounds and MIDI music. Since Usenet was a text medium those messages would have the gif and midi encoded as text (UUENCODEd?) and added to the post, with a bunch of raw HTML as well. Other users got huge amounts of HTML and uuencoded "stuff" or they got animated backgrounds and MIDI music - either one was pretty awful.
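(A quick look at the bloat involved, using the old stdlib uu module, which shipped with Python through 3.12; the "attachment" here is fake:)

```python
import io
import uu

binary = io.BytesIO(bytes(range(256)) * 16)   # a pretend 4 KB MIDI file
text = io.BytesIO()
uu.encode(binary, text, name="background.mid")

encoded = text.getvalue()
print(encoded[:24])            # b'begin 666 background.mid'
print(len(encoded) / 4096)     # ~1.38: every binary rode the text-only
                               # wire with roughly 40% overhead
```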
As alluded to later in the replies here, no discussion website ever tried to be fully decentralized like NNTP was from inception.
It is a super hard problem, and as the declining popularity of NNTP itself illustrates (and the tendency toward mega-centralization on the web like Facebook, too) maybe it was solving the wrong problem for today.
I think the modern centralization tendency is a bad thing.
If I was going to redesign usenet today ...?
1. You want non-repudiability in transactions: a given user owns their posts or cancel messages and can be linked to them. So, use a blockchain-based solution for authentication. (Not bitcoin; a separate non-financialized system.)
2. Instead of local spools consisting of a directory tree and sync'd via flood-fill, implement it as a virtual filesystem, peer-to-peer discovery, locally requested contents are cached, so high-traffic newsgroups will be cached more widely and therefore be faster to load.
3. Anti-spam: a big problem is preventing spammers sprouting new sock-puppet identities to get around already-imposed blocks. Hence non-repudiability. I'm not sure how to go about enforcing limits on sock-puppeting: this is a hard problem. But if the authentication mechanism is blockchain based we might be able to link it to an underlying funding system and thereby impose a monetary cost on posting -- even $0.01 per message should be enough to put a drag-brake on spamming. This was always the weakness of usenet: spammers externalized all their costs.
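A non-monetary variant of that drag-brake in (3) is a hashcash-style proof-of-work stamp: the poster burns CPU per message, while servers verify for free and drop messages without a valid stamp. A toy sketch (the difficulty parameter is invented):

```python
import hashlib
from itertools import count

DIFFICULTY = 20  # required leading zero bits; tune to make spam uneconomical

def mint(message_id: str) -> int:
    """Search for a nonce whose hash clears the difficulty bar (costly)."""
    for nonce in count():
        digest = hashlib.sha256(f"{message_id}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0:
            return nonce

def verify(message_id: str, nonce: int) -> bool:
    """One hash; effectively free for the receiving server."""
    digest = hashlib.sha256(f"{message_id}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0

stamp = mint("<abc123@example.net>")      # noticeable CPU for the sender
assert verify("<abc123@example.net>", stamp)
```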
There doesn't need to be an "interface" for NNTP and especially not a web based one. The NNTP client in the old Mozilla products from the '90s was just fine and threaded messages well.
It's more than that. The design of NNTP is much like POP3 - you download entire threads and messages all at once. It doesn't work all that well on the web compared with say IMAP.
I especially agree with your third point. Any kid getting his first Windows machine was now a computer expert and ran to the (then) only knowledgeable place to get more info, and that was usenet. That turned usenet into reddit without any controls at all. Reddit, at least, does some minimal clean up.
(Background: I'm a Usenet user from the late 80's to the early noughties; did outsourcing at netaxs as newsread.com, then ran readnews.com from 2004-2014).
Usenet is still around but mostly for binaries. The market's pretty stable in size, dominated by a few large wholesale players.
My take on what happened with text groups is that the S/N ratio just went to hell. In the 90s the problem was spam, but in the 2000s the problem was too many loudmouths who wanted to hear themselves talk drowning out the useful experts.
Like some of the other folks commenting, I've been pissed as hell at the phpBB/vBulletin monstrosities. My original plan with readnews was to try to build a great web UI for discussion, but we got distracted by wholesale customers wanting service - and front-end is not my area of expertise.
For folks looking for something modern with promise, the news is good, with Discourse and a few others coming up. Would love to see something distributed, but if really distributed I suspect we'd see binaries and/or commercial spam and/or people with nothing interesting to say dominate - just like Usenet...
We'll loop detect to end the mutual admiration society, but thanks from you makes my day :) I just tweeted a pointer to "Doing Terrible Things" and suggested people read all of the "Falsehoods Programmers Believe". I want "Falsehoods as a Service" - outsourced paranoia.
And yes, distributed discussions that preserve free speech and allow eliminating crap is definitely moon-shot hard.
distributed discussions that preserve free speech and allow eliminating crap is definitely moon-shot hard.
The way you just defined the problem, it's not moon shot hard; it's a logical impossibility. "Free speech" and "eliminating crap" are complete opposites. You can't have one with the other. If you want a high quality of discussions, some people are going to have to be deprived of their voice to make that happen, either through moderation or selective membership.
What Usenet did well was that it was completely decentralised, had zero cost of engagement (despite 'hundreds, if not thousands of dollars'), and was everywhere.
What Usenet did badly was that there was a complete absence of identity management or access controls, which meant no accountability, which meant widespread abuse; and no intelligence about transmitting messages, which meant that every server had to have a copy of the entire distributed database, which meant it wouldn't scale.
It's a tough problem. You need some way to propagate good messages while penalising bad messages in an environment where you cannot algorithmically determine what good or bad is, or have a single unified view of all messages, all users, or even all servers. And how do you deal with bad actor servers? You know that somewhere, there's a Santor and Ciegel who are trying to game the system in order to spam everyone with the next Green Card Lottery...
I think reddit is the reinvention of USENET. It is mod-heavy and has enough critical mass of users to provide excellent results from its upvoting system. And many subreddits are extremely well maintained with a very high signal to noise ratio.
It even has its equivalent of alt.binaries.pics.* if one is so inclined.
Sort of; voting rearranging the chronological stream of conversation makes it significantly different, IMO. There is also the phenomenon of the funniest image tending to win the votes. (Barring excellent moderation, but Reddit does very little to make moderation easy, or even to set goals for moderation.)
That's not great for discussion, but then Reddit was always designed as more of a system of briefly commenting on URLs than actual discussion.
This is only true for the front page and for things like /r/AdviceAnimals. The front page is in and of itself a separate phenomenon than the rest of the subreddits, in my opinion.
For many subreddits, there are truly fantastic discussions that are very relevant to the subreddit topic. /r/askHistorians or /r/askScience, for example, have an extremely high signal to noise ratio.
A few years ago I created http://www.newswebreader.com (still functioning), a web frontend for USENET. It has an NNTP server in the background connected to other NNTP servers, and it displays groups, headers and posts similar to three-pane Thunderbird.
You can create an account and subscribe to groups, and it remembers which messages you've read.
The idea, in the end, was to make a frontend to USENET that would look like Stack Overflow, with voting, where your replies would propagate back to USENET.
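(For the curious, the server side of a frontend like this is small. Roughly what it does per page view, sketched with the stdlib's nntplib, which shipped through Python 3.12; the hostname is a placeholder:)

```python
from nntplib import NNTP

with NNTP("news.example.net") as news:
    _, count, first, last, name = news.group("comp.lang.python")
    # Overview data for the newest 20 articles: the "middle pane".
    _, overviews = news.over((last - 19, last))
    for artnum, over in overviews:
        print(artnum, over["subject"], "--", over["from"])
```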
Already done, it's called reddit. And the main problem with Usenet was its replication architecture and not its identity/authentication.
reddit doesn't have any identity system in place and it has hundreds of millions of users.
reddit improved on Usenet by adding voting, which is something that at least one Usenet client tried to implement (gnus) but which should have been implemented in the architecture itself.
You don't have to penalize bad messages. Just don't link to them. Curation and moderation seem to be higher level problems that don't need to be specifically addressed by underlying storage/transport layers.
ipfs[1] is an interesting project that could be used to develop applications in this area.
I clicked your link, then tried a few variations of it. Didn't work; it's online now. That means they can't do HA and rolling updates on the cheap, despite all the software/hardware available to do so. They can't keep a site reliably up, but they'll pull off:
"This forms a generalized Merkle DAG, a data structure upon which one can build versioned file systems, blockchains, and even a Permanent Web. IPFS combines a distributed hashtable, an incentivized block exchange, and a self-certifying namespace. IPFS has no single point of failure, and nodes do not need to trust each other."
Wouldn't rely on it for production. I'll go back later and check it out for curiosity, though.
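(Aside: the "Merkle DAG" in that blurb is less exotic than it sounds. Every node is addressed by the hash of its own content plus the addresses of its children, so any change anywhere produces a new root address. A toy version:)

```python
import hashlib

def node_id(data: bytes, links: list[str]) -> str:
    h = hashlib.sha256()
    h.update(data)
    for link in links:       # children's addresses are part of the content
        h.update(link.encode())
    return h.hexdigest()

leaf = node_id(b"article body", [])
root = node_id(b"thread index", [leaf])
tampered = node_id(b"thread index", [node_id(b"edited body", [])])
assert root != tampered      # any edit below is detectable from the root
```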
See, the thing is, something.com and www.something.com are different DNS records.
As far as dns is concerned, there is nothing special about www. You could say bob.something.com.
There is a cultural expectation (mostly from people who started using the internet after the late '90s) that www.something.com goes to the same place as something.com, but as far as DNS is concerned, the two are completely different records.
(in the late '90s, one of my tasks at my first programming job was to write a patch to mod_vhost_alias to implement company policy, e.g. to make www.ourcustomer.com go to the same place as ourcustomer.com. The patch was required because www.ourcustomer.co.uk also needed to go to the same place as ourcustomer.co.uk, so I couldn't just take the rightmost three chunks)
The upshot is that people who have been around longer, and who like to be curmudgeonly about it will often configure www.mydomain.com and mydomain.com to go to different places, because they are different records. (of course, some would say that this is so they have a chance to explain this, and a chance to feel superior to those who need this explained.)
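(In zone-file terms the convention is two explicit records; nothing makes the second one automatic. A minimal sketch, with placeholder names and a documentation-range address:)

```
example.com.      3600  IN  A      192.0.2.10
www.example.com.  3600  IN  CNAME  example.com.
```

Leave the CNAME out, or point it elsewhere, and www simply resolves differently from the apex, which is exactly the behavior being discussed.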
I get that this is how it works, and maybe for some reason people like to treat www as any other subdomain and send it somewhere else - but is there any reason, beyond simply not configuring DNS, to just blackhole www traffic like that site does?
If you are actually trying to understand this phenomenon, I suggest checking out the Silicon Valley LUG webpage. It's at http://www.svlug.org - http://svlug.org now has a 'hey stupid' note that redirects after a few seconds. This page was put up after a lot of moaning from some of the older LUG members who thought that the normal user expectation that www.something.com and something.com would go to the same place was, well, stupid, and a sign of the sort of person we don't really want or need to communicate with.
Of course, this is the opposite of the bit people in this thread were complaining about. http://www.svlug.org has always been live, it's http://svlug.org that was dark until the youngsters complained.
A well-understood phenomenon in DNS that most admins take care of to ensure users with a reasonable expectation end up in the right place. Further, a common case an admin should account for. Then there are these admins and their apologists...
And another try has root working but not www. Need I say more lol?
I get that they're different records, www just seems like the one subdomain that everyone expects to be synonymous with the root. Even if that isn't where you want your resources to "live," it's an essentially free way to help people get to your site -- like googel.com redirecting to google, except, as you said, with like 15 years of ingrained user training.
Defining the goals is a key aspect. If re-invention is what we desire, then I would like to take a shot at outlining the positive aspects of usenet, as well as the negatives.
Positive:
* Anonymity possible (to an extent)
* Moderation possible (to an extent)
* Caching of desired content at the network edge
* Binary data (though obviously no more uuencode/yEnc/etc)
* Libre (as in freedom of speech)
* Free (as in beer)
* Useful, if probably illegal, content
* Distributed
The negatives:
* Impersonation/other false claim to identity.
* Spam
* Illegal content (to whom? how to identify? intractable)
* Flame wars
* Difficulty of setting up a 'feed'
I'd like to take a small stab at these various problems.
For identification I would specify the use of public key cryptography; it's the only decentralized option I know of. OpenPGP with some extensions (i.e., Ed25519 signing keys) seems to be the obvious choice.
With identification in place, spam filtering can also be addressed. Have users 'file' copies of messages into several training bins via flags. Flags would be ternary-state entities (true/false/null): Liked, On Topic, 'harmful content' (the catch-all would be used in a design sense to include any type of illegal content; however, for some groups that content /is/ the signal; this is meant to inform users so they can choose, not to be a nanny for them).
The above tagging would allow for aggregation to determine the 'health' of a data-pool, as well as how useful it was to the user base of a given server.
Data pools would, in themselves, be another type of tag. The built in base tags defined above would be the only 'required' ones, but a firehose of all data is crazy. Thus tags (similar to keywords) would also be attached. Advanced users (any that provide 'detailed' feedback) could 'vote' on the accuracy of applied tags including the base tags (which would be inferred as necessarily existing).
Base tags become 'groups' in this distributed database.
Critically, servers aggregate and thus anonymize the tag weighting of their own userbase (even from their own userbase).
Every tag-sync period, an enumeration of all non-default tags (and their yes/no vote counts) would be computed and the result published.
Also published would be a list of the other 'servers' which this current server is aware of. SOME of these would be replication servers (which would have a non-zero weight that isn't required to be published), while others are just the servers known by other servers. Each entry would have an age; this would be the last time the tag stats of that remote server were successfully polled (thus low entries are likely to be replication sources, BUT might be 'validation' of other servers as obfuscation).
Servers might only share post contents with authorized connections. Anyone able to do so would be able to source the other server and therefore replicate the tagged data that it chooses to cache. The other server may require something like providing account data for it to sync your server's userbase stats to it. Comparing the relative accuracy of stats would enable it to determine whether your userbase is real, as well as how your userbase votes on things its own userbase does not. This would be the reason that (semi-anonymous) peering even between not-like-sized servers would be permitted, particularly if your own server is frugal and normally doesn't download things that haven't been voted on.
Obviously server to server communication would involve the automated use of signing keys /for the server/.
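A rough sketch of the ternary flags and the per-period census described above, in Python; every name here is illustrative, not a spec:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class TagVotes:
    yes: int = 0
    no: int = 0              # the third state, null, simply casts no vote

messages = defaultdict(lambda: defaultdict(TagVotes))

def vote(msg_id: str, tag: str, value: bool | None) -> None:
    if value is None:        # ternary: "no opinion" is not a vote
        return
    tally = messages[msg_id][tag]
    if value:
        tally.yes += 1
    else:
        tally.no += 1

vote("<m1@host>", "on-topic", True)
vote("<m1@host>", "harmful-content", None)

# The per-sync-period census a server would publish, aggregated so that
# no individual user's votes are recoverable from it:
census = defaultdict(TagVotes)
for tags in messages.values():
    for tag, tally in tags.items():
        census[tag].yes += tally.yes
        census[tag].no += tally.no
```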
Binary groups were huge and users expected them for free. And users would download huge amounts of stuff. So it was pretty much a cost sink, and ISPs who tried to start charging (for this service that had dramatically increased costs) were faced with vigorous campaigns. At some point it's easier to just cancel and tell dissatisfied customers to get a new ISP if they're unhappy.
The number of groups distributing images of child sexual abuse created some risk (not every ISP is in the US), and things like stealth binary groups distributing porn put a bunch of people in oppressive regimes in tricky situations.
ISPs could have dropped binaries and only carried text groups. But this meant putting up with groups of people with strongly held but conflicting opinions:
1) be a dumb pipe and provide everything
2) be a dumb pipe but filter spam with a Breidbart index of something or other (see the sketch after this list).
3) make the news server operate according to rules laid out in the ISP's ToS. (Young people may not realise it, but a lot of effort on the early Internet was spent on "what do we do if our users go on the Internet and start swearing?" Many ISPs had rules forbidding swearing, at least in Europe.)
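The Breidbart Index in (2) is worth spelling out, because it shows how mechanical "dumb pipe" spam filtering could be: sum, over all copies of a message, the square root of the number of groups each copy was crossposted to. By convention, a BI of 20 or more within 45 days made the message cancellable spam.

```python
from math import sqrt

def breidbart_index(crosspost_counts: list[int]) -> float:
    """crosspost_counts[i] = number of groups copy i was posted to."""
    return sum(sqrt(n) for n in crosspost_counts)

print(breidbart_index([9]))      # one copy crossposted to 9 groups: BI = 3.0
print(breidbart_index([1] * 9))  # nine separately posted copies: BI = 9.0
```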
Then www forums sprang up and they had some advantages: avatars, mods, etc.
I don't recognize the bit about "dumb pipes". News is a service, not a pipe. Very few servers carry all groups. Some didn't even carry the alt groups; that wasn't fun.
The ISPs I worked at in the late 90s did not carry the binaries groups, because of the outrageous storage requirements. I think that was quite common.
No provider would carry all groups, and when you looked at particular groups most providers would not carry all posts.
But you still had some people saying that providers should do zero filtering at all, ever, and that doing so was evil censorship of the worst North Korean kind. Filtering was strictly something for users to do.
That's impossible for providers to do when people are using groups to distribute images of child sexual abuse.
Many customers were happy that sporge was filtered. Most customers wanted some kind of spam filtering, even using the very tight definition of Breidbart Index.
So there were conflicts in the userbase, which got pushed onto ISP support. Since ISPs were already paying for huge storage requirements I can totally understand this being part of the consideration to cut usenet.
Not carrying all groups is not the same as filtering. I'm sure the reasoning you describe exists, but it wasn't a very mainstream view. As a tiny data point, none of the ISPs I worked at saw this as a support problem.
These have been delegated up to the forum's moderators. People much prefer the work of filtering be done by someone else. Client-side killfiling never really solved the problem of killing half a discussion.
This all more or less kind of happened eventually, but the parts that were early did not go the way you state, and the parts that did go that way were much, much much later and did not directly influence the overall fate of usenet.
All in all, no, you're only talking about some side stuff that only clouds the history.
Speaking as someone who was an extremely heavy user of Usenet back in the day, it seems pretty clear that spammers overwhelmed it well before it became primarily used for binaries, and most people were happy to turn to web forums once they were available.
(Edit: moderated Usenet groups appeared fairly early, but what I recall is that spammers overwhelmed the moderators, even there.)
Usenet never completely died btw, so that's not a fair description; there are people using it as a forum to this day, in nontrivial absolute numbers, but of course small numbers relatively, compared with web forums.
I skimmed the comments here, and never saw the real answer (to what I understand the question to be). Even though it was public knowledge, I had some extra insight from working for a large Usenet provider.
The New York Attorney General started a campaign against child porn groups on Usenet. In the end, his office identified a small number of groups they said were used for child porn -- I think it was fewer than 100 groups. Many ISPs jumped on the opportunity to stop paying for Usenet service.
In the '90s it was just assumed that an ISP's service would include Usenet. With the growth of binaries groups, the quality of service declined. I remember retention would be a day or two, with about 50% completion. So, for most ISPs, the service was unusable, and only a small number of subscribers knew or cared about it. The others paid quite a bit for service from a third party, like my employer. I don't know why they didn't shut down service earlier, but once the NYAG campaign started, they could cancel Usenet, saving themselves money and getting good press for fighting child porn.
I found USENET and associated newsgroups to be better than the WWW, especially for discussions of software. I once even promoted the use of internal newsgroups within a corporate environment, where a history of topics (discussions, problems, and decisions) would IMO have proven extremely useful.
But the idea never got traction: people were unwilling to participate because newsreaders were too different from the browser and they'd had enough trouble learning to navigate the WWW. Once blogs and browser-based "newsgroups" and forums began showing up, the handwriting was on the wall. In the end, the WWW browser's low bar to entry ate USENET.
I still value the treasure trove of information stored in the archives. And some people still actively participate in USENET and other newsgroups, just as some still participate in IRC (Internet Relay Chat, which also is fading). I think these are valuable tools with a lot of greybeard expertise held in reserve.
There's a sort of Gresham's law of the Internet: "The browser drives out every other interface."
I have to mention that the D programming language has forums that can be accessed both from a newsreader and from a web browser[1]. It is coded in a framework called Vibe.d[2] that speaks both the NNTP protocol and HTTP[3], which I think is fascinating. My only complaint is that, with USENET having mostly died out for discussion and now being used mainly for piracy (do a bit of research and it's still about as active as torrenting, just a lot more automated; I used to use USENET, but I'd rather stick to legal alternatives and avoid the paranoia), the old clients, while they still work, could use some touching up, which probably won't happen.
I enjoy the idea that if all discussions in a support forum are on the NNTP protocol, I can archive them all, so I hardly have to open up a browser to search through years (decades?) of threads to see if anyone else has had the same issues as me. Imagine something like Stack Overflow all of a sudden at your fingertips without any internet access. It's a really nice thing; sometimes the internet just dies on you when you need it most.
As for IRC, people are willing to use it if you put something useful on there (support for a project, or a community that people are interested in). If you want adoption from users who are just browsing the internet, maybe a web client / desktop client combination that makes IRC a lot more seamless to the average "I don't know" type of user.
Yes, a web forum with an NNTP gateway would be very nice. I remember vBulletin had some functionality like this, and I hope Discourse will implement it.
NNTP is very nice for archiving, and it's distributed. Maybe we should do Twitter and discussions over blockchains... when NNTP servers go out of vogue. There is a Stack Overflow dump for local use, by the way.
On the subject of IRC: it wasn't fading all that strongly, IMO, at least for open-source and free software discussions. Then Slack came along. Startling to see a proprietary clone of IRC (albeit with some nice extra features, namely history) come along and start taking over. See, e.g., the Clojurians Slack community.
It's also got basic wiki type stuff, and some other features.
IRC is like USENET: It has barely evolved for decades, nobody can profit by improving the protocol as it's a commons, so it just stagnates and dies out.
Eventually all the decentralised protocols that were born in the early days of the internet will be gone.
The IRC protocol is in a different situation to USENET, because the former has many different independent, individually federated networks of servers, whereas the latter has just one.
This means that server-side innovations can and do happen - they just need to respect the basic server-client protocol. Often the newer features are delivered through "services", which means they're in-band signalled.
Not entirely. Team control is a major part of it for companies, and history makes a huge difference to the experience. Slack servers allow the use of IRC clients, but I gave up on Colloquy almost immediately after seeing the benefits of history. For the right teams, the IRC+history combo can almost completely eliminate email use.
Also, Slack bought a company which did voice, video, and screen sharing. Since join.me went downhill, this will be a welcome addition to Slack.
USENET has always been used for porn and piracy, since at least the early 90s. Of course, most of the great newsgroups were discussions-based, but probably most of the bandwidth was porn and piracy.
When I was in college, I remember someone on my floor had written a program in Pascal that automatically downloaded porn off USENET. He would leave his computer running all the time, connected to the college's internet connection via modem, and we would occasionally see a flash of a porn pic on his screen and ask "What was that?". This was before the days of integrated TCP/IP stacks in the OS, so if I remember correctly he had to dial in via modem and then use something called Slurp or something like that, I can't remember exactly now.
This continued all through the 90s. A bunch of my friends had Airnews accounts and downloaded mp3s and porn 24x7, during what we called the "Golden Age" of piracy, when Napster was starting up in 97 up until the early 2000s, when the bust hit.
At some point, the medium for discussions moved off of USENET and went to more user friendly places like email mailing lists, google groups, yahoo groups, reddit, etc. This left only piracy and porn on USENET, and I'm actually surprised that some ISPs still support USENET at this point.
It felt like Usenet died as a meaningful place for discussion in the mid-to-late '90s, for all the same reasons that most (or all?) electronic communities eventually die. Bad posters drive away good posters and encourage even worse posters, which eventually results in something akin to YouTube. Forum entropy for lack of a better term.
By the time most ISPs started dropping it, a vanishingly small percentage of most ISPs' users even knew what it was, and the binaries groups had turned it into a source of both cost and legal risk. The heavy users were people who incurred that cost and risk to the ISPs because they were using it for pirating software and porn. The icing on the cake would've been the fact that it's a terribly inefficient way to distribute those things and the ISPs have to store all that stuff locally on servers they own.
From an ISP's perspective, maintaining Usenet feeds became all downside and no upside.
Regarding government control, I would think that Usenet would've been far easier to monitor and censor than the web.
> Bad posters drive away good posters and encourage even worse posters, which eventually results in something akin to YouTube. Forum entropy for lack of a better term.
I've heard it labeled "evaporative cooling", per [0].
A little while back I went back to some of the newsgroups that I was a regular on, back in the 2000s. It was... a bit disturbing. I recognised a lot of the names, but the people and attitudes behind them had changed a lot: insular, cliquey, a bit xenophobic, a bit crazy.
I've seen the same attitude before in tiny SIGs: go take a look at what's left of the classic 16-bit micro (or early 32-bit micro) userbases and you'll see a lot of it. But it was very sad to see it happen to people I knew and used to interact with on a daily basis.
I noticed that too, my hypothesis is one must be a little off to stay in those small, shrinking echo chamber communities for decades. In the instance I'm thinking of, some of the members did indeed have mental health issues.
> It seems that the past 6 years or so saw most big ISPs dropping USENET support, mostly citing piracy concerns. Was it piracy, or the fact that it's tough for the government to control what people say on USENET?
No conspiracy theories needed here.
Copyright infringement is one angle; the other is that it costs ISPs a huge amount of resources for something few people use.
Once upon a time, a single server could easily mirror all of USENET for all users of an ISP, and almost every user expected it, so they'd treat it as an essential part of the service. Now, it would take far more storage to do so, and almost nobody expects it, so why should an ISP provide it? It's easier to let people get USENET from a third-party service, and it'd be a better experience for the people who actually want it, too.
If an ISP has resources to burn and wants to make their technical users happy, they'd get far better results for more users if they provided things like local Linux distribution mirrors instead. Far more users would make use of that than USENET.
And if they want to make the vast majority of users happy, and save resources on their end in the process, they can provide local CDN nodes for YouTube, Netflix, and similar.
Usenet isn't dead. I still use several Usenet groups via Thunderbird. Google Groups is a Usenet host/client, and many groups belong to both the Google and Usenet spaces. The Usenet interface is easier to use, has no ads, and doesn't require a Google account.
I ran one of the biggest EFNet IRC hubs at Texas.Net up until we pulled the plug in '98 due to smurfing attacks. :)
Since then a bunch of friends and I have run a tiny multi-server network after we all moved off the public networks.
Started a new job this past February, and I asked my boss during the interview how we handled inter-person communication since we're scattered all over the country. "We have our own internal IRC server if you know what that is..." I said "THANK YOU JESUS!" and he cracked up.
IRC is definitely still out there and heavily used.
Heh, I set up an internal IRC server for my previous employer; I mostly just got tired of AIM being shitty, and was already using IRC so I set it up (which was a pain, btw! maybe it's easier now?) so we could all use it.
Pretty amusing, too, because it was four people in an irc room, all of whom were also in the same physical room.
> That would be exactly my own reaction if it would ever happen to me. But so far, unfortunately, it was always either Hipchat or Skype.
Both the IBM LTC and Intel OTC maintain internal IRC servers. They aren't used by the entire company, but they're used by those Open Source groups. (Company-wide IM is still Sametime and Lync, respectively, but both of those are usable from Linux with Pidgin.)
Hipchat ain't bad; the place where I got to use it was a 3-week temporary gig for me (they needed an additional programmer for emergency fixes). The places where I was actually employed all used Skype, and "bearable" is the best I can say about it.
I think the best solution would be a Hipchat/Slack-like IRC client.
It's being done in a completely backwards-compatible manner - clients have to opt-in to new protocol features so it's more along the lines of incremental improvements.
One thing that hasn't been mentioned yet is that it is essentially closed now. You can't just set up your own USENET server easily, because you have to pay someone $$$ to get federation (if it's possible at all). They will try to keep competitors out, because they sell access to newsgroups for a lot of money so people can download warez.
I think it should be possible to get replicas of non-binary newsgroups, but a quick search hasn't found any free option.
The single biggest issue was spam. Being largely unmoderated, it became flooded with garbage as the reach of the Internet expanded. Conversation moved to web-based forums, which IMO had worse UI in the early days, because there was more ability to moderate.
And not just traditional spam. There were active campaigns to "sporge" thousands of news groups with thousands of junk posts. Hipcrime is probably something that would return ghits.
Oh, man. This was pretty much the final straw for the couple of newsgroups that I loved dearly back in the day (rec.arts.books.tolkien and alt.fan.tolkien). The once-vibrant community was already atrophying (due largely to a lack of newcomers and the usual gradual attrition of old regulars), and then somewhere around 2006 or 2007 there was a massive sporge flood that got past spam filters and made the groups unusable for about half a year. Once that ended, very few people were left.
Well, sorta. Yeah, spam was a big problem. The other problem is that there WERE moderated groups, and the original intent when creating moderated groups was that moderators would act like adults. They largely did not, and even if they did, they were usually strongly opinionated and would moderate according to those opinions.
I wonder just how big a non-binaries feed is these days. A tiny engineering company I worked for in '98-99 had its own Usenet server with a no-bin feed going into a SPARCstation 2 (think 386/486-class x86 equivalent) and it kept up just fine.
A couple years earlier I'd been one of the senior admins at Texas.Net (now DataFoundry) and helped build out what eventually turned into GigaNews, which used multiple dual-proc Sun E450s.. I think they're still one of the "biggest" Usenet providers these days.
USENET (the network) may be dying but NNTP is still going strong as a better interface to mailing lists. See for example http://gmane.org/ or the new GNU Mailman 3 gateway.
I am now subscribed to maybe 2 mailing lists; the rest (two dozen) I read via gmane.
I still use usenet every day. There are, admittedly, only a few good groups left. But where there's a high barrier to entry there's a high reward. The discussion is of high quality. Higher than most mailing lists and reddit/HN, at least.
There is a lot of history and useful knowledge archived in Usenet. A lot of that content (e.g., the early UNIX newsgroups) puts today's forums and blogs to shame.
Google acquired Deja News (if Usenet is worthless, why?) and now all the archived Usenet messages are web-access only and fronted by Java and Javascript nonsense.
If the Usenet archives are no longer important or if everyone thinks Usenet is "dead", then why put these messages behind Javascript and try to prevent bulk downloads (which is how NNTP was designed to work)?
The Deja News acquisition was in 2001, lifetimes ago in web time. And it does fit with Google's stated mission of putting everything online, in easily searchable format.
The "let's put a thin web veneer over X" approach has never been a great one, perhaps sufficient for Web 1.0, but not these days.
DejaNews made USENET searchable. You could actually nuke messages from DejaNews, but then Google bought DejaNews and suddenly every nuked message was made available again, forever. Google killed USENET.
It's expensive to keep binary groups online (bandwidth) and the text groups are all SPAM these days.
Edit: Forgot to say that the tech is fine; a member of my family operates a usenet server over in Switzerland for our family. Works well for that sort of thing and avoids facebook etc.
I thought it was just limited interest. First web hosting prices came down so much that anyone could run a forum, then WordPress made free blogging with ancillary comments accessible to anyone with a browser. So the masses went to forums and blogs (and then Twitter and Instagram and YouTube and whatever chat app is popular this week) and only geeks who cared enough to find and install a newsreader were left.
For ISPs there simply wasn't enough customer usage of NNTP servers to justify their continued existence. 5 years ago, when I was working at a mid-sized ISP, only about 2% of our customers used our NNTP servers. We carried binary groups and offered pretty good retention/completion, but by then even the pirates had mostly ditched NNTP for torrents. At the time we estimated that we had maybe about a dozen customers accessing the server for non-binary, non-piracy use.
Going back further to why NNTP became irrelevant for discussion I'd say it was a combination of difficult setup for the average user and the lack of good free NNTP clients. Early web forums could offer discussion for free without the difficulty / expense of a NNTP client. As NNTP groups became more insular the miserable trolls were able to take over and ruin it for everyone. Almost every group I was active in during the late 90s deteriorated in this way. Just one mentally ill and/or very lonely person posting 50+ times per day could very effectively destroy a group.
No good clients? I loved Forte Agent; it was awesome. It was also one of the most bug-free programs I've ever encountered. It just worked. http://www.forteinc.com/agent/index.php
I imagine Binary groups being a great big cost sink would be the main thing.
It's sad, because the most barebones mid 1990's 3-panel Usenet client is still an infinitely better reading experience for discussion than all current web forums.
USENET was, in my opinion, one of the best features/protocols of the initial internet. I loved USENET ever since I discovered it (91/92), and I still use a news reader today, through gmane, looking down on forums (just like this one) as a sad evolution.
Having a USENET server provided by your ISP used to be a standard part of the package. They would probably not carry alt.bin, but in the end it didn't matter.
A truly P2P, open messaging network, of the likes you'll never see developed again due to commercial exploitation. USENET had "flaws", but honestly they're minor.
The store-and-forward protocol required quite some disk space at the time. This resulted in several nodes dropping bin/alt.bin from the network (which in my opinion was always a hack providing nothing really useful, with yEnc/PAR and the like). But by today's standards, you could probably run a node carrying the entire network with just a bit more than regular consumer hardware, since it's just text in the end, and the server is very simple: I used to run one server locally just for me.
How big could today's network be?
See for yourself: http://gmane.org/stats.php
Or: not too big for any ISP to carry in full with just one sysadmin.
The network relied on "control" messages to create/delete groups automatically (as opposed to manual subscription), which, due to the lack of authentication/encryption in the protocol, were very easy to spoof. A GPG-signing mechanism was later put into place, so that nodes peering with each other could establish a chain of trust by themselves. This was pretty nice in retrospect (and awesome by today's standards), but the main problem is that creating new groups was a slow and painful approval-based process: people often wanted small groups just for themselves, and mailing lists offered the "same" without any approval required.
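(Roughly what such a control message looked like on the wire; the group name here is hypothetical and the signature data is elided. Peers ran pgpverify to check the X-PGP-Sig header against the hierarchy administrator's published key:)

```
From: group-admin@hierarchy.example
Newsgroups: comp.lang.example
Subject: cmsg newgroup comp.lang.example moderated
Control: newgroup comp.lang.example moderated
X-PGP-Sig: 1.1 Subject,Control,Message-ID,Date,From,Sender ...
```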
Having a large open network started to become a big attractor for SPAM, and managing SPAM in a P2P network without authentication is a harder problem to solve than on a locally managed mailing list. For the same reason, trolls could be a nuisance at times.
Some people claim that USENET became 1) too big and 2) a receptacle for copyright infringement. Both claims are bullshit: excluding binary groups (which was already common practice by 91/92, when I joined), the network's volume was in consistent decline from the late '90s onward, so it cannot be that hosting a node that was feasible in 1998 became impossible in 2005.
Here's what happened: running a local server became so easy and cheap that a mailing list offered local control with almost zero overhead. People that had niche groups started to create mailing lists with open access, and people migrated in flocks. Why share your discussions in comp.programming.functional when you could create a mailing list just for your new fancy language? (It's pretty sad, because I loved the breadth of the discussions.)
Discussions on general groups became less frequent as most of the interesting ones moved to dedicated mailing lists. The trend worsened significantly as forums started to appear, which lowered the barrier to entry for people who didn't know how to use a mail client properly.
The sad truth is that now most of the general discussions aren't openly shared anymore, and happen either on local websites (like this) or local mailing lists.
I've been using gmane.org to subscribe to all the open-access lists I ever participated in. Most of the time, the list I'm looking for is already there.
In a certain sense, gmane.org offers the best of NNTP with the best of mailing lists: you can create your own lists without approval processes, and you can read any other list without having to subscribe directly or store messages locally, with the archive instantly visible [as far as retention goes]. The mailing list administrator still has ultimate veto over what goes on the list, so local authority is never undermined. On the other hand, NNTP clients are built to read massive volumes of messages, which would make reading reddit a breeze.
What gmane.org lacks, though, is the peering. One of the central NNTP features of the time was definitely killed.
I wish I was old enough to use USENET when it was popular. The closest I got was using XDCC on IRC but even then, USENET sounds like such an interesting thing to use. I miss anonymity.
I would definitely believe the reason being piracy. That's a massive portion of the bandwidth I'd almost guarantee it. It was a safe haven for a long while.
USENET still exists, so 'ultimate demise' is premature.
Still, I think OP is closer than many other posts here, most of which are about its long decline rather than its demise. When ISPs stop carrying USENET, it will just be a niche service with very few providers, or only Google Groups will be left.
I think people create too many fables around old tools. It's just that 20-25 years younger people than you will think your tools are uncool. So they make their own tools, that basically do the same as your tools, then everybody uses the new tools. That's all.
Although I don't think the decline of USENET was caused by worries about the inability of governments to control it, I can answer your question.
USENET has no central point of control (or at least it had none 15 years ago, when I lost interest in USENET). The basic infrastructure is designed so that any news server can get a copy of all the USENET articles being stored by any other news server, and anyone with sufficient bandwidth and sysadmin skills can set up a news server. Consequently, there's no single organization with the power to remove an article from USENET, unless perhaps some organization central to USENET spam control has arisen since I lost interest. (I mention spam because 15 years ago, spam was the only problem an effective alliance of server owners might recognize as a legitimate reason to remove a message from USENET.) It would have to be an organization that every significant news server relies on for telling spam from non-spam. But I get the impression that spam is out of control on USENET, which is a strong sign that no such universally consulted spam-control organization exists.
Anyway, I hope that you get the idea of what the OP was on about with his implication that pro-censorship forces killed USENET because government could not control speech on USENET (which, again, probably does not have any basis in reality because IMHO USENET was never popular enough with readers to worry pro-censorship forces).
Usenet is a way for the ordinary person to be able to talk unfettered to other ordinary people - without a need for a central authority, without the approval and shilling of advertisers etc. So, after decades of the taxpayer funding R&D to create the Internet, when the Internet was handed over to corporations in the early 1990s, the question is not if such a resource was going to go away, but when.
It's a confluence of forces. The old Bell monopolies got a stranglehold on the last mile, and then on wireless transmission as well. They became so bold as to lobby to end net neutrality so they could pump more money from content providers with their monopoly. A vast infrastructure is being built to monitor what people say on the network (like the NSA's Utah Data Center), one which makes the Stasi look like Inspector Clouseau, in a country quite different from the one whose Secretary of State said in the 1920s, "Gentlemen do not read each other's mail". The RIAA/MPAA oligopolies are now busy trying to extend their 95-year copyright lease, which kicks in again in 2019, so expect them to try to shut Usenet down as well. After all, it's one of the rare mediums of content distribution they don't control. I'm surprised the powers that be haven't cracked down on Internet Relay Chat yet; it's one of the last remnants of the old, distributed, decentralized, noncommercial Internet.