> First problem: there's no identity authentication mechanism in NNTP.
Yup. Which is weird, since every message is sent by someone at a host; it should have been possible to simply use signatures to prove which site generated a message—and punish sites which didn't police their users. But crypto was hard (and illegal to export, once upon a time).
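To make that concrete, here's a minimal sketch of the kind of site-level signing that could have been bolted on, assuming each site published a long-lived Ed25519 public key. The header choice and key handling here are my own illustration; nothing like this is part of NNTP itself:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The injecting site holds a long-lived keypair; the public key is published
# somewhere peers can fetch it (DNS, a well-known group, wherever).
site_key = Ed25519PrivateKey.generate()
site_pub = site_key.public_key()

# Sign the headers that identify the origin of an outgoing article.
headers = (b"Path: news.example.site!alice\r\n"
           b"From: alice@example.site\r\n"
           b"Message-ID: <870123456.1@example.site>\r\n")
signature = site_key.sign(headers)

# A peer that trusts example.site's published key can check the article
# really came from there, and drop or de-peer sites whose signatures fail.
try:
    site_pub.verify(signature, headers)
except InvalidSignature:
    print("forged or tampered; complain to (or cut off) the peer")
```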
> Because it's a flood-fill store-and-forward system, each server node tries to replicate the entire feed.
The upshot of this is that reading news was fast. So fast that folks these days can't believe what a user-friendly experience reading from a local news feed can be. Imagine reading a website where pages come up in milliseconds—that's Usenet on a local feed.
> Consequently news admins tended to put a short expiry on posts in binary groups
Frankly, binaries were Usenet's downfall. Had those been eliminated, I suspect Usenet would be a lot healthier today. I couldn't get numbers on the size of a daily text-only feed nowadays, but I imagine it's pretty manageable.
> Upshot: usenet clients remained rooted in the early 1990s at best.
Back in the 90s I used a pretty nice Mac GUI client.
But really, it doesn't get better than gnus…
Anyway, Usenet's not dead—it's still alive, people are still posting and some groups are doing pretty well.
The world could use a new Usenet, with the lessons learned from the first one: site-to-site; non-commercial; anti-spam measures built in.
> Imagine reading a website where pages come up in milliseconds—that's Usenet on a local feed.
Now that you mention it, I don't think the web has ever matched the speed Usenet had all those years ago.
I propose the following law:
"The size of web pages must expand so that, regardless of CPU and bandwidth improvements, a page will load slower than a Usenet post on a 9.6kbps modem in 1993."
Dunno if you've tried a gopher client recently, but it's shocking how much faster it is than the web... (though it's still no match for a local mailbox or news spool).