The OG Social Network: Other People’s Websites (jim-nielsen.com)
331 points by headalgorithm on Aug 4, 2022 | 147 comments



I remember MySpace being described as "an easy personal website" back in high school.

Unfortunately, it's not easy to make an easy personal website today. Personal sites take so much work to design, and most feel stale because they don't have dynamic content like posts. About.me used to be great, but it didn't get maintained.

I'm working on a project to make personal websites as easy to set up as a social profile. No designs to pick from - just a photo and a profile photo. It's not static, either - you get a mailing list, and the latest post is shown on the homepage. And people can "follow" you with their email addresses.

Facebook used to be pitched as "the place everybody is", but that's no longer true - social networks are getting dis-aggregated. So, I expect personal sites to become a way for people to gain more control over their audiences, and to help move an audience from one platform to another (via a "link in bio" pointing to their personal sites).

If you're interested in trying out my project for your own site:

https://postcard.page

(Will trade free access for feedback!)


I'd like to see a project like this which is self-hosted, where the process of buying a domain, acquiring hosting, etc. is made easy for non-technical users. The problem with sites like about.me is that eventually the maintainer gets acquired, gets bored, gets hit by a bus, etc. With a self-hosted site, it works as long as you keep paying Namecheap and AWS, for example. This arrangement also separates the core project owners from responsibility for whatever users slap on their page down the line.

I imagine users would go to one site (or app) to get it all running, and that site would hook them up with hosting and a domain name behind the scenes, maybe get them set up with a static page through a WYSIWYG editor, and boom, done. Maybe give users a standalone app with their hosting keys and credentials that lets them update the content on their site directly (again through WYSIWYG, like Adobe Dreamweaver but easier), without going back to the initial setup site.

A platform or service that handholds users through initial setup and billing, and then steps away once they're up and running, seems to me like the right mix of centralized/devolved control and responsibility. I realize what I'm describing sounds a lot like WordPress or Dreamweaver, but I think there's a big opportunity to slim down these offerings into a simple, secure, easy-to-use package. I guess it depends on your choice of target audience.


Email sending is surprisingly hard to get right. That's where a lot of the value for a hosted product comes from.

With Postcard - I've had to add various checks and protections to prevent bot abuse (and to manage sender reputation). It would be really hard to be reactive like that on a self-hosted product.


Is it more or less the same problem as running a GNU Mailman listserv in 2022? That is to say, a fairly hard problem of staying on top of security patches, avoiding getting marked as spam, and avoiding getting put on the big email providers' blocklists. In theory, the problem seems simple: replace a manually-managed BCC list with an alias to said list, so it's less hassle for the user/sender. Maybe the answer is an email client plug-in that just autofills BCC to-addresses? The assumption would be that the user is sending this email from a known provider like Gmail or Fastmail.

I imagine it would be easier for the completely self-hosted email system if signup gave receivers a from-address they could whitelist, if their email provider allows it. Maybe you already do this.

Edit: I'm also assuming email is sent one-way to whoever signs up, maybe with a link to the dynamic chat server/app/platform of the moment. Slightly different from a GNU Mailman list, which enables back-and-forth conversation.


Yes, a little bit. I'm working on a different project that's more of a refreshed Google Groups / GNU Mailman listserv, and that has even more complexity for allow-lists: https://booklet.community

> I imagine it would be easier for the completely self-hosted email system if signup gave receivers a from-address they could whitelist, if their email provider allows it. Maybe you already do this.

Yeah, that could be possible. I'm not opposed to releasing a self-hosted version of Postcard - but it's been useful to not have that constraint from the beginning. For instance, email sending happens in an async queue backed by Redis - and that might just be unnecessarily complex for a self-hosted tool!


> I'd like to see a project like this which is self-hosted, where the process of buying a domain, acquiring hosting, etc. is made easy for non-technical users. The problem with sites like about.me is that eventually the maintainer gets acquired, gets bored, gets hit by a bus, etc. With a self-hosted site, it works as long as you keep paying Namecheap and AWS, for example. This arrangement also separates the core project owners from responsibility for whatever users slap on their page down the line.

Right from the get-go, long-term maintainability of such a platform is very hard, as there are now a few dependencies that you're reliant upon for your self-hosted site:

- Domain registrar

- Infrastructure provider (IaaS)

- DNS record management

Trying to make the providers swappable will be a challenge, as APIs are not consistent across the ecosystem. This is compounded when trying to make the solution turn-key, as there now needs to be an ongoing development effort behind the project to keep it up to date, & to add new options for swapping providers out.

> I realize what I'm describing sounds a lot like WordPress or Dreamweaver, but I think there's a big opportunity to slim down these offerings into a simple, secure, easy-to-use package.

The pennies that can be picked up in this specific niche are too small to bother with: the ones that just want a website without any frills will stick to WordPress & their providers, while those wanting extreme configuration & customizability will just run their own language/framework on the infra of their choosing. The niche mentioned sits between the two, with the extreme ends extending their reach: WordPress plugins from the easy end, and web frameworks (Vue, React, Svelte, Django, etc.) from the other.

This kind of thinking, however, misses the forest in favor of nostalgia: the vast majority of modern Internet audiences want fast & easy discoverability/search, along with easy public sharing of posts & account follows. Modern social media is the inevitable result of catering to the market's demands. Those wants also inevitably create a gravitational pull towards platforms with large audiences, creating a feedback loop that's hard to break.

Right now, there's a weak-but-present desire to decouple from Big Social Media (Mastodon, Lemmy, Friendica), but the wants of the modern audience run against the ideals of self-hosted social media taking off: their customizability is their own Achilles' heel, with the technical & financial limitations of scaling self-hosted services making it near impossible to compete with the big guys.


For me, what social media has over a personal website is a common identity system and access control.

I would like to host my occasional posts and photos on my own site. But I don't want to post everything publicly.

But requiring people to sign up to my personal website is probably unreasonable. And requiring a login implies a DB of some kind, which for me transforms it from fun into work.

I'm also not going to get excited about anything that isn't self-hosted. Sooner or later every cool startup either fails or becomes successful enough that their incentives no longer align with my interests.


> requiring a login implies a DB of some kind

If your audience can accept the basic look of it, basic auth over HTTPS will work. You can automate creating a new entry in the passwords file and reloading the web server.
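Automating that is only a few lines. Here's a rough sketch: the `{SHA}` password scheme is one both Apache and nginx accept, and the htpasswd path and reload command are placeholders for whatever your setup actually uses.

```python
import base64
import hashlib
import subprocess

HTPASSWD = "/etc/nginx/.htpasswd"  # placeholder path


def htpasswd_line(user: str, password: str) -> str:
    """Build a basic-auth entry using the {SHA} scheme:
    user:{SHA}<base64 of the raw SHA-1 digest of the password>."""
    digest = base64.b64encode(hashlib.sha1(password.encode()).digest()).decode()
    return f"{user}:{{SHA}}{digest}"


def add_user(user: str, password: str) -> None:
    # Append the new entry, then reload (not restart) the server
    # so existing connections aren't dropped.
    with open(HTPASSWD, "a") as f:
        f.write(htpasswd_line(user, password) + "\n")
    subprocess.run(["nginx", "-s", "reload"], check=True)
```

Note that SHA-1 without a salt is weak by modern standards; `htpasswd -B` (bcrypt) is a better choice if the `htpasswd` tool is available.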


A compromise solution is publishing the post to your website, without having links to it from the public side, and sharing the URL only with your audience.


To be frank I don't trust my entire audience with a secret link. Someone will leak it and crawlers will find it. A link to a photo or two is fine but if they can then crawl everything it is too risky for the non-technical people in my life.


Not even someone leaking it. Email provider bots, chat bots, etc, will check it, and it might even end up in Bing just from there https://news.ycombinator.com/item?id=31892299


OAuth? You'll still need a DB, but at least no signups.


A personal website can still be as simple as dumping a single HTML file into your hosting provider's folder. Despite everything being built up in more complex ways, that hasn't changed since the beginning of the internet. In fact the lack of web style has seen a little bit of a resurgence.


Yeah, true - but I think the opportunity is a really tight integration with a mailing list. I think personal sites need some dynamic content - such as posts and a mailing list. Otherwise, they just feel stale.

That's what prompted me to start Postcard. I had a blog set up, and I wanted an integrated mailing list. It took XML hacking (RSS), $30/month to ConvertKit, and lots of complexity to make it "just work". I wanted something far simpler, and built for people instead of marketing organizations.


If you use Sendfox, then all you have to do is drop your site’s RSS feed (auto-generated by WordPress, Ghost and other non-dev friendly solutions) in and it will automatically generate emails for you.

Best of all, Sendfox costs a tiny fraction of what ConvertKit does and it’s run by a more technical team.


Well, I'm going to keep on being outdated. I've got posts and an RSS feed. If you insist on a mailing list then I'm pleased to disappoint you.


What’s a decent static site generator for this sort of thing today? I’m envisioning a directory of .md’s or whatever, then I run make, and it invokes a thing that builds a tree of .html with an rss feed in it.

Finally, my makefile does an “s3 sync”.

The older and more boring, the better.
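For what it's worth, the core loop you're describing is small enough to write yourself in a few lines of Python. A sketch only: it doesn't actually parse markdown (swap in any converter you like), and the site URL is a placeholder.

```python
import html
import pathlib
from email.utils import formatdate

SITE = "https://example.com"  # placeholder


def build(src: pathlib.Path, out: pathlib.Path) -> None:
    """Turn a directory of .md files into a tree of .html pages
    plus an RSS 2.0 feed.xml, ready to sync to a bucket."""
    out.mkdir(parents=True, exist_ok=True)
    items = []
    for md in sorted(src.glob("*.md")):
        title = md.stem.replace("-", " ").title()
        body = html.escape(md.read_text())  # no markdown parsing here
        page = f"<!doctype html><title>{title}</title><pre>{body}</pre>"
        (out / f"{md.stem}.html").write_text(page)
        items.append(
            f"<item><title>{title}</title>"
            f"<link>{SITE}/{md.stem}.html</link></item>"
        )
    feed = (
        '<?xml version="1.0"?><rss version="2.0"><channel>'
        f"<title>My site</title><link>{SITE}</link>"
        f"<lastBuildDate>{formatdate()}</lastBuildDate>"
        + "".join(items)
        + "</channel></rss>"
    )
    (out / "feed.xml").write_text(feed)
```

Then the Makefile is just `python build.py && aws s3 sync out/ s3://my-bucket/`.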


I built https://tinker.fyi using Nuxt.js - their `nuxt-content` library is great for having a markdown-based CMS.

(I'm also using this stack for managing Terms of Service on https://contraption.co and that project is open: https://github.com/contraptionco/contraption.co )


Why not build it yourself ? You'll probably spend more time tweaking template anyway. Plus, if you do it yourself you won't need anybody to maintain it.


Jekyll is old and boring as hell so the plugin ecosystem rules. I picked up eleventy in order to be able to prototype stuff on Glitch.


I just use Gulp and PostHTML to render markdown, do some light templating. It does nothing more than put HTML files in a build folder. These tools never change so it's been pretty reliable.

I found myself forgetting how static site generators work, and barely used any features. I update my site like once a year.

Idk its in the JS ecosystem so HN will probably hate it lol.


> Idk its in the JS ecosystem so HN will probably hate it lol.

Not my style, but it's your site so do it your way.


That doesn't sound very boring. That sounds like it has several unnecessary moving parts. Makefiles (and the binaries they run to actually do all the heavy lifting) tend to be more fragile and less portable than their advocates let on. I often come across repos where I don't/can't trust the Makefile to run to completion without error, so I end up cracking it open to see what it's trying to do and then just running those commands manually.

People also inevitably end up forgetting how to use their static site generator setup. (Even in your "boring" example with "just" make, you will perhaps forget the templating language.) A case ripe for field study: <https://web.archive.org/web/20210331182731/https://corythebo...>

> So, time to update the website, but the first wall I hit was that I:

> 1. Forgot how my over-engineered SaaS was supposed to be used (no documentation because I built it myself and was lazy)

> 2. Forgot how to follow the esoteric Hugo conventions (has documentation, but it's not easy to parse at a glance)

> I was pretty annoyed with myself for having fallen for the trap of not documenting my own systems, but not sure how I could have remembered all of the Hugo-isms, especially since I don't update this site very often and don't do static site generator work outside of this.

If you think you want to use a static site generator, first try just making your site capable of self-replication. Write a document that lists all the transformation steps that should be applied to the input in order to produce the desired output, save that as something like makesite.html, dump it somewhere on your site, and have it so that when you drag and drop the directory containing your site sources onto the page, it spits the publishable version back out. (Just make it so that your makesite.html is written for a dumb enough audience that your computer (read: Web browser) can follow the steps on its own.)

Alternatively, don't use a static site generator. Adopt a regimen where the publishable representation (what would be the SSG's output) is also the canonical representation (i.e. "source").


I use pelican[0], which works with markdown (or rst) files and makefiles to compile/deploy.

[0] https://getpelican.com/


tdarb has a nice one called pblog (https://pblog.xyz). It uses pandoc, xsltproc, a shell script, and a makefile. It's barely a SSG at all.


"Unfortunately, it's not easy to make an easy personal website today. Personal sites take so much work to design, and most feel stale because they don't have dynamic content like posts." (italics are mine)

Huh? My definition of a personal website differs mightily from yours. Today, neocities.org and a day spent learning how to write simple HTML suffices. Fact is, I'm getting ready to do this once again myself. And I can email my friends (you know -- real friends) with a link. And I've found Google to be quite good at quickly adding my site to their databases. I can always find my sites using sensible search terms in Google.

A personal website should not look like a corporate website, or like MySpace, Facebook, or Twitter (well, unless that's your obsession).

I find that a little HTML, the tiniest bit of CSS -- say 15 lines or fewer, some images/diagrams, and most importantly, something honest and interesting to say, is about all that's required.

The use of Pandoc can make writing HTML almost trivial, but then writing plain HTML is almost trivial anyway, even with Notepad. A personal website is not a place to track visitors, or to manipulate them with "Please Subscribe" membership assaults. It's personal, it's right there in the name. By personal, I don't mean it's where you spill your guts about yourself (although, if you want, you can). It can even be a subtle ad, emphasis on subtle, for your work. Will the real John D. Cook please stand up?

But why, in the name of all that's precious, does it have to be complicated? That's a rhetorical question, whose answer is, obviously, "It doesn't."

And about this dis-aggregated business, this "gain more control over their audiences..." stuff, that doesn't sound so personal to me.

I suspect many folks would gladly visit stale-looking sites that don't torment them with drop-down (and over) videos, and membership pleas 5 seconds after they begin reading the content they really came for. And on many sites, the absence of a comment section would be a blessing indeed -- the vitriol of the internet knows no bounds, and just being exposed to that stuff can sometimes send your mind down Negative Nellie Lane.

So to any non-technical readers who would like to create a simple site to say something they've been wanting to say to no one, someone, or everyone, I say give Neocities, or something similar a try. Wait a week, search Google for your site. Feel special. It's fun!


Hi Philip! I love the idea. I've thought about working on some similar project in the past, too.

Just a feature suggestion, can you please make it easy for people to setup RSS feeds?

Email-to-RSS solutions exist for technical people, but first party platform support for RSS will determine whether a significant minority of the nontechnical-but-literate internet (i.e. people who mostly consume and occasionally produce text) will adopt your service as a consumer.

Wordpress, for example, generates RSS feeds by default at a predictable url. Same with blogspot, hugo, ghost, and even substack.


Hell yeah! I'll work on that this weekend :-)


Looks cool, but don't things like WordPress already make this easy?


Setting up a whole WordPress site is actually quite hard for the average person. Not to mention they also need to pay for web hosting first.


Based on context, I'd assume GP was referring to Wordpress.com, which requires none of that...


My goal is that Postcard can get your site published in 2-3 minutes with nice design.

WordPress isn't built for personal sites. Too many themes, too hard to set up, too much overhead. I think most people that want a personal site are not motivated to figure out the whole process of finding a template, designing the pages, configuring all the content sections, etc. Plus, WordPress doesn't have an easy mailing list tool built in.


WordPress is absolutely made for personal websites. Its origin is quite literally blogging.

There's plenty of turnkey mailing list plugins for WordPress, from Jetpack to using SendGrid, or MailChimp, all of which have free plans.


> Personal sites take so much work to design, and most feel stale because they don't have dynamic content like posts.

There's nothing stopping personal sites from interacting with Fediverse standards. Then posting to your site is just a matter of inputting it as your home "server" in your preferred Fediverse app.


I would say that requires quite a bit of technical knowledge. More than what the average person would do anyway.


P.S. - my personal site (on Postcard) is https://www.philipithomas.com


Unfortunately, the site crashed for me after uploading a background photo, and my test site is now permanently inaccessible.


Oh no, sorry! I see some image resizing errors in the logs. I'll get on it!


I'd like this same model for resumes. Like there is just a single place for RSAAU resume service as a utility.


I like this idea! I had a "Bootstrap Resume" project get popular back in like 2013. Maybe I should revive that :-)


TOTALLY revive that.

Linkedin sucks as a resume.


Wondering if Notion might put you out of business. I have the impression they make it super easy to publish a "simple" page (it could even have a database) and edit it. I've seen a friend use Notion for that but haven't tried it myself.


I tried super.so and was disappointed


Totally willing to do feedback for free access - this is cool


Email me and I'll send a coupon code! mail at philipithomas.com


I don’t know what you’re talking about. You can go to squarespace and set up a site in 15 mins. I know because I did it last month for my business.


When social networks started taking off, in the spirit of the late 00s there was a push to create standards for distributed social networks so that personal websites could replicate a lot of what social networks did. DiSo is one that comes to mind [1].

The fact was...people didn't care. It made building harder...not easier.

I think it's inevitable. Over time, adoption trends towards easy UX, and open standards almost by their nature lag behind silos in their ability to optimize experience.

[1] https://diso-project.org/


WordPress supported a standard called FOAF - Friend of a Friend (XML-based) - for linking websites. I don't know if it is used anymore.


There's been a big push to develop modern standards in the IndieWeb community [0]. There are two important standards:

- WebMention, a W3C standard that is basically the equivalent of @ing someone on Twitter [1]. It is simply an HTTP request to a discoverable endpoint with two pieces of data: the webpage being mentioned, and the webpage mentioning it. WordPress had a similar standard called Pingback, and websites supporting WebMention often support both for backwards compatibility.

- microformats2, an ad-hoc standard for adding metadata to webpages, meant especially for providing metadata for web mentions [2]. For instance, you can specify that the mention is a "like", "reply", or "reblog", and set the author name and avatar.

Independent websites that add support for this can then parse the WebMention to create a comment section and like counter, and readers can follow the links to other blogs that talk about the blog post they just read. There are a decent number of personal sites that already support this, like those mentioned in [3]. With enough adopters, it might build the network effects necessary to become a viable social media alternative. Right now though, those in the network are predominantly tech oriented since there isn't a ton of third party support.

[0]: https://indieweb.org/

[1]: https://www.w3.org/TR/webmention/

[2]: http://microformats.org/

[3]: https://indieweb.org/Webmention#IndieWeb_Examples
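Sending a WebMention really is that simple: once you've discovered the receiver's endpoint (advertised via a `<link rel="webmention">` tag or an HTTP Link header), it's a single form-encoded POST. A rough sketch, with example URLs standing in for real pages:

```python
from urllib import parse, request


def send_webmention(endpoint: str, source: str, target: str):
    """Notify `endpoint` that the page at `source` mentions the
    page at `target`, per the W3C Webmention spec: a POST with
    an application/x-www-form-urlencoded body of source+target."""
    body = parse.urlencode({"source": source, "target": target}).encode()
    req = request.Request(endpoint, data=body, method="POST")
    req.add_header("Content-Type", "application/x-www-form-urlencoded")
    return request.urlopen(req)


# e.g. send_webmention("https://b.example/webmention",
#                      "https://a.example/post",
#                      "https://b.example/note")
```

The receiving side does the harder work: verifying that `source` actually links to `target` before displaying anything, which is what keeps the mechanism from being trivially spammable.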


Pingbacks are so helpful if you're a blogger of any kind. I manage the blog for a family member who's an author, and they use the blog to share thoughts with the readers of their books. Pingbacks let them see where their blog posts are being discussed, go answer questions, and interact with the community in a really personal way (on top of the comment systems that are also used extensively).

It might've lost popularity over the years, but it amazes me that there are still entire micro-communities of authors/fans doing this. Their own little social networks, all self hosted and self managed through blogs and comments/pingbacks.


There's also IndieWeb, but I decided it wasn't for me because I wanted my own website, not a web application that pretended to be a website.


The hard part is finding the interesting websites - there is such a firehose of content that we've necessitated a separate role for content curators.

I get Thinking About Things, which points me to interesting sites on the internet (https://thinking-about-things.com/). Another great one is Findka essays (https://essays.findka.com/). Would love to hear any other recommendations.


https://search.marginalia.nu/ is a fantastic curated search engine for just those sites you're looking for. :)

alternatively, the bookmarks/links page to mine has a hefty list of artist-focused sites (http://kradeelav.com/link.html), and the yesterweb webring has enough to get you going in the fun quirky webrings - https://links.yesterweb.org/


Google and Bing (and hence DDG etc.) are no longer search engines; they're "information the powers that be permit you to have" engines. Forums, blogs, and small independent websites have disappeared from indexes. Large walled-garden platforms like Facebook are impossible to query.

We really need a replacement for Yahoo.



I certainly notice this too. I remember searching for technical queries and would find many (not SEOsplurge) blog posts or random forums where the question is discussed. Now it is just the same stackoverflow answer replicated across multiple content farms.

Peculiarly, I have found the more 'genuine' websites by using image search. Perhaps because more things fit into a grid, and I can usually gauge whether the website is genuine or a botfarm from the images its author chose.


Is this true?

If I search "presstv", the .com site of which was literally seized by the US govt, I get their unseizable .ir site as the first result. No scary warnings, no fluff.

If I search "rt", I get RT site as the first result.

Similarly for smaller sites that have caught flak or been banned for opposing US government narratives (SCF, CN, GZ, etc).

Ironically, it's "open" platforms like Wikipedia that have the lowest tolerance for political dissent. Twitter, Facebook are quite compromised as well.


From what I can tell, DDG said they were going to downrank known propaganda sites, which meant that, in practice, dissenting results would also appear.

Then, the pro-Russian-propaganda outlets got upset and started complaining about free speech. (As though exclusively showing sites from one hostile foreign government would be aligned with the intent of the 1st amendment!)


Wish there was something like Read Something Interesting (http://readsomethinginteresting.com/) but with a search index, where you could search only the blogosphere.


The first item I was taken to on that list is this: https://readsomethinginteresting.com/a/baa96b89

Which takes me to a broken article: https://www.aaronkharris.com/asking-questions



Sorry, I need to get in the habit of checking my comments for replies. Thank you!


.....what? What forums have disappeared from Google? I get and search forum results all the time. This is not counting, of course, the forums that have gone private or purposely delisted themselves.


I can't surface them at all for non-programming sites. Occasionally automotive, but it's still mostly AI generated made for adsense junk. I can use site: filters on sites I know about, but I can't find anything on a page I don't know that way. Your filter bubble may be different.

I mostly use Yandex now. It's not great.


I find Brave Search to be a lot better. Its results are helpful, and not filled with SEO spam.


I see tons of what you say is missing. I am in the UK, FWIW. Perhaps it depends on niche?


Discovery has become the one thing that our social networking overlords do and focus on. But we learned about new things before them just by talking with our peers. I think the IndieWeb's concept of a Social Reader[1] can do a great job of filling this need. If you like or comment on something in your feed-reader, it becomes a post on your outgoing feed that people can see. This lets any individual or site act as a source for curated content.

[1]: https://indieweb.org/social_reader


Web-rings [0] were a good way to do that discovery of allied sites back in the day.

[0] https://en.wikipedia.org/wiki/Webring


> The mistake was killing Reader to make Plus a success. Google’s judo move would’ve been to embrace the open web as a social network. Not their network but our network.

> They provide the tools – Reader, Blogger, Search — we provide the personal websites. The open, accessible, indexable web as The Next Great Social Network.

I think this confuses the chronology. It's not that Google drove people away from a decentralized social web by killing Reader. It's that users had already abandoned the web in favor of walled gardens like Facebook.

I think that Google should have kept Reader around largely for PR and goodwill reasons, but it was a niche product being used by a small and dwindling (but fervent!) set of users. RSS was not a meaningful "social network" for any significant fraction of humanity. I would bet money that for every person who has ever written a blog post, there are 10,000+ people who have posted on Facebook or Instagram.

Google could have kept the lights on with Reader but it would have been a well-lit but empty room.


You're right about the basic dynamic -- convenient platforms with network effects are always going to have the bigger draw, probably by orders of magnitude.

At the same time, though, the value proposition of social networks & media isn't linear with user volume. There's always a floor beneath which the value isn't there, but once you meet that bar there are different kinds of value available depending on who is participating, what kind of effort they're putting into contributions, and maybe even value presented by limitations/exclusivity.

The Reader / Blogger infra could have been an effort to focus on high value networks vs high volume networks. Which, you could argue, is exactly what the value proposition of Google's core product (search) was in the first place.

Blogs still aren't an empty room, so I think the floor would have always been there.


The value vs volume is such an important insight.

Facebook and Instagram know a lot about which cat memes my friends and I send back and forth to each other.

In contrast, my RSS feed knows a lot about what content I think is valuable to consume in my spare cycles. I would never get an ad on Instagram for anything software-related, but that's at least 1/2 of my RSS feed, and something I'm more likely to spend money on than whatever cat toy they think will cause me to pry open my wallet.

I think if somebody could choose the highest-signal metadata to get to know me well, they should choose my browser/search history, my RSS reader, my e-reader, and iMessage or the corporate Slack instance.

Anything else is tangential at best. I have no idea why they shut down Google Reader; it was clearly the best signal:noise product outside of Maps and Search.

Maybe with the prevalence of google analytics and device/id fingerprinting, they thought it was redundant data.


> RSS was not a meaningful "social network" for any significant fraction of humanity

Everyone who says this says it as if we've reached an agreement that when we're talking about RSS we're specifically excluding podcasts. I don't know why we're doing that. The podcasting ecosystem comprises a hugely successful social network (despite the shortcomings of the podcast syndication format—which could be alleviated by more pressure to transition to something more modern, like Atom, which already includes features that would make it possible to unlock even more of the types of things expected nowadays from social networks).


> Other people’s websites are the OG social network, and the optimist in me is going to riff on MLK’s quote: the social arc of the internet is long, but it bends towards individual websites.

I agree with the first part, individual websites are the OG social network. The second, though, that the "social" web will lean towards individual websites seems dubious.

The more "individuality" a given medium enables, the more complexity and therefore, difficulty. Most people want to just share photos and lulz with their friends, and the the landscape of social walled gardens are all trying to tap this demand.


I would say yes and no. For technical and semi-technical people, most personal sites and the like offer RSS which is universal enough that everyone can consume it their own way. For non-technical people however I agree that this doesn't work out.


There are things we can do to make RSS-based consumption much easier. I use an RSS browser plugin for Firefox[1] so that a feed icon shows up on any page advertising[2] a feed. The feed icon lets me subscribe to the site with 2 clicks (and no copy/paste). This is a space that is ripe for building better tools, especially since so much of the (non-walled) internet still exposes RSS. We even have a handful of companies getting people to pay $$ for a solid hosted feed-reader experience (Feedly, Inoreader, Newsblur, etc.)

[1]: https://github.com/Reeywhaar/want-my-rss

[2]: https://www.rssboard.org/rss-autodiscovery
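The autodiscovery convention itself is tiny: the page advertises its feed with `<link rel="alternate" type="application/rss+xml" href="...">` in the `<head>`, and a client just scans for those tags. A minimal sketch of the client side using only the standard library:

```python
from html.parser import HTMLParser


class FeedFinder(HTMLParser):
    """Collect feed URLs advertised via RSS/Atom autodiscovery,
    i.e. <link rel="alternate" type="application/rss+xml" href="...">."""

    FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link"
                and "alternate" in (a.get("rel") or "").split()
                and a.get("type") in self.FEED_TYPES
                and a.get("href")):
            self.feeds.append(a["href"])


def find_feeds(html_text: str):
    parser = FeedFinder()
    parser.feed(html_text)
    return parser.feeds
```

A real client would also resolve relative `href`s against the page URL, but the discovery step really is this simple, which is why it's a shame browsers dropped native support.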


> I use an RSS browser plugin for Firefox so that a feed icon shows up on any page advertising a feed.

Can you imagine Firefox having such a button without the need for a plugin? That would be crazy, wouldn't it?


Nah they are spending time making the tabs larger, that's more important.


Pretty sure I remember Firefox 2 having an orange RSS button built in. This would have been 2006 though, the golden age of blogging. I can understand if they removed it, most high-traffic personal blogs started moving to social media by around 2010-12.


Although the rise of walled gardens was the biggest headwind to more people maintaining their personal websites, I think a lot of issues also arise from how hostile the underlying technologies of website building are to neophytes.

Imagine what an aspiring content creator (writer, photographer, artist, musician etc) needs to master and navigate in order to get his own social network website up: domain names, ssl, html css javascript, image editing, comments, categories and tags, rss, pings, moderating, email, mail lists, analytics, pagerank, seo, search, translate, etc. Each with different backends, different logins, different degree of tech knowledge required.

As walled gardens like FB and Myspace grew in capabilities, why didn't the blog platforms and website builders grow their design to simplify the onboarding as well? This is probably the result of venture money sucking all the best engineering talent away from the open source and free platforms.

I remember distinctly, in early 2000, trying to help a friend create his own website as a museum of his life's work of writing. After trying to teach him all the underlying web infrastructure and technologies, he came to the conclusion he should just hire a web consultant.

Walled gardens simplified the chores of posting online, and they do a very good job for millions of non-techies. Without this huge step, we would have lost even more creative output from the past 20 years.


> why didn't the blog platforms and website builders grow their design to simplify the onboarding as well?

Not sure I understood you correctly but isn't this exactly what e.g. WordPress did - and pretty darn well too?


> email, mail lists, analytics, pagerank, seo, search, translate, etc.

I thought we were discussing personal sites. Why does a personal site need analytics?

I'm not sure what "pagerank" means, unless it's SEO specifically for Goo.

SEO in the sense of reasonable page design, comes for free with a builder like Wordpress. Other kinds of SEO are for scammers and the get-rich-quick brigade.

Search is cool; but translate, for a personal site? Is it too much to expect visitors that don't know your language, but want to use your site, to resort to an online translator? You could even link to it.


Does Google really want to achieve their stated mission?

I would like to see Search Engine Optimization ended with extreme prejudice.

No search engines are immune, but Goog is the epitome of SEO taking over the first few pages, destroying the "information" that they want to make "universally accessible and useful".

I am also seeing search results where the first few pages are all auto-generated, SEO-optimized articles. The content is gibberish, incomprehensible word salad. But it is long enough, linked enough, and my keyword and its synonyms appear often enough for it to show up in my search.

Google Reader dying was a sad day indeed. So was Usenet dying. So was the town square post board, and before that the town crier.

These things are not blocking Goog from achieving their stated mission. Goog states the mission but does nothing to achieve it.

Talk is cheap.


How is it possible to get rid of SEO? Search engines have to have some mechanism to determine order, and websites will get more clicks if they are at the top… those two things together will guarantee that sites care about optimizing their appearance in search results.

How would you stop that?


Anecdotal note on SEO.

I used to be really big into SEO. I used to do things all the time by the book. I worked to get really good inbound links, worked to find great directories to link from, and was really into "white hat" SEO.

Fast forward a few years into the late aughts. Had a client who needed a site in less than two months. Some 15 pages of content, design, CMS and a new logo to boot. Was going to pay me a lot if I could do it.

I got it done and was sweating bullets because I had essentially copied all the content on his site from other sites in the same industry. I essentially aggregated all of the big players content into his site. I was petrified the site would get ranked and yanked and I'd spend months trying to get the site back into the SERPS.

Guess what happened?

Within a month, the site was outranking all the bigger companies in his industry. The site was on page 1 of Google in less than 60 days. All the companies he was competing with were at the bottom of page 1 or pushed to page 2. It never dropped lower than #3 on the first page in Google. The site broke nearly every cardinal rule of SEO - duplicate content, keyword stuffing, fake inbound links, etc. - and still managed to be on the front page of Google for a host of long tail search terms and has continued to stay on the front page.

It just confirmed Google doesn't care about enforcing any of its rules any more. If I did what I did, I can only imagine what other people are doing to push their site into the top of the rankings.


Right. In all likelihood, the only way to do this is to make editorial choices: decide that some review sites are better than others, and start penalizing highly optimized garbage by its domain alone in a transparent manner.

I would use a search engine like this, but I understand why an existing trillion dollar multinational basically can't do this.

On the other hand, I do a lot of cooking, and recipe sites are forced to have these absurd winding narratives to appease the SEO gods. Nobody likes this and it really ought to be a solvable problem.


I would have thought a Bayesian machine would be able to filter out sites that looked too much like autogenerated word salad.

Or just exclude all basic pages (e.g. recipe) that insist on having a table of contents.

I know that when I'm searching for answers and the page has a TOC, 9/10 I close it out.
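For what it's worth, the "Bayesian machine" idea is easy to sketch: a toy naive Bayes text classifier over word counts, with Laplace smoothing. This is entirely illustrative (the training data below is made up); real web-scale spam filtering is an adversarial problem, not a weekend script.

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label) pairs. Returns per-label word counts and doc counts."""
    word_counts, doc_counts = {}, Counter()
    for text, label in docs:
        word_counts.setdefault(label, Counter()).update(text.lower().split())
        doc_counts[label] += 1
    return word_counts, doc_counts

def classify(text, word_counts, doc_counts):
    """Pick the label with the highest smoothed log-probability."""
    vocab = {w for c in word_counts.values() for w in c}
    total_docs = sum(doc_counts.values())
    best_label, best_lp = None, -math.inf
    for label, counts in word_counts.items():
        lp = math.log(doc_counts[label] / total_docs)   # class prior
        denom = sum(counts.values()) + len(vocab)       # Laplace denominator
        for w in text.lower().split():
            lp += math.log((counts[w] + 1) / denom)     # per-word likelihood
        if lp > best_lp:
            best_label, best_lp = label, lp
    return best_label

docs = [
    ("foo foo foo best foo cheap foo deals", "spam"),
    ("why you should foo foo foo every day", "spam"),
    ("a practical guide to gardening tools", "ham"),
    ("notes from my hiking trip last summer", "ham"),
]
wc, dc = train(docs)
print(classify("foo foo amazing foo", wc, dc))           # spam
print(classify("notes from my gardening trip", wc, dc))  # ham
```

The catch, as others point out, is that spammers observe the filter and adapt their word distributions, which a static model like this can't handle.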


The height of the HN middlebrow dismissal.

Just do some Bayesian filtering, SEO solved. Can't believe those idiots at Google never thought of it.

Do you seriously think spam filtering is this easy? Just one bayesian machine away from solved?


I do a search for "how to barbaz a foo" on google.

I get the following domains as the top 5 results:

  1. allaboutfoo.com
  2. fooexperts.com
  3. infoonfoo.com
  4. foonation.com
  5. thefooblog.com
The linked pages above all have

  1. A table of contents section
  2. About 30 MB of AI-generated filler ("Why should you foo?", "How does foo affect your dog?", "Can you foo a foo?", "Why you should foo twice a day?")
  3. The same content plagiarized from each other's sites, just slightly reformatted/edited
  4. NO ACTUAL INFORMATION ON HOW TO BARBAZ THE FOO
It's 2022, and there's still no way to filter this garbage out?


I'm sure there is, but there is an enormous corpus of data, and any filtering you add can end up impacting the legitimate sites you're trying to get to.

SEO spam is constantly adapting to changes in filtering. If you start filtering sites that have a table of contents, SEO spam will remove their ToC in no time, but authentic blog posters probably will not.

Real content producers don't have the time to chase every change to Google's algorithm, but SEO spammers do. How do you filter out the spammers who adapt to changes without affecting the real content producers who can't afford it?

I promise you, the problem is harder than you realize. And sure as shit a lot harder than "just add a bayesian filter and these 3 hard coded rules I just came up with"


I hate it when pages DON'T have a toc. I wanna see the headings with links so I can jump to the part of the page I care about. Not scroll through hoping to find it...


> Does Google really want to achieve their stated mission?

Consider what happened to "Don't be evil".

> I would like to see Search Engine Optimization ended with extreme prejudice.

> destroying the "information" that they want to make "universally accessible and useful".

> Google Reader dying was a sad day indeed. So was usenet dying was a sad day. So was the town square post board, and before that the town crier.

See also what happened to Rap Genius.

> Talk is cheap.

But so persuasive and effective, something that is best controlled if one is in certain lines of work.


I'm quite happy with the results of you.com (not affiliated in any way, just a happy user).


I just hope that DDG are first to solve this SEO problem


They can't; it's just a UI for Bing. If you want something actually independent, use Brave Search.


Usenet isn't dead.


Where/how do you read it? I would like to go back to it.


It is interesting: most of the SEO nonsense is not because of Google. They say words to the effect of 'put the customer first and write for them' in their docs, and that makes sense. Analytics will pick up real content, not keyword-stuffed bulls4it written for robots.

Just for the lols, put together a site with content, submit it to Google and see how many clicks you get. Nobody is going to beat a path to your door, even if you are selling 100% pure gold blocks for tuppence a kilo.

I could go on AWS and build a website in minutes from a CMS, but in practice, even if you know you can do that, it actually takes a really long time to build and test a website. Even if you are not being held back by dysfunctional teamwork and can just do it all yourself, at your skill level, it will still take a while before that new website is actually working.

Even simple things like email. It probably just works fine out of the box. But if you then decide you want a proper mail server, thinking 'how hard can it be?', you will go down a rabbit hole. All you want to do is be able to communicate with people, but you have to learn a mountain of stuff that you would not need to know if you just went with Squarespace.

Google have also been very bad at semantic elements. If you mark up your docs in divs and spans then that is fine as far as Google is concerned. It is what it looks like, not what it is. You would think they would like documents that use the more meaningful elements such as article, section, aside and so forth, but no.

Because of this we have search engines reverse engineering the structure and content of pages using mystery AI. Nobody is encouraged to do the equivalent of using 'styles' in Word and they are doing the equivalent of using Word as a typewriter, manually bolding headings rather than using a heading style.

We could have an army of highly literate SEO people essentially editing content to make it better for the customer, better organised with sensible HTML elements and with better English (or other language).

But instead we have an army of people who think that repeating words on a page will bump you up the search rankings, which is at the opportunity cost of not writing better content.

WYSIWYG editors are also evil, based on an outdated metaphor. They don't enable you to logically group your content with sections that start with a heading for the screen reader, the section scoping the relevance of the heading.

We all have a love hate relationship with Google, but, if you look at it objectively, they are not really taking the web in the direction Tim Berners Lee had in mind at that content level.

I have found that the few blogs that I do read now are like discovering the old web, but it is something you have to consciously remember to do. Google Reader and RSS really was the magic ticket for making it work, not to mention web rings...

The thing is that there was never any pressure to get RSS working. In a typical agency a decade ago there would be a grand effort to put these social network shares on everything to drive traffic but the people calling the shots on that never really got RSS and never made developers get that working nicely. It was not in the project spec but the silly Facebook/Pinterest/Tweet links had to be done.

Hence the demise is not just on Google, it is with the snake oil cargo cult SEO belief system that has no idea what the Google algorithm is, but they think they can second guess it better than any developer.

If you can't do then teach. If you can't teach then teach Geography. This was once a popular saying. Don't judge.

The modern take: if you can't code then do website design. If you can't do website design then do SEO.

A SEO person has the time to look at various Google dashboards and to make recommendations. For example, your TTFB is a bit high, can we have that lower? The programming team can then say 'what do you think we have been working on for the last six months?'.

Given where we have come, I would like to see a future where everyone hosts their own stuff. By default your home wifi is a server load balancer with sufficient oomph to hold anything you want to share. Out of the box you just put in your URL and there you go. No need for AWS or others; your ISP puts your stuff on a CDN with your permissions. A sovereign web.


> Nobody is encouraged to do the equivalent of using 'styles' in Word

The main search engines support schema.org for marking up the "semantic" data associated with a webpage. Typically, this triggers "rich" search results. It goes underused because most commercial web sites don't want to make their semantics so transparent to outsiders, they would rather try to get increased traffic. But this also makes it a potential value-added for more grassroots-based websites that will generally care less about how much casual traffic they get, and more about discoverability for their content.
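For illustration, a hypothetical JSON-LD snippet using the schema.org vocabulary, the kind of markup that can trigger rich results (every value here is made up):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Why I keep a personal website",
  "author": { "@type": "Person", "name": "Jane Example" },
  "datePublished": "2022-08-04"
}
</script>
```

A personal site can drop a block like this in its page head; crawlers that support schema.org can then surface the title, author, and date without reverse-engineering the page layout.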


I would love it if we went back to more personal websites, but I don't see what the catalyst would be. Google Reader is a good point: it's been 10 years since it went away and we still don't have a replacement, in the sense of something that non-technical people at large would be aware of. Heck, most people don't know what RSS is. Look at Wix, WordPress,... - setting up your own site has never been easier, yet so few people do it.

In reality, we're moving to ever simpler UX solutions - TikTok doesn't even require searching for and connecting with friends before you get your dopamine kicks. The next big network will probably be one that's figured out how to continuously serve people personalized content without them having to actively like anything or make active choices (maybe using pupil dilation or whatever).


> TikTok doesn't even require searching for and connecting with friends before you get your dopamine kicks

It is a shame this is how it went. Despite the many ills of social media when I was coming of age, the connecting with friends and sharing random comments was what made it worth the time. It was just another medium of interaction in my friendships. If I went back on to facebook, I'd still find long threads from those formative days.

Now, the human touch is even further buried amid the endlessly funnelled consumption.

I want to be an optimist and say this will strike a peak and we'll go rolling back down the hill to user-centred websites again.


Why do social networks persist as “gateways” to information?

They offer a consistent experience. You’re less likely to be inundated with the random crap found on a random website. People prefer the mediocre, yet consistent, social media “meal” over the fancy meal with possible high degree of variability in quality.


I'll argue social networks are just a bit more than consistent gateways.

They are a supernormal stimulus. The algorithm feeds you and in many ways hijacks you. There's a reason social networks are dead set against chronological feeds.

They give you this steady drip of "engagement" which often means playing into strong (and negative) emotions.

You don't get that in a true straight content feed. Sometimes content is boring or just not excitable.


Also part of the issue: most new content that people look for isn't in text form anymore. It's mostly image and video. Yeah, they have YT, but that's not where the younger generations are. Google can't (or at least doesn't) index every 5 second tiktok or days and days of twitch VODs. Either for legal reasons or because it's just too much data that's too difficult to parse usefully. For Google's sake I hope it's not the latter.


"OG Social Network: Other People's Websites"... nope, not terribly OG really.

For those on the early internet there were things like mailing lists, Usenet, and others which allowed communities to form online.

For those not on the early internet there were still networks supporting social interaction that had well-developed communities. BBSs, FidoNet, and even consumer-oriented online service providers such as CompuServe existed before the internet was widely available to average consumers, pre-dating the existence of websites by some years; such services were absolutely social, with email, forums, and real-time group chat rooms. I developed many far-flung friendships through many of these social networking services... and all well before 1990.


Shameless plug: I have been building https://www.HeyHomepage.com as an easy-to-use website builder, with RSS/OPML built-in for what I call 'really social sites'. It has a style creator that inexperienced people can use, it doesn't depend on external code and even the visitor statistics work without GA.


Let's not forget newsgroups. Google had everything to match social networks many years in advance.

YouTube also had video replies and threaded discussions before they made it impossible to find your conversation and impossible to figure out who is responding to whom.

Plus was a hideous pile of trash. A 12-year-old could easily design a superior experience. The turd was optimized to gather personal data and didn't have anything else.

Wave and Knol looked interesting but who wants their precious workflow and writings to depend on a company that will inevitably kill it?

What a joke


The quote about Google reader really hit home but I had no idea they were still killing so many services: https://killedbygoogle.com/

Don't they want to be the biggest cloud provider by a specific year or something? Why would anyone use this company for something as important as cloud infrastructure?


I've been working on this project for a while, trying to get at this problem — https://mmm.page

Feel free to look at some of the pages that people have created: https://showcase.mmm.page


> Perhaps history will conclude killing Google reader killed Google, an overly-simplistic conclusion but poetically ironic nonetheless.

Or maybe it was an omen to their own decline.


I do feel like social media platforms are lowering the barrier to entry to “put stuff out there” in a way that is way more appealing to the average person. I do wish personal websites would come back but I don't really see a path towards that, unfortunately. There are tools out there that make building a website easier but a lot of them are either really technically challenging (I'm talking coding your website: HTML, JS, React) or paid (Squarespace), which I get the sense just can't compete with creating a “free” account on a social media platform if all you want to do is share some photos.

I would really like personal websites to take off but I’m not quite sure I see a valid reason or incentive for individuals to do so?


The article defines social networks as being the media silos of facebook, slack, discord, and newsletters. I'd argue that in the case of facebook, it's no longer really a social network. And in the case of newsletters it depends. However, if we focus on slack and discord, the important part of slack and discord that makes them 'social networks' is missing from the open web, and here's why:

Social networks are composed of links (dyads), so an extension to the web makes sense, but search engines, web crawlers, and even 'browsers' break those links as holding value as a dyad. I might know about your website, but I don't hold any special access, or privileged relationship to that site. Anybody browsing through can see the same content and have the same access. With slack and discord, access is limited to those with a specific relationship to the poster. The same is true of newsletters, but in those cases it can be just a consumer relationship, whereas in slack and discord the relationship is bi-directional.

In actuality, the article is arguing for the absence of a social network, a completely open network. That's a fine argument, but it's unlikely that most people want to share every aspect of their lives with the entire world, so the need for closed social networks will continue.


Websites aren't a social network because they're not a network at all. There's no genuine connection between the sites apart from the occasional accidental link, and the sites themselves don't interact in any structured way.

The 'semantic web' or RSS, as the article mentions, which are protocol-like solutions suffer from three things. The first one is that they're slow. Standards for large sets of people are by necessity developed through collaboration, and they cannot compete with the speed of private products. (point also often made by Moxie at Signal)

Secondly, these kinds of internet protocols rely on uniform, machine-readable content. Rich, personalized, app-like experiences are hard to index. Search engines are starting to suffer from this; it's a reason everyone puts "reddit" at the end of the search to get the opinions of real people, and why search engines are not super relevant in China, which leapfrogged the 'plain text' internet into private platforms.

Thirdly with protocol based solutions there's no good answer or even incentive to make sure the information is correct or protected, this was one of the major problems of the semantic web. Spammers, attention seekers, fraudsters and so on can game these systems with impunity, siloed company-controlled platforms can fend these bad actors off.

I think each one of these is a deep, structural issue and I haven't really seen a solution to any of these by people who still advocate for returning to individual sites, and I think the trend in what people consume points into the other direction.


Agree with your whole comment except the first sentence. When I was in a webring years ago, I'd interact with my ring-mates pretty often.


"...killing Reader to make Plus a success..." What? Do people really think that was why they killed reader?

I strongly doubt that there was much overlap between Reader users and the intended audience for Google+. It's not like your average social media user has ever even heard of RSS, let alone used it for anything. Can you imagine anyone on TikTok using RSS instead? Of course not.


Internet users in 2012 are not the internet users of 2022. 10 years of devolution have made anything harder than "swipe up for more" impossible for the majority to use. It wasn't always the case.


Social media is a very wide term, and arguably RSS is itself social media.

There was in fact an overlap between reader users and the intended audience for Google+, lots of bloggers were invited and joined Google+ when it was invite only.

I can definitely imagine people using RSS feeds for short portrait mode videos, just as people use RSS feeds for podcasts. Just as we have podcast players to consume this kind of RSS feed, we might have tiktok like apps to consume another kind of RSS feed.
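Mechanically there is nothing stopping this: podcast feeds already ship media via the RSS `<enclosure>` element, and a short-video feed item would look much the same. A hypothetical item (URL, sizes, and dates are made up):

```
<item>
  <title>Clip #42</title>
  <enclosure url="https://example.com/clips/42.mp4"
             type="video/mp4"
             length="1048576"/>
  <pubDate>Thu, 04 Aug 2022 12:00:00 GMT</pubDate>
</item>
```

A podcast-style client could fetch items like this and present them in a swipeable feed; the format is already there, only the player UI is missing.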


TikTok changed all of that. Now we have networks of interest. The individuals do not matter. Labels of their content are now the drivers.


I don't see any reason why we can't have a tiktok like recommendation engine for the web. (In fact I think that's what bytedance started with[1]? Would be very interested in using an english version of that, maybe RSS based.)

[1]: https://en.wikipedia.org/wiki/Toutiao


No idea about TikTok (never used it, or Twitter/Facebook/MySpace/etc.); however, that sounds similar to the "groups" feature of Identi.ca (now GNUSocial).

I always preferred that way of interacting, since I prefer to see what anyone says about a topic, rather than everything said by some individuals.


Or subreddits on Reddit, for that matter.


TikTok is not personal for most people, it’s just a way to consume content.


A lot of comments here like "we tried this", "people didn't want this" and "that's not where things are going".

As far as I see it, the timing is perfectly ripe right now for change. People are on the lookout for nutritious, good-for-you alternatives to the sugar diet of social media we're being served. It's not everyone, but no longer just tech-enabled folks either. Just because we've been going in a bad direction doesn't mean that's how it will always go.

Maybe I'm being optimistic, but isn't that what's generally needed for positive change?


Yeah, I think it's a serious trap to only look for mainstream appeal. The long tail has been severely neglected the last 10 or so years, and I think this is an opportunity to take part in creating something new. Like, who cares if it doesn't become the next Facebook. Maybe that's a good thing.


Agree. It's always a safe prediction that the status quo is what the future looks like but thinking like that only serves to stifle innovation.

I for one am very interested in alternatives, and here to support anyone helping to dig us out of where we've found ourselves.


Bring back Web Rings and Link Exchange.


I completely agree with the post, however blaming slack and discord for being closed siloes of information isn't entirely true when compared to the OG... IRC.

Consider IRC logs. If you didn't have a bot recording conversations and publishing them on a website, you had no records either.

Sure they're currently recorded and siloed, but a bot can be written to record slack/discord etc. and publish on a website just the same.


I made basically the exact same point as this article does, but here on HN, almost eight years ago:

https://news.ycombinator.com/item?id=8550133


Then HN/Reddit became the social aggregator Google Reader never got to be.


Has a self-selection bias though. Those who don't think, write, or take pics won't have any reason for a site.

I only know a few people with a website, but all of them use some social media.


absolutely love the site. Also love the stats sections. I used to have it on my personal site too https://podviaznikov.com. Going to put it back.

I used to have stats about books I've read and trips I've taken. Definitely putting that back. Also will add that feature to my CMS.


Yeah a social network that 0.05% of the population can realistically set up is not really all that social.


The OG social network is GeoCities.


I will just point out the mild irony of most of us discovering this blog post... on HN


It's not that ironic, because the fuel for our comments is all the websites. We use HN to discuss but the vast majority of content is outside HN.

Unless you consider HN comments to be the actual content, like many people, me included.


What’s funny in this thread is multiple people saying they are working on it. Yet HN is a common place to see social media bashing and comments about people leaving X platform, etc. I doubt the demographic here knows enough about say teenager social media usage (for example) to build The Next Big Thing in social media.


Well, HN comments are indexed, and HN itself links to >90% third-party websites. Aggregators like HN would likely be part of an (itself unlikely) resurgence of individual websites. It's like an interactive version of the old Yahoo.


A bit. It all depends on how you use HN. Every time I find some interesting site I add its feed (RSS/Atom) to my reader (QuiteRSS). Usually stuff on HN has a feed.


Right? That blog isn't even in a webring.


for OOG, I'd say mailing lists: SF-Lovers started ca. 1975, and I'm still on a few. (although the subscriberships tend to now be, compared to their heydays, in pour-out-a-40 mode)


Bring back Geocities!


Right? Browsing https://randomgeo.city/ it really feels like a social network, so many random life stories on there.


Blogger/RSS is all we/Google ever needed.


The problem of the "walled garden" also seems to be a solution to the "bot apocalypse" -- unless you put your site behind a paywall or a bot-proof login, it will be swarmed with automated traffic.

It's unfortunate that the defences, the "hardening" of sites leads to silos, but within these silos the very same forces (attention mining, etc) are leveraged for profit.

Beyond the wall the scavengers will eat you, within the walls you're put to work on a track.


Remember website rings? Lol


You down with OPW (Yeah you know me), you down with OPW (Yeah you know me) !


The OG P2P network...


Other People's Projects?

i'm down with OPP



