> Or we are talking about different eras. I'm on about early 90s.
I think you're misremembering then. There _was_ no web development to speak of in the early 90s. The web was largely a niche technology until the mid-90s. Mosaic released in January '93, Netscape in October '94, and IE in August '95. By the end of '93, there were a total of 130 websites[1], most of them from universities and research centers. By the end of '94, a whopping 2,278 websites. JavaScript first appeared in September '95 (Netscape), and CSS in August '96 (IE).
> You didn't need Javascript most of the time and CSS incompatibilities were easy to remember (I'm talking about cognitive overhead here)
Depending on what you're building, you still don't need JS most of the time today. The difference is that today all browser implementations are ECMAScript compliant, and the core functionality is much more capable than in the 90s, so you can get by with just sprinkling JS where and when you need it, without resorting to frameworks, build tools, libraries, and any of the complexities commonly associated with modern frontend web development. This is, without a doubt, an objectively better state than what we had in the 90s.
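Just to make that concrete, here's the kind of "sprinkle" I mean -- a rough sketch, with a made-up /api/comments endpoint, using nothing but standard DOM APIs and no build step:

  <!doctype html>
  <ul id="comments"></ul>
  <button id="load">Load comments</button>
  <script>
    // Plain DOM + fetch; no framework, no bundler.
    document.querySelector('#load').addEventListener('click', async () => {
      const res = await fetch('/api/comments');   // hypothetical endpoint
      const items = await res.json();             // assume it returns an array of strings
      const list = document.querySelector('#comments');
      list.innerHTML = '';
      for (const text of items) {
        const li = document.createElement('li');
        li.textContent = text;                    // textContent avoids HTML injection
        list.appendChild(li);
      }
    });
  </script>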
Of course, actually relying on external dependencies would make your life easier, so the difficult task is picking the right technology to use from a sea of poorly built and maintained software. This is the drawback of a platform exploding in popularity, but it doesn't say anything about the web itself.
As for CSS, how can you honestly say incompatibilities were easy to remember? Netscape was initially pushing for its own competing styling format, JSSS[2], and it didn't officially support CSS until version 4.0 (June '97). Even then, not all CSS properties were supported[3]. So it wasn't even a matter of remembering incompatibilities; developers literally had to target specific browsers, and even specific versions of browsers. Vendor prefixes were required for pretty much everything, and are still used today, though thankfully, core CSS features are widely supported, and they're only needed for advanced features. There's no way that all of these incompatibilities were easier to deal with in the 90s.
> That's in the region of 10 years after when I'm talking about. A completely different era. By that point the web had already turned into the shitshow it is now.
jQuery appeared precisely as a response to the lackluster state of JS in browsers, and to make development easier by not worrying about browser incompatibilities. My point is that up until then, web development _was_ a shitshow.
> It's not a sentiment. It's a fact
Funny how I can disagree with a "fact" then...
> The hard part is finding something that will still be around in 5 years time.
It's really not, unless you're chasing the latest hype train. jQuery is 17, React is 10, Vue is 9, etc. And like I said, you don't strictly need any of it. If you write standards-compliant HTML/CSS/JS, it will serve you for decades to come with minimum maintenance. You've been able to do the same since arguably the late 2000s.
> Who writes plain HTML and JS?
Many people do.
> There's so much bloat required to get anything to look modern that nobody writes plain web sites any longer
That is factually not true.
> That's literally how sites were originally written. It's not a new invention
I'm not saying it is. My point is that you can still do that today.
> I think you're misremembering then. There _was_ no web development to speak of in the early 90s. The web was largely a niche technology until the mid-90s. Mosaic released in January '93, Netscape in October '94, and IE in August '95. By the end of '93, there were a total of 130 websites[1], most of them from universities and research centers. By the end of '94, a whopping 2,278 websites. JavaScript first appeared in September '95 (Netscape), and CSS in August '96 (IE).
My first website went public in 1994. Before then I was writing stuff purely for a private intranet. So I'm definitely not misremembering.
By 1995 I had released an online RPG (it was very rudimentary but it worked).
By around 1997 (give or take, this was a hobby project so cannot remember the exact year) I had a full 3D web site available via VRML. Wasn't much of a success because most people didn't have 3D capable graphics cards back then. I think it was a couple of years before 3D accelerators became the norm.
1998 I was experimenting with streaming HTML chatrooms (that required a lot of hacks to get working because we are talking pre-AJAX here) and bots written in Perl.
For most of the 90s I was on the cutting edge of web technologies. So I remember the era well.
> This is, without a doubt, an objectively better state than what we had in the 90s
Is it though? Better capabilities don't always equate to something being objectively better. Particularly if those capabilities are a complete clusterfuck to code for, as the current web standards are.
True elegance of an ecosystem isn't about raw capabilities, else we'd still be writing everything in assembly. It's about the ease with which you can accomplish a task. I'd argue that the current web isn't elegant in the slightest. A polished turd is still a turd.
> Of course, actually relying on external dependencies would make your life easier, so the difficult task is picking the right technology to use from a sea of poorly built and maintained software. This is the drawback of a platform exploding in popularity, but it doesn't say anything about the web itself.
The problem isn't the choice. The problem is that "the right technology to use" is more about what's in vogue at the moment than it is about what's mature.
When you look at other popular technologies, you still have choice, but there are also mature stacks to choose from. The moment anything web-related becomes "mature" (and I use this term loosely here), the next generation of developers invent something new.
> jQuery appeared precisely as a response to the lackluster state of JS in browsers, and to make development easier by not worrying about browser incompatibilities. My point is that up until then, web development _was_ a shitshow.
It was. And it's a bigger shitshow now. Glad you finally understand the point I'm making.
> Funny how I can disagree with a "fact" then...
That doesn't mean I'm wrong ;)
> It's really not, unless you're chasing the latest hype train. jQuery is 17, React is 10, Vue is 9, etc. And like I said, you don't strictly need any of it. If you write standards-compliant HTML/CSS/JS, it will serve you for decades to come with minimum maintenance. You've been able to do the same since arguably the late 2000s.
jQuery isn't recommended any more. React isn't popular any more. Vue is probably the only item there that has merit and that's still less than a decade old.
You talk about "decades" and cannot list a single framework that is still in widespread use and more than 10 years old.
> Many people do.
Many people also solder their own CPUs. But that doesn't mean anyone does it for stuff that actually matters.
> That is factually not true.
Yes it is. Simply saying it isn't doesn't disprove my point.
> I'm not saying it is. My point is that you can still do that today.
And you can still hand-solder your own CPU today. But that doesn't mean anyone does that for professional sites.
The only reason people stick up for the current status quo is because they either don't know any better or have been Stockholm syndromed into living with it.
As someone who's written software in more than a dozen different languages for well over 3 decades, every time I come back to writing websites I always feel disappointed that this is what we've decided to standardise on. Your points that it's capable aren't wrong. But that doesn't mean it's not still a shitshow. Raw capability alone simply isn't good enough -- else we'd still be writing all of our software in assembly.
So yours was one of the first 2,278 websites? Congrats.
I don't see how any of your accomplishments are relevant, but thanks for sharing.
So your point is that the web when JavaScript and CSS were in their infancy, before web standards existed and were widely adopted, before AJAX and when you had to use "a lot of hacks" to implement streaming... that _that_ web was somehow easier to work with than the modern web? That sounds delusional.
VRML, along with Java applets, ActiveX, Flash, and a myriad other technologies around that time were decidedly not web-native (i.e. a W3C standard, implemented by all browsers). They only existed because the primitive state of the early web was incapable of delivering advanced interactive UIs, so there were competing proposals from all sides. Nowadays all of these technologies are dead, replaced by native web alternatives.
> Better capabilities don't always equate to something being objectively better. Particularly if those capabilities are a complete clusterfuck to code for, as the current web standards are.
Which particular standards are you referring to? Native HTML5/CSS3/ES2015+ are stable and well supported standards, and you've been able to target them for nearly a decade now. Their capabilities are obviously much greater compared to the early web, but this is what happens when platforms evolve. If you dislike using them, then I can't convince you otherwise, but I'm arguing against your point that the state of the web was somehow better in the 90s.
> The problem isn't the choice. The problem is that "the right technology to use" is more about what's in vogue at the moment than it is about what's mature.
That's a problem caused by the surrounding ecosystem, not the web. How is this different from VRML being replaced by X3D in 3 years? The good thing is that today you can safely rely on native web technologies without fearing that they'll disappear in a few years. (For the most part. Standards still evolve, but once they're widely adopted by browsers, backwards compatibility is kept for a long time. E.g. HTML4/CSS2/ES5 are still supported.)
If you're talking about frontend frameworks and libraries, again: they're not a standard part of the web, and you don't have to use them. If you do, it's on you to manage whatever complexity and difficulty they bring to your workflow.
> True elegance of an ecosystem isn't about raw capabilities, else we'd still be writing everything in assembly. It's about the ease with which you can accomplish a task.
I fail to see how all the improvements of the past 20 years made things more difficult. The capabilities have evolved because user expectations have grown, and complexity arises from that. But if you were to build the same web sites you were building in the 90s with modern technologies, like your streaming HTML chatrooms site, you would find the experience vastly easier and more productive. This is an objective improvement.
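To make that concrete: the streaming half of a chatroom today is a few lines of Server-Sent Events, no hacks required. A rough sketch (the /chat/stream and /chat/send endpoints are invented):

  <!doctype html>
  <ul id="chat"></ul>
  <form id="send"><input name="msg"><button>Send</button></form>
  <script>
    // Server-Sent Events replace the long-lived-connection hacks of the 90s.
    const log = document.querySelector('#chat');
    const stream = new EventSource('/chat/stream');   // hypothetical endpoint
    stream.onmessage = (event) => {
      const li = document.createElement('li');
      li.textContent = event.data;                     // each SSE message is one chat line
      log.appendChild(li);
    };
    document.querySelector('#send').addEventListener('submit', (event) => {
      event.preventDefault();
      fetch('/chat/send', { method: 'POST', body: new FormData(event.target) });
      event.target.reset();
    });
  </script>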
> jQuery isn't recommended any more.
Because it's not needed anymore, because JS has evolved leaps and bounds since 2006, and implementations in all browsers are standardized. It's still the most popular JS library by far, and used by 77.3% of all websites[1].
> React isn't popular any more.
It's in the top 10 most popular JS libraries. And how come you're judging based on popularity anyhow? Above you were criticizing choosing technologies based on what's "in vogue at the moment" over "what's mature". React is a _mature_ UI library, and is a safe choice in 2023, unless you're chasing the latest hype train.
> You talk about "decades" and cannot list a single framework that is still in widespread use and more than 10 years old.
JavaScript frameworks as a concept are barely a decade old. React isn't a framework, it's a library. Like I said, jQuery is the most popular library and is 17 years old. Underscore (2009), Bootstrap (2011), Lodash (2012), and many more, are still in widespread use today.
But my point is that _today_ you don't strictly need any of them to build advanced interactive experiences. If you do want to, though, there are many to choose from that simplify development of modern UIs, without being a "clusterfuck" to work with IME. htmx, Lit and Tailwind are all lightweight, well maintained, and help with quickly iterating without resorting to full-blown frameworks. If you do want a framework, Svelte is now 7 years old, so quite mature, and is very pleasant to use.
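For a taste of the htmx style, something like this is all it takes to pull in a server-rendered HTML fragment without writing any JS yourself -- a sketch, with a made-up /messages endpoint that would return an HTML fragment:

  <!doctype html>
  <script src="https://unpkg.com/htmx.org"></script>  <!-- or a vendored copy -->
  <div id="messages">No messages yet.</div>
  <!-- hx-get/hx-target/hx-swap are core htmx attributes -->
  <button hx-get="/messages" hx-target="#messages" hx-swap="innerHTML">
    Refresh
  </button>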
> Yes it is. Simply saying it isn't doesn't disprove my point.
I thought the fact that you're reading and typing this on a forum built with simple HTML, CSS and minimal amounts of JS would make this self-evident. (The fact it uses a bespoke backend is irrelevant; this could just as well be served by a mainstream backend stack.)
But to save you a web search, here are other examples courtesy of ChatGPT[2].
> As someone who's written software in more than a dozen different languages for well over 3 decades, every time I come back to writing websites I always feel disappointed that this is what we've decided to standardise on.
Nice humblebrag again, but if you'd be willing to accept that the web has grown exponentially since the days you were building websites before JavaScript and CSS existed, that there are orders of magnitude more web developers and software now than back then, and that the core web technologies are more mature and stable than they've ever been, then you'd be able to see that the status quo is not so bad.
I have more issues with the modern state of centralized mega-corporations and advertising ruining the web than anything I can complain about the technology itself. But that's a separate topic.
> So your point is that the web when JavaScript and CSS were in their infancy, before web standards existed and were widely adopted, before AJAX and when you had to use "a lot of hacks" to implement streaming... that _that_ web was somehow easier to work with than the modern web? That sounds delusional.
My point was that the amount of hacks required these days has grown exponentially.
> VRML, along with Java applets, ActiveX, Flash, and a myriad other technologies around that time were decidedly not web-native
Of course they weren't. I never implied otherwise.
> Nowadays all of these technologies are dead, replaced by native web alternatives.
Indeed. Technologies that are exponentially harder to write the same code in. Hence my point: modern web tech is a shitshow.
> Which particular standards are you referring to? Native HTML5/CSS3/ES2015+ are stable and well supported standards, and you've been able to target them for nearly a decade now. Their capabilities are obviously much greater compared to the early web, but this is what happens when platforms evolve. If you dislike using them, then I can't convince you otherwise, but I'm arguing against your point that the state of the web was somehow better in the 90s.
You're fixated on that point, and it's not what I said. I said the web was easier to grok in the 90s and that it has only gotten worse over time. Which is a fact.
I also said the current web is an unfit clusterfuck that people have been Stockholm syndromed into believing is good. Everything you've posted thus far reinforces that Stockholm syndrome point.
> > React isn't popular any more.
> It's in the top 10 most popular JS libraries. And how come you're judging based on popularity anyhow? Above you were criticizing choosing technologies based on what's "in vogue at the moment" over "what's mature". React is a _mature_ UI library, and is a safe choice in 2023, unless you're chasing the latest hype train.
I haven't worked with a single engineer who hasn't bitched and moaned about React. And I've managed a lot of engineering teams over the years.
Vue is a different matter.
> JavaScript frameworks as a concept are barely a decade old. React isn't a framework, it's a library.
It's both. The term "framework" has a pretty well-established meaning in software, and React falls under that heading quite comfortably. What's happened, and why you're confused, is that kids have overloaded the term with "web framework" to mean something more specific. React on its own isn't a "web framework" in the trendy web sense, but it's still 100% a "framework" in the stricter software development sense.
This is actually another great example of the lack of consistency in the web ecosystem.
> But my point is that _today_ you don't strictly need any of them to build advanced interactive experiences.
You never had to. You're making another strawman argument: you're not only claiming that I said you need these frameworks (you don't), but also making it sound like this is something that's only come about because of the modern web (which isn't true).
> I thought the fact that you're reading and typing this on a forum built with simple HTML, CSS and minimal amounts of JS would make this self-evident. (The fact it uses a bespoke backend is irrelevant; this could just as well be served by a mainstream backend stack.)
HN is far from your typical website. lol
> Nice humblebrag again, but if you'd be willing to accept that the web has grown exponentially since the days you were building websites before JavaScript and CSS existed, that there are orders of magnitude more web developers and software now than back then, and that the core web technologies are more mature and stable than they've ever been, then you'd be able to see that the status quo is not so bad.
It's not a "humblebrag", it's an illustration that my opinion comes from years of experience using a multitude of different technologies. Honestly, I think you need to diversify your experience too because your comments fall firmly into the Stockholm syndrome bracket I described by the fact that seem completely unwilling to accept that we could have all the same power of the current web but massively more simplified and elegant if we were to redesign things from the ground up. There are so many footguns that developers need to learn simply because of the way how the web has evolved. And all you keep harping on about is that "its powerful" -- sure. But so is assembly. Yet literally no-one advocates writing commercial desktop software in assembly.
The problem here is trying to convince someone that the domain which they earn their living from is a shitshow, is simply always going to be met with opposition because you have no impartiality. Whereas people like myself and the OP do. And that's why we make the comments we do when we say that the web is unsatisfying to develop against.
[1]: https://en.wikipedia.org/wiki/List_of_websites_founded_befor...
[2]: https://en.wikipedia.org/wiki/JavaScript_Style_Sheets
[3]: https://en.wikipedia.org/wiki/CSS#Difficulty_with_adoption