Websites have evolved back to static HTML/CSS/JS files (paramaggarwal.substack.com)
448 points by diablo1 on March 8, 2020 | hide | past | favorite | 299 comments



There's nothing wrong with static HTML/CSS/JS. There's also nothing wrong with a rich SPA. And again, there's nothing wrong with using some kind of dynamic server-side HTML/CSS/JS presentation (like WordPress).

Where there is a problem is the culture of software engineering, and the tendency to select the newest technology stack of the day for inappropriate applications. I think any seasoned software engineer has acquired the skill of successfully selecting the correct architecture for the project goals (even if it is _boring_ to actually work on).

Whereas, take the junior to mid-level developer streams. Every time there's an opportunity to learn something new and break the grind of doing [X] for the past [Y] months, there tends to be unwarranted justification for why "we should use this new thing," despite there already being a well-known, battle-tested solution sitting right there.

For example: We need to build a basic company website. Someone from reception needs to be able to update the contact form and trading hours around five times per year. Someone from marketing needs to update product brochures once a month.

Seasoned engineer: Let's use any well-tested CMS such as WordPress and install a caching front-end.

Junior - mid engineers: EW WordPress! Doesn't that use PHP? Hahaha. Hahaha. Why on earth would we want to be known for using _PHP_!?!? We'd be _far_ better off just creating a front-end in React and then writing a microservice that will allow our staff members to update content. If marketing says SEO is necessary, we can _simply_ add a server-side rendering layer for the React front-end and deploy it onto a second microservice. And to make sure it all works, we can subscribe to some monitoring services and write a full test suite and CI pipeline. Everyone can install monitoring on their phones so if it ever goes offline, someone can immediately sign in remotely and debug which microservice is broken.


We're back to "The boring solution has a long list of inconveniences and uglinesses that everyone knows about. If we chose the new solution, nobody would know where the bugs are and we can have exciting new ones."

Really the problem is that the standard solution hasn't improved. Backends are still a hassle in the 2020s. People keep reinventing CGI.

Every now and again I think "why doesn't someone just produce a simple standardised solution that would be cheap to host and easy to write", and have to lie down until the madness goes away.


> "The boring solution has a long list of inconveniences and uglinesses that everyone knows about. If we chose the new solution, nobody would know where the bugs are and we can have exciting new ones."

Is this C vs Rust again? (it's a joke, but go ahead, downvote).


Oh, it's practically general purpose life advice. You could even apply it to some people's second marriages.


Rust is great, but it's so young that even the ugliness we knew about from a couple months ago is better known than the ugliness of today. The latest stabilisation of async/.await has me hard at work forgetting all the pre-futures-0.3 stuff I'd learned.


I have a hard rule to downvote whenever anyone asks to be downvoted or complains about being downvoted. Enjoy!


Backends are still a hassle? In what way do you feel Spring (Boot), ASP.NET Core and the like don't solve the problem adequately?


You still have to write one at all? And it's an entirely separate process from the frontend, and usually in an entirely different language? And many places hire different developers to do it?

Maintaining a static site is easy and places like github pages will host it for free; you can put a "SPA" in there if you like for no additional cost. But as soon as you need a backend it gets more complicated and expensive. Still cheap in absolute terms, but relatively more complicated. Oh and you need to keep an eye on it for security. I know the long list of reasons for this situation, but I'm questioning why it has to be this way.

Back in the 80s there were proprietary monolithic single-user systems for producing what were effectively interactive databases, such as dBase and FoxPro. There was also the legendary programmable hypertext system HyperCard. And yet developing a small web application to do something that could be knocked up in a few screens of dBase or HyperCard is much more time-consuming.


I use Dokku, which is like Heroku, but on my own server. To set up a backend, I just write the code with whatever I want (I use Elixir) and use a local DB to test, using an env var to get the DB host. Then I create an app on Dokku, set up a server, and deploy:

Dokku install:

    $> wget https://raw.githubusercontent.com/dokku/dokku/v0.19.13/bootstrap.sh;
    $> sudo DOKKU_TAG=v0.19.13 bash bootstrap.sh
Now, go to the server's IP or the domain you have set up previously and configure it with an SSH key and some other options.

Create an app:

On the server:

    $> dokku apps:create application_name
    $> dokku postgres:create application_db
    $> dokku postgres:link application_db application_name
Deploy the app:

On the client:

    $> git remote add dokku ssh://dokku.servername.tld:22/application_name
    $> git push dokku master
Once more on the server, to get TLS certs set up and auto-renewal (only the first time per application):

    $> dokku letsencrypt application_name
    $> dokku letsencrypt:cron-job --add
Then it just builds automagically, and as long as you use some sort of tool to coordinate your database schema (I use ecto migrations), you don't need to do any more than this, really. You might have to do some limited configuration in your repo, but not much (add a .buildpacks file, then maybe a buildpack.config and an app.json).
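For illustration, an `app.json` along those lines might look like this; the `predeploy` migration command is an assumption for an Elixir/ecto app, not something taken from this exact setup:

```json
{
  "scripts": {
    "dokku": {
      "predeploy": "mix ecto.migrate"
    }
  }
}
```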

For a static site, you literally push a repo with a `.static` file and an `index.html` in the root dir (you can do more complicated things, too).

It's not quite as powerful as k8s or whatever, but it can run small to medium sites easily. It also has an alpha-quality scheduler to push to k8s that I haven't tried yet, but it should work as well as the Docker backend eventually (and I'll switch to that when it's fully capable).

With DigitalOcean, I hear you can do 1-click Dokku servers if you don't want to do it yourself (I haven't tried this).


Give someone an Excel file and they are perfectly happy. Give them a web app to do the same process and suddenly their "needs" force you to abstract all of the concepts into something entirely unrecognizable from the start.

It seems to me I see a growing distance between how data is stored in a SQL DB and how the data is actually rendered or represented in the application.


Oh gosh, Spring Boot is such a dumpster fire. Don't even pretend it's good!


For this typical use case, a statically rendered page with some kind of editing interface would work great, no need for databases.


Why is this downvoted? I'm a big fan of PHP, but I feel something like Netlify CMS + Jekyll on Netlify is appropriate for this. It would be super fast (statically generated and on Netlify CDN) and secure (no database or app scripts) and includes an editing interface.

With WordPress I don't worry about the PHP part, but about the tendency of people to bloat it by installing plugins and not updating it often enough.


How does something like Netlify CMS work for an end user?

I mean how do you skip the "storing some sort of state in DB including a users table" step?

It is very typical for a small to medium business site client to want a minimal CMS and ability to add a few more users with edit rights.

With Netlify CMS it seems the build happens from a git repo so wouldn't it require for end user to learn git?

I love how fast and easy it is to publish static sites on Netlify but I do not see how one could host client sites on Netlify without requiring constant hand holding.


Netlify CMS requires adding a few static assets to your site that handle authentication and CRUD actions. This then updates your git repo. The posts are in Markdown format, so it may not be the most user-friendly, but the end user would not have to learn git.

https://www.netlifycms.org/docs/add-to-your-site/



https://www.contentful.com/ works well for this.

Content, copy, even menu items, metadata etc can be edited by multiple normal people. Translations too

A static site can be set up to rebuild when someone updates content.

You could also fetch content with JavaScript on the page. That would work if it was just opening hours as in the parent example


Wordpress gets a hard pass for me specifically because (a) it has a plugin system that I've seen explode in practice far too often to trust, and (b) the task specifications provided by OP include marketing and management access to the editor. There is no stronger force behind pet features and rule breaking. I'd choose something easy to work with, but with an intentionally limited scope over anything with shiny bells and whistles in this specific case.


Netlify and Jekyll are more oriented toward developers, not Janine in reception.

Regarding WordPress and plugins:

    <?php define("DISALLOW_FILE_MODS", true);


I'd rather Janine ping me three times a year to change some string than deploy PHP with mysql.


PHP with MySQL is amazing. I don't get the "I'd rather have Janine ping me three times a year". What if you leave work? What if you are too busy with your next hobby project and don't give a shit about Janine? What if she wants to update brochures every now and then and your manager says you're too expensive to help Janine?

Tech is about automation. PHP + MySQL still serves well in this era. Wikipedia still runs on it and it's a pretty fantastic and responsive site. So does Facebook, to some degree, I suppose.


>What if you leave work?

The next dev will have an easier time changing a string in plain, straightforward code or a static site generator than migrating plugins from PHP 5.3 or whatever.

>Tech is about automation.

Tech isn't about "automating" things in ways that ultimately cause you to do more work to maintain the status quo. Tech isn't about spending a week to automate something that takes you one minute three times a year.


>Seasoned engineer: Let's use any well-tested CMS such as WordPress and install a caching front-end.

I thought that was your junior response... If you update it once a month, WordPress seems seriously over-engineered for something managed internally. A static website + a 1h ticket once a month seems much better.

You'll save on performance and security, and over a few years it would take less work time than running a full WordPress.

If it's updated more than once a month, WordPress may start to look more attractive, but even that is doubtful.


Then you find out that you need to run the site HA for reasons, and you learn just how much of a shitshow that is with WordPress.

Sincerely, someone with way too many Wordpress sites.


I'm not sure it's junior/senior so much as personal taste: Plenty of junior engineers use Vim and Emacs, neither of which are trendy in any meaningful way, and I'm sure there are senior engineers who use Visual Studio with all of its whiz-bang. Similarly with websites, some have ideas which need AJAX and, possibly, WASM, and others have ideas which need static sites or, at most, "slightly dynamic" websites with only a little JS mutating the DOM.

More specifically, it isn't always up to the engineers:

> We need to build a basic company website.

Marketing is inherently trendy and the people designing the website, as in dictating what it needs to do and look like, are the marketers, who are influenced by the "cool" websites they've seen. Marketers follow trends. It isn't really negotiable in that profession. The engineers do their best to deliver on what marketing wants, and if that requires a WASM re-implementation of NyanCat to prove that we're retro-hip and DownToFax, so be it.


> Plenty of junior engineers use Vim and Emacs

Pretty sure that's not the case anymore. I have been a developer for 3 years and have never ever seen anyone use either of these for work.


Well, since you've only worked for 3 years I bet you haven't seen many different workplaces either. I've been a professional developer for 3 years and 80% of the people I know use either vim or emacs. The difference between you and me however is that I'm aware that I have been to very few workplaces and don't assume that most people use vim or emacs because of my limited experience.


The reason for choosing new technology on the developer's part is job security. If everything were still PHP, you would have outsourced his job to India ages ago. The fact is that developers need to do this because of the ding-dong MBAs in management.


This feels like a response from a seasoned backend engineer. Seasoned frontend developers value user experience and design — thus the tendency to go "back to static" and rarely back to WordPress.


To me WordPress sounds exactly like a suggestion you'd get from someone who doesn't have to maintain it, i.e. a frontend dev.


How so? I maintain two WordPress websites and really "maintaining" means I get an email telling me it automatically updated every so often. I spend more time on it restarting the server because it was updated to a newer kernel (it runs on Debian Stable, to give you an idea for how often that happens).

I do agree that if you need plug ins, that appears to get messy very fast, but the default WordPress site does everything I want and requires very little for maintenance.


On the other hand, Wordpress is for the reality that most people don't have a developer on hand (nor should need one) to update their website.


I believe the great-grandparent used WordPress as an example for his/her argument. You can use any other battle-tested CMS if WordPress is too heavy for your needs.

For example, I like Ghost (https://ghost.org/) for my personal needs.


Huh... Maybe I haven't done anything too complicated in WP, but I didn't find it bad.

Worst case, I use a dedicated PHP page and MySQL. But in reality I think I usually put my code in functions.php(?) and it works fine.


Have fun getting junior engineers interested in PHP, an objectively gross language. You'll get a stable system to begin with, but nobody younger than 45 will want to touch it. It also has a large attack surface and a catalogue of security exploits that you buy into when you inevitably use plugins.

This article is about static HTML/CSS/JS. You can accomplish a good, modern CMS in halfway-decent languages using netlify and any static site generator framework you want without making your devs miserable.


See, the beauty of the solution suggested is that no one has to write any PHP (You could probably avoid even seeing any, if that's your goal).

Wordpress does what is required out of the box.

In fact, you don't even require a developer to do it. It's a point-and-click operation: select a theme, install a cache add-on, add an SEO add-on, and hell, there's probably an "office hours" plugin as well.

If the receptionist isn't a technophobe s/he can probably do it themselves.

If you want to assign resources, get a designer in to make it look pretty. Wordpress has everything else covered.


I know a number of products from venture-backed startups written in PHP where most of the team is between 25 and 35.


I get what you say, but WordPress is a weak example. It's so fragile security-wise (especially if some plugin holds you back on updates).


Another problem is that Google and some other search engines have only just started interpreting JS, and since they determine the fate of most projects based on their search index, you'd better use a design that Google is able to turn into indexed pages.


Google has been able to parse and execute JS for at least six years; I remember having a complex SPA in 2015 that didn't pose any problem for Google.


I can't see JavaScript SEO ever being a thing, especially when most people these days just add SSR to their stack and call it a day


Except that usually someone from marketing or reception still won't touch the website, because they think they can save time by delegating it back to the developers. WordPress has some terrible design decisions baked in, and yes, there are many more elegant and long-term effective ways to build a company website (it's not just about getting it online in 30 minutes).


This is a cultural issue within a company that has nothing to do with a suitable technical solution.


So true about websites. Recently I don't even bother with hosting; I choose whatever is simplest to manage and to teach others. For a simple page there are services like https://tilda.cc . Gatsby is a marked improvement over WordPress. And Kontent.ai is just a great editing experience with a do-it-as-you-want frontend.


The issue is that younger people want to use and learn the newest thing to advance their careers, while us old-timers just want to get shit done. You can't blame them; we were all like that at one point. It's why we got into programming to begin with.

This is one reason why I fully believe companies should give each developer a half-day or a day to work on something new for the company, so that developers get to scratch that itch without forcing something new into production just for the sake of using something new.


My thought exactly.

Also, if companies stop requiring 4 years of experience in a technology that came out 3 years ago in their job description that would also help. Instead, let people transfer from similar technologies and let them learn it on the job.


> I think any seasoned software engineer has acquired the skill of successfully selecting the correct architecture for the project goals (even if it is _boring_ to actually work on).

Some experienced developers get it, but I have argued with plenty who are busy keeping up with the hype cycle and resume driven development.


WordPress is awesome except for Gutenberg. They shouldn't have gone down that route.


Users love it though.


If by "users" you include developers who spend most of their time with the system, the two thousand 1-star reviews ¹ will beg to differ.

¹ https://wordpress.org/plugins/gutenberg/#reviews


No, by "users" I meant... users. Those developers have basically nothing to do with using Gutenberg to write content, why do you think their opinion should prevail on that of actual users?


It's also users not just developers if you read the reviews.


Apologies to be so late to the discussion (I come from HN search as I'm searching for a good solution to build a simple website).

Why not use Wordpress --> static generator --> Netlify (aka free hosting)?


>Junior - mid engineers: EW WordPress! Doesn't that use PHP

I related too hard.

I had an intern say something like this but didn't give a reason. So I asked "why?". He couldn't give me an answer.


This is something that I see espoused by "web traditionalists" for lack of a better term. Sure, there are a lot of web developers who are not optimising for performance but I think the difference is that it has simply shifted to the frontend where it's more noticeable.

The kinds of people who care about performance in the backend are exactly like the kinds of people who care about performance in the frontend. There's a variety of techniques that make sense in both areas to create a snappy, seamless experience. You need experienced engineers to create performance no matter what your domain is.

We've gone down the route of creating SPAs to largely replace experiences that would require desktop applications (it was only 5 years ago that I had to download an executable to do my tax return, now I do it with an SPA). Rather than create an SPA for a blog or a simple marketing site, we've largely replaced traditional "heavy"[0] infrastructure (PHP, Apache, MySQL just to display static content) with static JavaScript-generated sites. Many of these can simply be hosted on a CDN because they have no database and often no need for complex routing.

Pretending that SPAs were part of some dark ages is just silly. Horses for courses. If your application requires what I've dubbed "heavy" architecture (above) then by all means! Such infrastructure allows a high traffic service to be highly available, but is not required to serve a single .html file.

[0] When I say "heavy" I'm mainly talking about the attack surface and maintenance required. WordPress is great for serving a blog but has been the target of many hackers due to the amount of security vulnerabilities it has had over time.


>You need experienced engineers to create performance no matter what your domain is.

Actually, the funny thing with HTML is that the simplest things are fast. If you just write an old-school HTML web page without a bazillion frameworks and don't go too heavy on graphics, things are usually fast.

And doing that is literally so much simpler and easier than learning all the super complicated frameworks of the day.

So it's a progression: initially fast while simple, then slower and slower as people learn the complicated (and slow) frameworks, and then eventually maybe fast again if you have an expert who can clean up all this mess.

Most people never reach that expert stage of course, but sadly still leave the "easy and fast" stage.


This perspective is reasonable, until you start applying it to highly interactive / dynamic GUIs.

Real-time chat is the most obvious case where the html purist won’t offer an acceptable experience.

Another example: any page that displays a row/table of data that the user wants to sort (frontend libraries make sorting instantaneous, compared to full page refreshes with html-only).

There are lots of people using complicated frontend frameworks in cases where pure html from the server would obviously be better. But it’s difficult to argue all libraries are bad and slow for every application.
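As a rough sketch of that instant client-side sort (framework-free; the helper names here are mine, not from any particular library):

```javascript
// Build a comparator over a given column of plain <tr> rows.
function rowComparator(colIndex, numeric = false) {
  return (a, b) => {
    const x = a.cells[colIndex].textContent.trim();
    const y = b.cells[colIndex].textContent.trim();
    return numeric ? Number(x) - Number(y) : x.localeCompare(y);
  };
}

// Re-order the existing rows in place -- no request, no page refresh.
function sortTableByColumn(table, colIndex, numeric = false) {
  const tbody = table.tBodies[0];
  Array.from(tbody.rows)
    .sort(rowComparator(colIndex, numeric))
    .forEach((row) => tbody.appendChild(row)); // appendChild moves the node
}
```

Since the rows already exist in the DOM, the sort is a pure reordering: no data leaves the page, which is exactly why it feels instantaneous compared to a full refresh.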


> Real-time chat is the most obvious case where the html purist won’t offer an acceptable experience.

The retro approach is to have two frames. One to submit, one to slow-stream a bunch of html elements which we can thankfully now lay out via flexboxes which are highly efficient when it comes to relayout, i.e. only the appended elements would incur some runtime costs, the already rendered ones don't get recomputed.

I have seen chan-style imageboards that update posts as people type (every dozen milliseconds) with 10k posts in a thread. The layout recomputation overhead is really low.


> ... slow-stream a bunch of html elements

Slow-stream?


Keep the socket open and write a div to it each time a new line is posted. Browsers can and do render partial documents.


> Real-time chat is the most obvious case where the html purist won’t offer an acceptable experience.

???

  <pre>If you have any questions:

  Phone:    <b>0123456789</b>  <a href="tel:+18123456789">[call me]</a>
  whatsapp: <b>0123456789</b>  <a href="https://wa.me/0123456789?text=I'm%20interested">[message me]</a>
  facebook: <b>html.purist</b> <a href="https://www.facebook.com/messages/t/html.purist/">[message me]</a>
  </pre>
https://jsfiddle.net/r3u5yvk8/

I'm now on their contact list.


> Real-time chat is the most obvious case where the html purist won’t offer an acceptable experience.

I can’t recall the last time any website offered me an acceptable embedded chat experience (not counting dedicated clients like Discord) so that’s all for the better.


Table sorting in the front end only works if all of the data exists in the front end though.


Yeah if you actually want to sort datasets of any meaningful size, you’ll be making requests to return database results anyway since a database is going to be infinitely faster and more comprehensive than any client side solution.


Most of the web bloat is not used to create highly interactive GUIs.


Server-side rendering is not really that slow, and an SPA is not really that fast if your bundle size is huge.

More importantly, most users will not be able to tell the difference. I have asked my friends to compare Facebook to a traditional MVC website in terms of speed; none of them could feel the difference.

However, the transition from page to page does affect the user experience. As a user, when I want to do a task that involves many UIs and I need to click through multiple pages, I think of it as a bad experience, but I do accept switching to another page for another task. Therefore, it makes sense to have an SPA for some rich UI features so that the user can remain on the same page.


SPAs being the dark ages is over-the-top, but being an SPA when other web architectures could better serve the user is a dark pattern. That tax app doesn't need to be an SPA. JavaScript-powered interactions make the experience nicer for the user, but that doesn't mean it needs to be, or is improved by being, a single page.


I think an SPA can be much more user-friendly than a traditional series of server-rendered pages.

Email is a great example of this - GMail, at launch, had much better UX than existing server-rendered applications like Hotmail.

You can find bad UX everywhere - in SPAs and server-rendered interfaces alike - but there's nothing inherent to SPAs that reduces usability.


I feel like Gmail has lost that edge. Email being a text medium should be fast, but Gmail using up many hundreds of megabytes of RAM and taking many seconds to load really feels like they've lost sight of what reading an email is about.


I too despise the "kitchen sink" approach seen in Gmail's UI where they are trying to cram every unwanted and unnecessary feature into your view and RAM at once.

If you used "Inbox for Gmail" before Google abandoned it and miss the much cleaner and focused UI, this Chrome extension upgrades Gmail to have the same styles:

https://chrome.google.com/webstore/detail/simplify-gmail/pbm...


Gmail wasn't a single page at launch, so its UX advantage had nothing to do with that.


Wasn't long-polling the killer feature?


No, it doesn't need to be, but it is nicer. There's quite a lot of complexities to my country's tax law (as with any country's tax law, really) and it's heavily simplified by giving you highly dynamic choices. Trying to use my country's immigration site, which has some JavaScript sprinkled on but is largely static, is a largely inferior experience.


There's no reason you can't query a backend for the complexities and present those dynamically to the user. For instance, quite often an SPA might ping the backend for a list of states or provinces once you've selected a country.


Sure. It's up to the engineer implementing said app to determine the best fit for the project.

At the end of the day, you have to choose what's right for your users; whether that's a seamless experience with no full page reloads, or the occasional full page reload.


If it's a web-based user interface as this tax app sounds to be, it is arguably improved purely by being an SPA. Requiring full page reloads just to submit or change some tiny bit of data in a page is objectively bad UX.


Besides doing my taxes on paper, I can't think of anything more tedious than filling out tax forms via a vanilla HTML form.

At least my SPA tax software makes the form filling a little more exciting, and doesn't s#!t the bed when I hit enter by accident


Simple form validation would prevent that, not exactly SPA worthy.


Personally I wish more forms did have Enter bound to submit.


A problem is that I need to fiddle around with JS blocking, even on official city, state, or government websites, because they hired irresponsible web devs or made irresponsible decisions to let Google and others track me on their website. So initially they will be blocked. As far as "experts" for SPAs are concerned, they often do not even include a noscript tag, which means I see a white page. It cannot become less user-friendly than that. I've yet to see an SPA which offers a working alternative via a noscript tag when I've blocked all their tracking. The thing is that SPAs all too often serve as a way to force tracking onto the unsuspecting user. "You need to allow our JS! Your browser is outdated! Use Chrome!" and the unsophisticated person will think: "Oh well, let's just whitelist this page, dammit!"


Mostly I want my forms to be reliable, and that doesn't seem to be the case for many SPAs. Getting a blank page is a very common experience and much worse than a barely discernible delay submitting a form.


"Single page" isn't the why, it's the how. It enables you to take a JavaScript-first approach, which in turn makes building complicated applications easier.


> It enables you to take a JavaScript-first approach

Uh-huh...

> which in turn makes building complicated applications easier.

What? This is vacuous market-speak.


An SPA gives you all the state stored client-side; the backend can be reduced to simple stateless functions. Anything persisted longer goes in the DB, and plenty of solutions exist that automatically keep the client-side view of the DB and the backend in sync, even across temporary loss of connectivity.

Compared to the bad days of server side session management, SPAs are far simpler to code and to reason about.

The elimination of the back and forth between client and backend means far less testing, and far simpler testing.

It is also arguably more efficient. Yes, it uses more resources on the client compared to doing everything server-side, but CPU and memory need to be used no matter what. Treating web browsers as dumb terminals when the cheapest smartphone is at least quad-core is a bit silly.

That said, there isn't any excuse for bad client side code that murders CPUs, but not using the client for anything but JPEG decoding doesn't make much sense!


To word it more technically, building an SPA allows you to build applications as declarative, composable, deterministic functions, as opposed to the spaghetti code commonly seen in the days of jQuery and PHP, which is harder to read, reason about, debug and maintain due to its imperative and nondeterministic nature.
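To make that concrete, here is a framework-free caricature of "the UI is a deterministic function of state" (the function and state shape are mine, purely for illustration):

```javascript
// Same state in, same markup out -- nothing depends on hidden, accumulated
// DOM state, which is what makes the output reproducible and testable.
function view(state) {
  const items = state.todos
    .map((t) => `<li>${t.done ? "[x] " : "[ ] "}${t.text}</li>`)
    .join("");
  return `<ul>${items}</ul>`;
}
```

A framework's job is then just diffing `view(newState)` against what is on screen; the application logic itself stays a pure function.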


> building an SPA allows you to build applications as declarative composable functions which are deterministic

How is that not the definition of a function or a method?

Deterministic in what sense? Its output? Its input?

Bad code exists in any language. Looks like you're comparing bad coding practices to good ones; that's a separate issue, completely independent of the language.


> How is that not the definition of a function or a method?

It basically is, and that's the point. With SPAs you generally structure it so your entire app is just a function. The same cannot be said of traditional server/client apps, which consist of multiple moving parts (PHP backend / jQuery frontend) that do not have an explicit contract with each other; you're just taking shots in the dark hoping your jQuery selectors match up with what the PHP backend spits out.

> Deterministic in what sense? It’s output? It’s input?

What? There is only one sense in which something can be deterministic: given the same input, you always receive the same output. SPAs are generally deterministic because they're just functions that take an input and give you back a DOM. Non-SPAs are non-deterministic because what you ultimately see can change depending on what generated HTML the server gives back, despite being given the same user input. It can also change based on unrelated side effects: for example, imagine you have some jQuery feature that resizes an element, and another one that changes its background color. The behaviour of both features is non-deterministic because the outcome of each depends on implicit state accumulated externally by the other one.

> Bad code exists in any language. It looks like you're comparing bad coding practices to good ones; that's another matter entirely, and independent of the language.

This has nothing to do with languages, it's about project architecture and programming paradigms. The classic webserver+jQuery type architecture is inherently imperative. Of course it can have little islands of functionally pure, declarative code, but the overall software architecture is still imperative.

In a nutshell, traditional non-PWA software cannot be written in a fully declarative paradigm due to its nature: every non-PWA app on the web is intrinsically a jungle of imperative code, making assumptions about accumulated state. PWAs are generally written declaratively, where all state and dataflow is explicitly managed, and output is deterministic based on a given input.
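To make the "your entire app is just a function" claim concrete, here's a minimal sketch of a pure view function (the names are invented for illustration; real SPA frameworks diff the result against the DOM rather than returning a string):

```javascript
// Hypothetical pure view function: the whole UI is just state -> markup.
function render(state) {
  const items = state.todos
    .map(todo => `<li>${todo.done ? 'x ' : ''}${todo.text}</li>`)
    .join('');
  return `<ul>${items}</ul>`;
}

const state = { todos: [{ text: 'Ship it', done: false }] };

// Deterministic: identical state always produces identical output.
console.log(render(state)); // <ul><li>Ship it</li></ul>
```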


> The same cannot be said of traditional server/client apps which consist of multiple moving parts (PHP backend / jQuery frontend) which do not have an explicit contract with each other; you're just taking shots in the dark hoping your jQuery selectors match up with what the PHP backend spits out.

If you don’t know how your code will work, that’s an issue with the quality of the code.

> What? There is only one sense in which something can be deterministic, which is given the same input you always receive the same output.

Agreed.

> Non-PWAs are non-deterministic because what you ultimately see can change depending on what generated HTML the server gives back, despite being given the same user-input,

With a language like PHP, if you call a function twice with the same input, why would the function return 2 different outputs?

> it can also change based on unrelated side-effects: for example, imagine you have some jQuery feature that resizes an element, and another one that changes its background-color. The behaviour of both features is non-deterministic because the outcome of each depends on implicit state accumulated externally by the other one.

If your application also does this, the only difference is that you’re not using jQuery for it.

The output of this is deterministic. Given jQuery works, the element will resize and change color.

> In a nutshell, traditional non-PWA software cannot be written in a fully declarative paradigm due to its nature: every non-PWA app on the web is intrinsically a jungle of imperative code, making assumptions about accumulated state. PWAs are generally written declaratively, where all state and dataflow is explicitly managed, and output is deterministic based on a given input.

The assumption here is that declarative is deterministic and anything else isn’t and that’s just not true.

Determinism isn’t a function of the language, structure or paradigm.

Whether something is deterministic depends on the algorithm it has.


> If you don’t know how your code will work, that’s an issue with the quality of the code.

No, it's a fundamental difference between declarative and imperative programming. With imperative programming (e.g. PHP + jQuery) you cannot know what's in the DOM resulting from PHP-generated HTML you are working with, you need to either make naive assumptions about the DOM or litter your code with conditions checking the state of the DOM before operating on it. With declarative programming you declare the structure you want, so you know exactly what you have, because it's what you've declared. There's no need to check that the PHP backend rendered some element to the DOM to enhance with JS functionality, because I already declared that element is in the DOM in my JS directly.

> With a language like PHP, if you call a function twice with the same input, why would the function return 2 different outputs?

Again, this has nothing to do with languages and is purely about programming paradigms. What you're referring to is one of the possible "islands of functionally pure, declarative code" I previously mentioned. Pure functions in PHP can be deterministic, and pure functions in JS can be deterministic, but the overall application behaviour of how your front-end (jQuery) handles the output of your back-end (PHP) results in the application being non-deterministic. For example, say you have some form that submits data and, on the posted page, displays a green thank-you message in a div. Say you want to animate this div message with jQuery. You cannot safely make any assumptions about the presence or prior presentation of the div, because from the context of jQuery you don't even know if the form has been submitted and the div rendered by PHP. You need to either:

- Imperatively check the current state of anything you wish to change prior to changing it.

- Naively make assumptions about prior state, resulting in non-deterministic behaviour.

jQuery makes this easier by solving that problem for node "existence" with its selector-based API, which simply doesn't run your function if the targeted nodes aren't found. But the problem still remains for general state: any jQuery code needs to either explicitly check the state of everything it touches (brittle code, hard to reason about and maintain) or make naive assumptions and be non-deterministic (which seems to have been the norm).

> The output of this is deterministic. Given jQuery works, the element will resize and change color.

False, it is non-deterministic because the ultimate outcome (what you see) depends on implicit state from side effects. For example, if you toggle the resize control, the resulting display is based not only on your action (input) but also on whether the colour-change button was already pressed (an unrelated side effect, not a concern of the resizing logic), or whether PHP initially rendered it in a different size/colour, or maybe PHP decided not to render it at all because the user's session ended. From the context of jQuery you can't determine any of this; the only way you could come close to deterministic behaviour is by explicitly checking every single bit of stateful DOM before working on it, which I've never seen done in a large project (and would be hell to maintain).

> Whether something is deterministic depends on the algorithm it has.

Yes, and any algorithm whose outcome is impacted by side effects or unmanaged state (i.e. pre-rendered HTML/DOM from PHP, changes to the DOM from other side-effect-causing callbacks) is inherently non-deterministic. For an algorithm to be deterministic it needs to take every single thing that could affect its outcome into consideration as input, and I have yet to see any jQuery app which parses the entire DOM and uses it as input to achieve deterministic behaviour.

> The assumption here is that declarative is deterministic and anything else isn’t and that’s just not true.

No, my point isn't that declarative is inherently deterministic, rather that imperative is almost always inherently non-deterministic. By the nature of imperative programming you are making assumptions about your environment (i.e. that there are DOM nodes to work with), which makes it intrinsically non-deterministic (those DOM nodes might not even be there).
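The resize/colour example above can be sketched in a few lines, with a plain object standing in for a DOM element (a hypothetical illustration, not real jQuery):

```javascript
// Two independent "features" imperatively mutate shared state.
const el = { width: 100, color: 'white' };

function resize(node) {
  // Doubles the current width: the result depends on accumulated state.
  node.width = node.width * 2;
}

function toggleColor(node) {
  node.color = node.color === 'white' ? 'green' : 'white';
}

// The outcome of "press resize" depends on history, not just the action:
toggleColor(el);
resize(el);            // el.width is now 200
resize(el);            // the same action again, a different result: 400
console.log(el.width); // 400
```

The same call produced different results because its outcome depends on implicit state neither function receives as input.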


I literally can't tell if your comment is satire or not. I hope it is. I fear it isn't.


It isn't. I elaborated a bit in my later comment: https://news.ycombinator.com/item?id=22523344


Well, I'd argue that old fashioned PHP sites are structurally really similar to modern declarative client-side UIs, in that in both cases you're basically just constructing a series of functions that (deterministically) convert the current state to HTML.

Client-side rendering without a declarative framework (say jQuery or vanilla) is different, in that you also need to consider state transitions. This gets complicated fast.

And of course, the trade-off between the old fashioned approach and modern declarative revolves around how much interactivity is required.

If you can get away with it, I'd say the old fashioned approach takes less development time because you don't need to handle asynchronous (and possibly failing) state syncs. The database containing any state you might want to access is available synchronously!


Yeah, fair call, and it's more the combination of PHP + jQuery I'm referring to as being non-deterministic rather than either on their own. You can definitely write deterministic PHP and jQuery code, but the implicit integration between them (PHP rendering HTML which becomes a DOM which jQuery imperatively operates on) is what makes it non-deterministic. I'm mostly talking about the part you mentioned about managing client-side state using a non-declarative paradigm, where you either end up with a stupid amount of unnecessary complexity (conditions checking the prior state of literally anything and everything you touch; I've never really seen this in a production project) or make naive assumptions about the DOM, which means non-deterministic and potentially unexpected behaviour.

> If you can get away with it, I'd say the old fashioned approach takes less development time because you don't need to handle asynchronous (and possibly failing) state syncs. The database containing any state you might want to access is available synchronously!

Fair point, but there's also a flip side to this. You can write a client-side PWA with all your async state and data persistence magically handled by a service like Firebase Cloud Firestore, so you're just writing the UI declaratively and not having to worry at all about servers or databases, which generally require a lot more upfront planning (figuring out models/schemas to represent your data, service architecture, etc.) than just slapping a UI together and wiring inputs up to a production-ready persistence-layer API in front of a schemaless NoSQL DB like Firebase.


I mean... not really. I wouldn't invest in building client-side applications unless they bought me something. My customers don't really care, the majority don't even know what an SPA is. They do care that our application is faster and has more features than our competitors.


What in JavaScript are you using to take the place of a server side database?

Your comment seems to imply that the only options are SPA or WordPress..

If you are building something that would normally need to be a desktop app, sure, an SPA might be a good idea. But this is still rarely the case; most web pages are just presenting data, and in those cases, the less JavaScript the better.


I'm not. If your application needs a database, then it needs a backend. An SPA isn't inherently a "backendless" application.


A better question to me is, what does WordPress offer that a static site generator like Jekyll can't?


A management system my photographer friend can use without learning what "CLI" or "GitHub" are. They just log in and make changes.


Yup, exactly that. WordPress editors (WYSIWYG and Gutenberg) are still far more user-friendly and mature than Netlify CMS, which is still buggy as hell and where many little details and bits of YAML formatting need to be fixed in a code editor. You really cannot expect such a thing to be done by a non-coding person.


There's https://www.netlifycms.org/ if you want a static site generator with a CMS, FYI.


Are you aware of any open source applications that serve as a CMS for static generators like Jekyll? It would be very convenient to have one with a simple GitHub login/auth settings page, add/edit page/post options, and the ability to select or upload a theme.


NetlifyCMS works with Jekyll, Hugo, Gatsby, Middleman and a few others.

https://www.netlifycms.org/docs/jekyll/


my photographer friend can use ftp


my photographer friend can also add <img> tags to static html documents.


A huge variety of plugins, and a large pool of cheap developers with plenty of experience in the stack. I personally like Jekyll, but if I'm handing the site over to a low-tech client, I'd use WordPress on managed WordPress hosting.


Wouldn't do it any other way.

I dipped out of web dev in 2015 to run an ecommerce project. I stopped tracking all the latest js framework news. I settled on a barebones CSS rule set. I stopped choosing SPAs as the starting point for mvp ideas. I shelved Wordpress and moved some content sites over to Netlify and just recently started using a nifty desktop ssr cms instead of Hugo, Gatsby, etc.

I was regressing to the environment I understood in the early 2000s-2010s. I did all of this because I knew there would be a steep learning curve going into modern ecommerce and it seemed reasonable to put everything into hibernation while I attempted to go all-in.

I pretty much missed the entire React, Angular, Vue ramp up in the industry.

I've since shut down that ecommerce project and find myself peeking back to the js-verse every now and then.

I doubt I'll ever voluntarily use one of those frameworks for web dev.

I do, however, like this thing called, Svelte.


> I doubt I'll ever voluntarily use one of those frameworks for web dev.

> I do, however, like this thing called, Svelte.

Svelte is just another one of “those frameworks”, except with a vastly smaller community: https://trends.google.com/trends/explore?geo=US&q=svelte%20i...


Svelte components compile to vanilla javascript plus a small runtime (a dozen or so functions that track state for context and the developer tools). It's much lighter weight than React or Vue's runtime and a side effect is that integrating Svelte components into other frameworks is a breeze (especially if you compile to web components). The author refers to it more as a separate language.

I've had a great experience with it recently, developing a smart TV app (such devices are dog slow even with a quad-core ARM CPU due to the resolution), so I decided to learn Svelte for the project. The big downside is another templating language to learn and, as you said, the community is tiny, but the surface area is small enough that it hasn't been a problem regarding tooling or debugging. Pulling other frameworks into a Svelte app is also relatively easy, so it can be part of a slow incremental rewrite, although as always it devolves into a webpack/rollup config mess.


> Svelte components compile to vanilla javascript plus a small runtime

Same could be said of React though. React components are literally just plain JavaScript functions which return VDOM nodes that get diffed and applied to the DOM by the runtime library.

The only objective advantage I see to Svelte is its tiny footprint, which if you really need you can already get by using Preact (3kb vs Svelte's 3.5kb) without having to learn yet-another-completely-arbitrary-templating-language to no benefit.
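The "components are literally just plain JavaScript functions" point can be illustrated without any library at all. Here `h` and `Greeting` are invented names sketching the idea, not React's actual implementation:

```javascript
// A React-style component is a plain function returning a node
// description (a plain object standing in for a VDOM node); the
// runtime's job is to diff that description against the real DOM.
function h(type, props, ...children) {
  return { type, props: props || {}, children };
}

function Greeting({ name }) {
  return h('p', { class: 'greeting' }, `Hello, ${name}!`);
}

const tree = Greeting({ name: 'Ada' });
// Same props in, same tree out: the component itself is deterministic.
```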


Svelte is actually quite a bit different. Quoting a random section from their docs

"[Svelte is a] compiler that knows at build time how things could change in your app, rather than waiting to do the work at run time"

My point being there is a large amount of code transformation at build time. The runtime is not so much a framework as a library, like libc. In Svelte, when you update state there is no virtual DOM, DOM diffing, fibers, or anything; it will have been compiled into a direct DOM update of anything depending on it. It's no longer a Svelte app, just vanilla JS. There are some details of course, but it's a different paradigm compared to React or other modern UI frameworks.
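Conceptually, the "direct DOM update" idea looks something like this hand-written sketch (hypothetical names, not Svelte's actual generated code), with a plain object standing in for a DOM text node:

```javascript
const textNode = { data: '' }; // stand-in for a real DOM text node

// What a compiler can emit for `Count: {count}`: a targeted update
// that touches only the node depending on `count` — no tree diffing.
function makeUpdater(node) {
  let current; // last value written, so unchanged state is skipped
  return function update(count) {
    if (count !== current) {
      current = count;
      node.data = `Count: ${count}`;
    }
  };
}

const update = makeUpdater(textNode);
update(1); // textNode.data is now 'Count: 1'
```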


And without a virtual dom and all of its overhead.


Virtual DOM isn't that expensive compared to modifying the DOM. Svelte isn't magical; it's still diffing objects to figure out whether to change some DOM element's attributes.

With react memoization and pure components, you get very similar benefits.

Sure, Svelte goes the extra step and removes unnecessary diffing of constant attributes that will never change, among other optimizations. However, there are tradeoffs.


[flagged]


>When did software engineers become hipsters?

Hacker News is the epicenter of software engineer hipsterdom, come on.


That's true. I guess I'm just frustrated by that. I suppose I should just accept it and let it go haha.


It seems like a lot of web devs are either stuck chasing novelty or constantly trying to bet on the next big thing.

For example, what happened to CoffeeScript? https://trends.google.com/trends/explore?q=%2Fm%2F0hjc5m0&ge...

What happened to Ember? https://trends.google.com/trends/explore?geo=US&q=%2Fm%2F0s8...


TypeScript ate CoffeeScript. That said, CoffeeScript did inspire a lot of new features in both TypeScript and the ES standard.


Ember is still going strong. Octane is a fantastic new release, and it's been awesome to unlearn all the Ember-specific things in favour of the new vanilla JS ways of doing things that Octane makes possible.

About the only negatives I can bring up are IDE (and other sorts of vendor tools/components) support/integration is not as good as more popular frameworks like React, but that’s really just to be expected as vendors tend to target based on popularity.


Coffeescript was 'replaced' by ECMAScript 2015. Ember is fine, but it's not the end of the line: it's been 'superseded' in most scenarios by React and Vue.js because those are generally better tools.

JS technology changes quickly not for its own sake, but because there are still discoveries to be made about how to improve both syntax and how we think about code. I'm sure the pace will taper off sooner or later, until the Web is taken over by something else.


It's also because JS raced to integrate features which were considered standard elsewhere, back in the dark ages of anonymous self-executing functions.


I think you're right. But I think that's because it's still early days, nobody has really nailed it yet. We're all trying to make sure that when something sticks, we'll be ready for it. It seems like TypeScript and React are sticking around pretty good. I'm sure they'll be replaced at some point. My personal hope is that they'll be replaced by better standards within the browser.


CoffeeScript largely got killed by EcmaScript 6.

Two reasons why:

1) Some CoffeeScript features, like arrow functions and classes, made it into ES6.

2) CoffeeScript was never updated to properly generate ES6, largely because the features that got adopted by ES6 had subtle differences that made the migration path for existing projects highly complex.

I think, to a lesser extent, the rise of TypeScript also had something to do with it.
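For anyone who missed that era, the CoffeeScript conveniences that were absorbed into the standard look like this in plain modern JavaScript:

```javascript
// Arrow functions, classes, default parameters, destructuring and
// template literals were all CoffeeScript selling points before ES6.
class Point {
  constructor(x = 0, y = 0) { // default parameters
    this.x = x;
    this.y = y;
  }
  scaled(k) {
    return new Point(this.x * k, this.y * k);
  }
}

const points = [new Point(1, 2), new Point(3, 4)];
const xs = points.map(p => p.x);       // arrow function (CoffeeScript's ->)
const [first, ...rest] = points;       // destructuring + rest
const label = `first x is ${first.x}`; // template literal
```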


I made a web app framework [0] for people like you (and me). It lets you use static HTML to build a web app.

Data attributes store data in the page and add dynamic features (like CRUD functionality and drag and drop sorting) very easily.

Plus, the whole thing is server rendered with Handlebars, so the front-end JS part isn't even required for 95% of visitors (it's only necessary if you're an admin who can edit the current page).

I wrote it because I missed the simple days of jQuery and REST APIs.

[0] https://remaketheweb.com/


Interesting. How are you deploying the server side environment?


It's an open source Node.js app, so you can deploy it yourself to any host with a persistent file system. No 3rd-party database required -- it uses a file-based architecture to make it easier to get started with and to modify the data. Its goal is to be simpler and easier to work with than anything else out there.

There's also a built-in CLI that can deploy to our free hosting service, which is built on top of Digital Ocean.


> recently started using a nifty desktop ssr cms instead of Hugo, Gatsby, etc.

What are you using?


Publii. It's developed by a European (France, I think) team. It syncs to Netlify with a single mouse-click.


+1 for Publii. It feels like it's early days for them (i.e. they only have a few themes & haven't been picked up by the third-party theme developer crowd yet), but man, does Publii make it easy for any end-user to generate and publish a static website.


Thanks!


Good. I could do without the JS, to be honest (I'm a little tired of pages that refuse to show me anything unless I whitelist some stupid third-party jQuery script in uMatrix), but, baby steps.

When sites finally ditched their trendy Flash rewrites and went back to HTML, I thought, thank God.


When Flash was dying I said it was a bad thing, because ads would no longer be contained in those tidy, controllable virtual machines sitting adjacent to the page content. Instead they would be redone in JavaScript in a huge soupy mess where the content is held hostage by ads that slide in and out of the screen and cover the content, popping in and out of existence, reflowing the page, making the content jump around and disappear, as terribly coded Flash apps became terribly coded JavaScript modules dragging the whole browser along with them.

People thought I was wrong and insane. I wish I was. Realistic assessments are always less popular than hope and principles. Flash was the good times.


Flash ads were trivial to block with plugins :-). And it was safe to block because usually no serious pages used Flash much except for artist portfolios.


The JAM stack the author is talking about is entirely driven by JS. It is "static" in the sense that a .dmg file is static: it can be hosted on a CDN and isn't templated on the server.


[flagged]


This is just empty negativity.

That people use JavaScript in places where they aren't forced to use JavaScript should make you wonder what the upsides are for them, instead of assuming they cannot form any valid preferences of their own, unlike yourself.

For example, Javascript is one of the few languages that actually delivers on "async everything" and has a very simple Promise abstraction (like Promise.all()), and that makes it better for me than building networked services in similar languages (PHP, Python, Ruby). It's also one of the few dynamically-typed languages with a bolt-on static type system that actually gained traction.

You might not care about these things somehow, like maybe you only write C and Python because you work with embedded systems. But I think it's time to stop calling it garbage in honest conversation.
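As a concrete example of the Promise.all() point (fakeFetch below is a stub standing in for real network calls):

```javascript
// Fan out several async operations and await them together;
// total wait is roughly the slowest call, not the sum of all three.
const fakeFetch = (name, ms) =>
  new Promise(resolve => setTimeout(() => resolve(`${name} data`), ms));

async function loadDashboard() {
  const [user, posts, stats] = await Promise.all([
    fakeFetch('user', 30),
    fakeFetch('posts', 20),
    fakeFetch('stats', 10),
  ]);
  return { user, posts, stats };
}

loadDashboard().then(d => console.log(d.user)); // prints "user data"
```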


Async-by-default from the perspective of the developer is one of the chief sins of JS, in this writing-JS-since-the-90s developer's opinion. I've seen and written so damn much code to work around async-by-default. Now most of the code I read is comically full of (justified!) awaits, so at least we got the sugar to fix it without so much fuss, but we shouldn't have to in the first place.

Let async code surrender control, but don't continue executing the current logic unless the developer asks for that behavior. Await-like behavior by default. That would have been much saner, less bug-prone, more helpful, and easier to learn & reason about. It's the desired behavior at least 90% of the time, in my experience.


Maybe the problem here is building networked services (which sounds like synchronous communication between components). How often does a well-architected microservice architecture need to do more than push data into a decoupling resource and data store, and consume data from a stream or a queue? Is Promise.all( ... ) really that relevant that often?

I've come to enjoy Node.js more than I thought I ever would, but it's still not something that's ever mindblowing to me, and I don't think that switching to Python would be a huge burden. With sensible tooling and contracts, neither would switching to a statically typed backend.


I’m not a JS fan but quit with the toxicity. The proprietary logic is done with languages like Go, Rust, Python, JS, etc and built with a RESTful API in mind so the frontend can consume it. Angular, React, and Vue are built to do this well. It’s all in moderation and should be done responsibly. The downside with this of course is added complexity on all levels. However with this design maturing the complexity has reduced some or effort offloaded elsewhere (microservices, serverless, k8s.)


shovel sellers, that's why


It'd be interesting to see if we could start charging website owners fees for using our processing power to render their sites.

Something needs to be done to encourage some efficiencies given the fact it takes more resources to surf the internet and read about running a k8s cluster than it does to actually run the cluster


You're free not to use any website...


Arguably if you are publishing content on the net you are giving it away.

There are mechanisms to charge for it.


> There are mechanisms to charge for it.

Only above a certain price point. Neither our financial system nor our consumer culture supports micropayments yet. This effectively limits the web properties that can charge users to a small fraction, so unfortunately "just charge money" isn't the widely available solution I wish it were yet.

Hell, it's more viable to charge users for your mobile client than the backing service. It's kind of the closest we have to micropayments in a way.


Then you don't have a sustainable business model.

Why should I subsidize your failed model?


Yet you somehow think charging content providers for using your computer when they give you free content isn’t a failed business model?


Suppose someone gave you a set of unassembled Ikea furniture, and you didn't know if you could use it until you assembled it.

Who should pay for the assembly of the stuff?

Would you put it together for free just to decide?

Who pays for the resources used?


There are some interesting trade-offs in this.

There is more JS being processed, but that JS is smartly split up, bundled and takes care of loading just the content you need on subsequent browsing.

So the gains are:

- less data sent over the wire overall

- faster page loads overall, both initially and especially on subsequent loads.

- on subsequent loads only parts of the DOM are being rendered, not the whole thing so in some cases it might save you processing power

- no round trips to a DB and template engine, this is mostly a pro for the provider of the content

- some sites with this pattern enable you to download the whole thing as a PWA, so you can look at it offline with a single download.


Would you pay their fee for generating the page on the server?


I do

Netflix, Prime Video etc etc


That's paying for content, but isn't really what we're talking about.

Hacker News is a better example. It's generated entirely server-side. Why not pay YCombinator a few cents for every page load?


If HN had content worth paying for they could charge for it.


Hell, I still write the HTML and push it to Github where Github Pages and Cloudflare put it up on the Web for me. I've never seen any reason to use Markup when I can just Mark my text Up in HTML pretty simply. If you write a site correctly (a site, not a webapp) it'll render in anything from the original Netscape to the Wii Browser (tested working with my site), Internet Explorer 8 (working), Links2 (working), and the latest Chrome.

A site like YouTube doesn't need to work with old stuff like Netscape or IE but if your site primarily focuses on text there's next to no reason there should be Javascript on your site. Sites like danluu's[0] and Michael Norman Williams'[1] may not look the best but they just work.

My website: https://www.instantfloppy.net/ (though admittedly I don't spend as much time on it as I should)

[0]: https://danluu.com/

[1]: http://michaelnormanwilliams.com/


I have no idea what your site is about. It's just random links with no explanation of what they are or what they're for.


> (though admittedly I don't spend as much time on it as I should)

It's pretty much just a howto and link repository for myself and my friends right now. Still mostly a work in progress. Consider it as an example for look rather than for function.


I'm sorry to say this, but your site may only be appealing to you. Even though it will render on a Commodore 64.


The intent is for users to style it themselves... or it was, until browsers stopped having that functionality.


Browsers never stopped having that functionality. It's still entirely possible to load custom user styles into a webpage.


If I recall, Netscape or one of the other early browsers used to have an option to change the default font/color of text and the background in its settings. Nowadays you have to use an extension to do so.


I finally found an option in Firefox to change things around, but it still only changes the font and not actual foreground/background colors.


> Hell, I still write the HTML and push it to Github where Github Pages and Cloudflare put it up on the Web for me. I've never seen any reason to use Markup when I can just Mark my text Up in HTML pretty simply.

This sounds like you're making everything more difficult for yourself with little to no gain whatsoever.

With Markup or Markdown, you're still writing that HTML, but you're writing it once and then using automation and data structures to abstract it away. After that's implemented, and it's already been done for you (Hugo, MkDocs, ...), you're just writing the actual content and using simple syntax to format it. Everything else is then handled for you.

I wrote this recently: https://www.thecloud.coach/terraform/understanding-state/

If I had to write that from first principles, as you're doing above, I simply wouldn't have.
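The core idea these generators automate boils down to something like this (renderPage is an invented name; real SSGs add Markdown parsing, templating languages and file handling on top):

```javascript
// (content, metadata) -> full HTML page, run once at build time,
// so visitors are only ever served the resulting static file.
function renderPage({ title, body }) {
  return [
    '<!DOCTYPE html>',
    `<html><head><title>${title}</title></head>`,
    `<body><main>${body}</main></body></html>`,
  ].join('\n');
}

// You write only the body content; the layout is applied for you.
const page = renderPage({
  title: 'Understanding State',
  body: '<h1>Understanding State</h1><p>...</p>',
});
```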


> The Dark Age - Somewhere on this path to render pages on the fly (SSR) and render pages on the client (SPA) we forgot about the performance of our webpages. We were trying to build apps. But the web is about presenting content first and foremost!

Pfff - That's completely wrong. SPAs are all about performance. If you want to build a highly interactive site, it makes sense to do the computing where it's consumed -- in the browser. And hey, if you use a lot of the same logic and data models in the browser as you do on the back-end server, it's only natural to want to try to centralize the code. DRY. The problem is basically what @s_y_n_t_a_x said: developers got carried away with it and started using SPAs for everything, including sites with low interactivity like blogs, which turned out to be less performant.

> But the web is about presenting content first and foremost!

Says who? I understand that the underlying structures of the web are all about transferring documents and other resources. But Gmail showed us that web 'apps' are useful. Why shouldn't we pursue that? Just because the original web architecture didn't account for it?

In the end, the article makes the right point: Use the right technology for what you're trying to build.


> Use the right technology for what you're trying to build.

Yep... so if you want a rich, responsive user experience, DON'T use any web technologies. GMail (AND GSuite) is a perfect example of how a relatively simple concept like email can be turned into a slow, unresponsive piece of garbage.


I'm about 99% sure the store on the PS3 was webtech. That beast of a machine could barely run its own store. They lost a lot of money from me as a result of the store being super-slow, input laggy, and crashy (OOM I assume), and I can't be the only one.

PS4 feels like they've taken webtech all over the OS interface. Its store's snappier than the 3 but the rest of the interface performs way worse. Usable, but far less pleasant.

[EDIT] store should be snappier than the 3, mind, since it's way more powerful hardware—it's still kinda slow, considering.


All sorts of devices with some kind of UI are using a browser in some kind of kiosk/headless mode.

STBs, Smart TVs, IFE systems are or were web based.


To be honest, as time goes on the "smart" piece of your device starts to get slower and slower.

One of the main drivers for getting an Amazon Fire Stick is that it is much more responsive than the smart TV itself.


I thought that, but the YouTube apps on both my smart TV and Apple TV are broken in exactly the same way, and unusable without a regular reboot.


Previous versions of GMail had a good experience and were significantly faster than the current version. Loading the basic HTML mode can still get you a previous version of it.


Extreme statements aren't useful.

GMail is a poorly built app, that's all. There are plenty of great examples. Both Outlook and Fastmail have fast and feature-rich email clients. Gmail was good back in the day and got carried away by project managers and feature creep.

Meanwhile Google Docs continues to be a pinnacle of what you can achieve on the web.


My experience with the Outlook web app (not the "basic HTML" mode, which is perfectly usable in contrast) is the exact opposite --- it's extremely slow and consumes a ridiculous amount of memory for what it does (I've seen it take over 2GB of RAM, and this is with an account where all the emails with their attachments total less than 100MB.) When composing a message it lags so much that it will delay each keystroke by several seconds and drop keys intermittently, and I have resorted to writing in a real (native) text editor and copy-pasting. In contrast, the native client has memory usage in the dozens of MB and is far more responsive --- I've never experienced it being sluggish to that extent.


What I wish for every Christmas is that web developers across the globe finally learn that syncing a draft server-side onkeyup is the worst idea, and should never be implemented.

Reality has RTT and shitty 2G speeds, and it will drive people away from your web product if you are too silly to cache things locally.
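The fix is only a few lines of vanilla JS: persist every keystroke locally (free and latency-proof), and debounce the network sync so it fires once per typing pause rather than once per keyup. A rough sketch; the endpoint, storage key, and wiring are invented for illustration:

```javascript
// Generic debounce: collapse a burst of calls into one trailing call.
function debounce(fn, delayMs) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Hypothetical wiring: localStorage is the cheap local cache; the network
// write happens at most once per typing pause, not once per keyup.
const syncDraft = debounce((text) => {
  fetch('/api/draft', { method: 'PUT', body: text }); // assumed endpoint
}, 2000);

function onDraftKeyup(event) {
  localStorage.setItem('draft', event.target.value); // instant, survives RTT
  syncDraft(event.target.value);                     // deferred, remote
}
```

On a flaky 2G link the user's keystrokes never wait on the network; only the background sync does.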


Unused RAM is wasted RAM. If you don't have anything else that needs it then just let your system automatically handle it.

Also I find most performance issues with big apps are a result of browser extensions that interfere. Try using a private window without any extensions.


>Unused RAM is wasted RAM.

Says the OS developer, says the browser developer, says the webapp developer, says the developer of whatever else you have running. "Developer time is expensive," says one. "Look, it runs fine and is snappy," says another on his maxed-out development PC, forgetting about his grandma. "Why should I care about that extra memory load when most PCs nowadays have X amount," says yet another.

And so the slugfest continues.

>Also I find most performance issues with big apps are a result of browser extensions that interfere. Try using a private window without any extensions.

Browser extensions can definitely make a browser sluggish but most don't interfere with the content. The only ones that do AND are common are adblockers which have a tendency of making it less sluggish.


Vacuuming up RAM while all the other applications have to fight over the leftover scraps or wait for paging to get the data from disk (1) is NOT a good thing as much as Electron-zealots want to normalize running a multi-GB browser instance for every single application.

(1): burning through an SSD's limited number of write cycles OR being at least an order of magnitude slower to access in the case of HDDs


Who says it was otherwise unused? Web apps aren't used in isolation, the additional RAM you're using to speed things up by 3% has caused my code look-ups to slow down by 30% because you've consumed what was my file cache.


> Unused RAM is wasted RAM.

Bloat RAM is also wasted RAM. If it's not being used as a cache of reasonable size, or in a time/speed tradeoff, all you're doing is making things worse.

> If you don't have anything else that needs it then just let your system automatically handle it.

That's a statement that only really applies to people misunderstanding RAM used by the page cache or suspended programs. And both of those depend on active applications not allocating that memory!


> I've seen it take over 2GB

Probably because that RAM wasn't doing anything else. Why not use it as a cache to speed things up? I'm sure it can work fine with much smaller RAM sizes.


It's very rare that a program intelligently adjusts its cache size based on the amount of memory available.

And that doesn't explain how the "cache" is twenty times bigger than the source data.

I bet with less RAM available it would start thrashing.


The fact that my machine was swapping heavily whenever I tried to switch to the Outlook tab or do anything with it says otherwise.


Well, I'm all for SSR for most use cases, but Gmail is that way not because it's an SPA. For example, Fastmail lives up to its name and it's an SPA.


I want to emphasize that I was talking about the first beta releases of Gmail, which is where XHR was born. Or so I read.


> DON'T use any web technologies.

How do you think that would have worked for Gmail's success?


When did Gmail become slow? It's pretty fast for me. If it were slow, then yeah.


> But Gmail showed us that web 'apps' are useful.

And we really have come full circle. Gmail performance, at least on Firefox, is awful.


Its performance is so horrible that even on a high spec machine with fast internet, you have plenty of time to click the "switch to plain ol html" button. Which is actually more in line with the performance I'd expect from a glorified file drawer.


I have plenty of time to hit "load basic HTML" on damn beefy machines running on Google Fiber (LOL).


Loads instantly for me on FF desktop and laptop running Linux. But I have experienced what you're talking about.

You either have graphics hardware support disabled, or your laptop CPU is not running in performance mode and has its governor set to something like balanced or power-saving mode.


...why is a web-based email client depending on graphics hardware?


The same reason your desktop based email client probably depends on it. It’s 2020 and developers are starting to make use of all the modern browser features. It’s time to stop thinking like the browser is some fickle thing that needs to be caressed and treat it like a sandboxed OS environment for delivering apps and content.


We had rich desktop email clients - with more features than GMail - running on sub-GHz CPUs with integrated graphics. They also had better UI.


Obviously native code is faster than a generalized scripting language for the web. Again, it's all about what you're trying to achieve. You can still use those fast native clients, but you can also access your Gmail from any device that has a web browser, which is basically everything. If Gmail is slow today, then obviously Google has lost sight of that point.


I've never found a way around it. Hardware acceleration on/off, windows, linux, phone, workstation, ff, chrome/ium, it always loads like crap. And my daily drivers are a dual xeon workstation or an overclocked zen3 machine, pretty far from laptop constraints. Gmail just has no interest in being performant, I guess.

I don't have this issue with other sites, it's just Gmail being Gmail.


I don't think it's graphics support, but I have the same experience: The vast majority of the time, gmail in firefox loads in fraction of a second, too fast to click "basic HTML" view. And I'm not on something like fiber, just basic cable internet.


Whether or not this is true, when Gmail came out, it was WAY better (including faster) than everything that it competed with.


It was never faster than SquirrelMail. It had cool features, and search was fast, but loading up your mailbox was never as fast as with other solutions.


I was considering things like Hotmail and Yahoo Mail to be competitors. Free, ad-supported, web-based email. Was SquirrelMail that? I don't know; I've hardly heard of it.


SquirrelMail is a GPL-licensed webmail client. If you get a domain and hosting and they provide webmail for your domain, it's likely through SquirrelMail or one of a handful of competitors.


It's of course perfectly fine using Chrome, which may or may not be indicative of a serious problem for the open web.


I was talking about the early Beta releases that caught on like fire. Specifically with their use of XHR. That doesn't mean that web apps are always slow and bloated.


> Says who?

Everyone who isn't only on the web to try to make money. That's who. But all you for-profit types are ruining it. As the browser becomes more powerful, it is more important to secure it. As security becomes more important, features are removed and the browser begins telling the user what they can and cannot do. And on the webmaster side, it becomes infeasible for browsers to display your site or for search engines to index it unless you depend upon the leased whim of third parties to get your centralized-authority-signed SSL cert.


> But all you for-profit types are ruining it

As someone who loves the internet largely due to all the cool things people have made for profit, I can't imagine how the for-profit people have hurt the internet more than they have helped it.

You and I wouldn’t be talking right now if it weren’t for a company making hacker news for profit.


> I can't imagine how the for-profit people have hurt the internet more than they have helped it.

By turning the Internet into what is first and foremost an ad delivery system. Everything else stems first from the need to show people ads, track their online movements and ensure they are spending as much time as possible "engaging" with your "brand".

Why is it slower than ever to load a page with text and images? It's because of the invisible fatberg of ad networks, A/B testing scripts, heatmaps and analytics trackers sending and receiving requests, all of which have zilch to do with actually sending a page from the server to the client.

"All the cool things made for profit" includes the real-time bid matching system without which websites like Youtube wouldn't exist, because it wouldn't be worth the expense to Google or to some other megacorp.


That's not true; we would be talking on something else.

In fact, the forums of yore were much better communities than places like Hacker News. And sadly Hacker News is probably one of the better communities on the internet today.


Ditto. The intrusion of for-profit entities into online communities has well and truly killed the spirit. I base that on the fact that being exposed to any online destination where people congregate these days, such as Twitter, Reddit, Hacker News, or Facebook, for even a few minutes a day results in me feeling tired, demotivated, sluggish, angry, confused, and just generally down. This is in stark opposition to how hanging around in communities of yesteryear felt. Further, I'm 27, so it's not like I'm some old veteran that yearns for the 90s or something, but the quality has palpably gone down year after year. Sure, there are small enclaves within these behemoths that somewhat manage to preserve a pleasant community, but they're increasingly hard to find, and it's only a matter of time before they too lose that magic to the effects of profit-driven actions.

Welp, on we go to the next post to read the opinions of a nauseatingly self-assured tech nerd who makes an above-average living and declares everyone beneath them failed experiments of nature based on the supreme principles of the free market..


Ha. Who says you shouldn't try to make money on the web? This is a deeper argument about capitalism and exploitation of resources. Practically speaking, capitalism will always find ways to make profit from exploitable resources. If you don't like that, you have to change society.


I'm fine with that. I'm not fine with browser and HTML/etc spec being defined by for profit entities now that the w3c has been marginalized. And I'm not fine with the changes they are requiring.


>Says who? I understand that the underlying structures of the web are all about transferring documents and other resources. But Gmail showed us that web 'apps' are useful.

I have always thought Gmail wasn't really an app, and shouldn't / doesn't need to be an app. There are lots of so-called apps that are nothing more than some documents with some interactivity. But people started using JavaScript for everything. I hope Hey.com will prove this soon.


> Pfff - That's completely wrong. SPAs are all about performance.

Why are they consistently slower, then, especially when there is low bandwidth or bad reception (lost packets)? Your large JS bundle has to be downloaded and parsed by the browser. This is fine when you have high bandwidth and low packet loss, but if the opposite is true, that initial hit to get all your JS code can be really painful. Writing really fast JS is hard and is not normally something that can be done in a framework. I know React is better than things like Angular, but there is nothing faster than vanilla JS.

I write a lot of pure JS and I write stuff to be very fast. However, it requires more effort and a higher skill bar (to the point where I am weaker at server-side development and terrible with databases).

A lot of SPAs can be replaced with a postback mechanism, some caching of assets (CSS etc.), and a sprinkling of Ajax. The less you have to download and parse, the less painful things are in that situation.

The problem is that many developers don't bother to learn JavaScript properly, so unless you give them a framework like Angular or similar, you end up with the jQuery spaghetti of the past.
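For illustration, the "sprinkling of Ajax" can be this small; the URL and element id are made up, and the server is assumed to return an HTML fragment:

```javascript
// The server renders full pages as usual; this just refreshes one region
// in place instead of reloading the whole document.
async function refreshInbox() {
  const res = await fetch('/inbox?fragment=1'); // hypothetical fragment endpoint
  if (!res.ok) return; // on failure, the normal full-page link still works
  document.getElementById('inbox').innerHTML = await res.text();
}
```

Everything else stays server-rendered, so there is almost nothing to download and parse up front.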

> Says who? I understand that the underlying structures of the web are all about transferring documents and other resources. But Gmail showed us that web 'apps' are useful. Why shouldn't we pursue that? Just because the original web architecture didn't account for it?

Documents and other resources are content. It is about displaying content in a meaningful manner to the user.

If it is something that the browser can't render (word docs for example) you need to deal with that appropriately.

Sometimes a plain page with some links to download your files is more than sufficient. I used the old gmail lite version for years because it was faster and did most of the same things. If you gave that interface some nicer CSS most people probably couldn't tell that it was quite old (if done right).

> Why shouldn't we pursue that? Just because the original web architecture didn't account for it?

Generally the browser already knows how to deal with HTTP, and the OS knows how to deal with documents you download (if it has the right programs installed). So I think things generally just end up better when you try to keep things as they were intended to be built.

I do, however, have a "the water takes the shape of the container" attitude to development. Whereas others don't.


> But Gmail showed us that web 'apps' are useful.

Gmail is terrible. It's slow and most people prefer the static html version.


> It's slow and most people prefer the static html version.

The only people I've ever heard talk about the static html version are a few HNers. I'd wager 99% of people don't care, don't mind their email client's first load being slow (they just keep it open in a tab anyways), and don't even know about the static html version.


> The only people I've ever heard talk about the static html version are a few HNers. I'd wager 99% of people don't care, don't mind their email client's first load being slow (they just keep it open in a tab anyways), and don't even know about the static html version.

Are you claiming that because people are ignorant of an option that is clearly better for them, the option is redundant? So if people don't care about something, it's not better by definition of that fact?


Exactly zero everyday people switch to HTML-version. I’ve never seen anyone doing it in the wild. This sub-thread is just peak Hacker News.


> I’ve never seen anyone doing it in the wild.

You watch people as they use Gmail? You actively ask them?

I guess if YOU haven't seen anyone do it, no one must be doing it.


I would say that with the rise of mobile, most gmail users are using the Gmail app or their native Mail app.


Most people know Google’s webapps are slow, they just don’t know how to do anything about it. Or they notice it as worse battery life and fans running constantly and think there’s something wrong with their laptop because they don’t realize a web app can cause that, or even know what a webapp is.


> Most people know Google’s webapps are slow

Can you substantiate this with more than just your personal experience? This doesn't match my experience outside of the HN bubble.


Of course it's just my personal experience, but almost all of that's with "normies". Teachers (they live in Google webapps), writers (=avoid because terribly slow and input-laggy on low-end hardware, which ought to be all you need if you're trying to make money writing rather than impress people at coffee shops—and anyway it's not a ton better on good hardware), various other non-tech folks. The ones who don't complain about it do complain about its effects on their machines, without realizing that's what's doing it.


As much as I advocate for web apps, sadly I have to agree with this; Gmail is maybe one of the worst examples to use. Especially since they launched the new experience, the performance has been god-awful. And frankly, that's part of the reason I refuse to use Angular, if the company that designed it can't even optimize it properly.


Somehow everyone I know is happy with GMail and find a full-featured email client incomprehensible. I can't stand GMail's slow UI.


Try FastMail. It's beautiful and refreshing. "Old school" in every way.

If anyone from FastMail is reading this - do not change your design or UI framework... like, ever. The moment you start slowing things down with animations and bloat, that's a slippery slope...


I don’t understand how anybody can like gmail’s interface. It’s so convoluted and unintuitive.


Many people say the same about the traditional Outlook style clients. Maybe it's just a matter of what people are used to and nothing logical?


Quite the opposite; it does its job pretty well for me. It has good keyboard shortcut support, and the UI only lags if I go into 80-100-long email threads; otherwise it's pretty good.

What's unintuitive about a list of emails and a few simple features? You also boggle my mind :)


I don't like how it is formatted, for starters. Also, the reply button is impossible to find, since it's not at the top like it logically should be. Reply-all and other options are hidden instead of being exposed. When clicking on reply, it takes me to the bottom, away from the content, so it's impossible to read and respond.

It also doesn't have proper mailboxes, and rules are impossible to make.

I do not like it at all.


Your critique doesn't really carry a lot of facts, mostly your negative opinion. New Gmail isn't the fastest, but its interface is pretty much OK and its rules are great.


When critiquing an interface, it's going to sound a lot like opinion. It's still a fact that it's not a good UI.

I don't understand how you can say "its rules are great" when I can't have two different message subjects go to the same mailbox.


You've clearly not had to work in a corporate environment on an old PC with a locked down copy of Outlook...

Gawd.. now that's terrible.

But yeah... I don't agree Gmail is terrible, but I'd like it to be better.

I've been on it since a few months after it launched, and I've had a service that's been reliable and stable for that long. I have to give it a lot of credit!


Outlook 2010 is fantastic


Outlook’s search function is decidedly not fantastic.


The trick is to never sort e-mails into folders... just leave everything in your Inbox and let it index it. I know it sounds terrible, but I have 25k e-mails in my inbox (I've kept all since 2018-01-01) and I can search and sort in milliseconds


Update: OK, OK. I'm getting a lot of responses saying that Gmail is a bad example to use, because it's slow and bloated. I've responded to a few of those comments, but I think it should be addressed on a higher level.

To clarify, I said "Gmail showed us", because from what I've read, the early Beta releases are where XHR was born, which is what I view as the birth of web apps. It's the idea that you don't have to pull an entirely new document from a server to update a small part of the view, which translates into a more seamless experience typical of native apps. I think that transforms the web from a simple document transfer system into something capable of behaving like an application.

I think the explosion of JS frameworks since then is about how to effectively manage the use of XHR to update the view.
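Schematically, the pattern being described looks like this (illustrative only, not Gmail's actual code; the endpoint and element id are invented): request just the data for one part of the view and touch only that element, with no full-document reload.

```javascript
function updateUnreadCount() {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', '/unread-count'); // hypothetical endpoint
  xhr.onload = () => {
    // Only this one element changes; the rest of the page is untouched.
    document.getElementById('unread').textContent = xhr.responseText;
  };
  xhr.send();
}
```

Every framework since is, at heart, a more disciplined way of deciding when to run code like this and which parts of the view to patch.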


Gmail is literally a document browser. I think Google Maps might be a better example.


Amen. I feel like no one remembers the days of Mapquest where you'd click "next tile" and then watch the page reload just to scroll a few miles north, then repeat.


Google Maps is definitely a great example. I chose Gmail as my example, because from what I've read, that's where XHR was born, which is what I view as the birth of web apps. It's the idea that you don't have to pull an entirely new document from a server to update a small part of the view. I think that transforms the web from a simple document transfer system into something capable of behaving like an application.


Microsoft developed XHR in 1999 to support Outlook on the web.


> I find it fascinating that we are back to generating separate HTML/CSS and JS files and then putting them on a static file server — the CDN. It has been a decade long effort and as we come back to where we started, I feel like we are at a whole another level (a spiral?).

"The wheel of reincarnation" from client to server and back: http://www.cap-lore.com/Hardware/Wheel.html


No, the SPA hype has worn off, and people realize that although these new technologies exist, they are not always needed.

If you are building a web app, use React or similar.

If you are building a web site, use plain HTML+CSS w/ supplemental JS.

Sometimes your project can be split in half. Do the landing page, signup, login, privacy policy, etc. in plain HTML.

Maybe don't embed all that stuff in your app and you wouldn't struggle with forms and routing, something that browsers can do very well with no programming.


> If you are building a web app, use React or similar.

I disagree with this, there's no reason to use React/Vue/Angular for a web app by default. And I would urge people to try and avoid it for as long as possible and see how far you get. From personal experience I can tell you that most of the time, you don't need a frontend framework to build a web app.


From my professional experience throughout the last decade, you're wrong.

I urge people to practice and use the best tool for the job, don't avoid something because some hipster thinks it's too popular.

Feel free to go back to jQuery spaghetti code, it's nice until you build something too big.

React isn't a framework btw, it's just the view. You can stack anything or nothing with it.


Of course the reasonable answer gets downvoted. Stay strong! And remember that you're not a good developer unless you write everything in vim compiled for an OS that you wrote for yourself in C, backed up on a trusty old 2MB platter drive via a series of rsync scripts. I heard the concept of version control was originally conceived and canned by IBM in 1967, because it was found that version control would eventually become mainstream, and therefore bad.


Thanks for the kind words and the mention.

I'm used to the downvotes on here, unfortunately; it's easy to tell which topics will do it. The groupthink here can be thick...


You're welcome.

My struggle on HN is to try to ignore the groupthink and keep the sarcasm out of my tone, but sometimes I just can't help myself.


Preach. I teach a little "intro to web dev" workshop that starts with a slide explaining that static sites, SSRs, and SPAs are all perfectly acceptable. They're different tools for different use cases.


It’s a nice title but webapps have evolved from documents to application bundles.

It’s super cool that we built wildly complicated apps out of what was in spirit an FTP client with a PDF reader, but it makes sense that we have to hack the document retrieval system less now that the client can run code.

Maybe one day we’ll stop hacking on the document store’s limited RPC-ish protocol too.


"All of this has happened before. All of this will happen again."

I was recently considering that we went through this cultural shift with computing: we started out with a central system to run code on, distributed to terminals, then moved to actual desktop computers, then back to remote with the cloud, and now we're pushing out to "edge" computing. I'm sure it's more nuanced than that, but we seem to do these cycles, refining our processes.


It seems the trend points to slower and slower websites with more JS bloat. I also notice an increase in JS errors causing entire sites to malfunction; recently this happened to large apps like Teams and Float.


The trend in my work is to add more and more JS to get the render time below 11ms per frame (11ms for my code, plus 5ms for the browser to paint and update, for a solid 60fps). To do that takes a lot of data structure design, memoization, cache management, plus a decent understanding of how browsers put things in the DOM and on the screen. Admittedly I work on something that's a bit unusual (a diagram tool for lawyers.. kind of like Visio-in-a-browser), but I can assure you that if you want robust, fast rendering it takes a lot of code to do it well. It is not bloat.
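To give a hypothetical flavor of that kind of work, here is the shape of the memoization: key the cache on exactly the inputs that affect layout, so the expensive measurement runs once per distinct input rather than once per frame. `layoutFor` and `expensiveLayout` are invented stand-ins, not the commenter's actual code:

```javascript
const layoutCache = new Map();

function layoutFor(node) {
  // Key on the inputs that affect layout; recompute only when they change.
  const key = `${node.id}:${node.text}:${node.width}`;
  if (!layoutCache.has(key)) {
    layoutCache.set(key, expensiveLayout(node)); // the slow part, done once
  }
  return layoutCache.get(key);
}

function expensiveLayout(node) {
  // Stand-in for real text measurement / geometry work.
  return { width: node.width, height: Math.ceil(node.text.length / 40) * 16 };
}
```

Per-frame render code can then call `layoutFor` freely and pays the full cost only when a node actually changed, which is how the work stays inside an ~11ms budget.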


Why would you ever build that kind of application using web technologies?


Simple: every machine has already installed the gateway to your program: a web browser. No need to deal with issues like antivirus going bonkers, weird MSVCRT versions installed leading to support requests, you save a ton of money when you need to support OS X or god forbid Linux, and last but not least, an SaaS business model has recurring revenues while it doesn't have to deal with rampant piracy at the same time.


There's a weird subset of HN that think people should still be building desktop apps. It's like they haven't paid any attention to the software market for the last 10 years.


I for one would love the return of the good old days, where I could pay once for $software, have it on a physical medium and could work with it as long as I managed to have a computer/VM that runs it.

So much culture, especially in gaming, will be inevitably lost with the SaaS and streaming trend. For media like video, music and books at least many national library laws require the deposit of two or more copies at the national archives, but interactive media? Games that require online activation? That's all gonna be lost forever.


A browser gives you robust networking, caching, rendering, and security out of the box. SaaS means updates and versioning aren't issues. Online means I don't have to consider network installs behind law firm's firewalls. I'd prefer not to have to touch any of those myself even if it means I have to deal with JS and SVG rendering performance (which is actually really interesting).


This is a minor inconvenience for things like reddit, but the trend has also arrived at the webportal for my bank ("upgraded" to some sort of React thing, meaning it now takes 10x longer to load) and my cell provider (site no longer works in Firefox as of last month; error message: "check that you're connected to the internet!").


New reddit is painfully slow to login. So much javascript for a simple dialog to load it's absurd.


Are you aware of https://old.reddit.com (old desktop ui), or https://i.reddit.com (old mobile ui)? Also https://www.reddit.com/.compact


As long as old and compact reddit are still there, I'm fine with it. Twitter on other hand has no lightweight version, unfortunately.


Twitter's redesign is one of the most performant SPAs I've seen in a while. It's really smooth and nicely designed too. A lot better than their previous webapp.


When I open a Tweet I look at spinner animations for 10 seconds. Scrolling is also the opposite of smooth.


Scrolling is silky smooth for me and spinners go away quickly.

What hardware are you running?


For their infinite scrolling, they clean the DOM above and below the viewport and only keep a certain amount of tweets rendered. So once you start scrolling a little faster, the JavaScript can't keep pace and starts sputtering.
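That windowing calculation can be sketched like so (all names hypothetical; this is not Twitter's code). With fixed-height rows, the scroll position determines which rows should exist in the DOM; rows outside the window plus a small buffer are removed, which is why out-scrolling the buffer produces blank sputter:

```javascript
// Compute the range of row indices that should currently be rendered,
// given the scroll offset, viewport size, and a per-row pixel height.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, buffer = 5) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - buffer);
  const last = Math.min(totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + buffer);
  return { first, last };
}
```

Real feeds have variable-height rows, which makes the bookkeeping (and the sputtering) considerably worse.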


Do you know any others? Every time I’m redirected to twitter on mobile I’m in awe, it’s genuinely the only SPA I’ve ever experienced work “properly” on my phone comparable to a native app without any polish.


I think I recall someone mentioning there is one that you get if you send an empty User-Agent.


If JavaScript is disabled in your browser and you visit mobile.twitter.com, you get a link to proceed to "legacy Twitter", which is their pre-React mobile site.


> But the web is about presenting content first and foremost

This is where the article started going wrong. I don't care about the web -- it's a tool for me to do my job. I use the web because it's a much better alternative to shipping a Swing app. I don't care what the web is "about" and neither do my clients.

Also I do think it's funny the JAM stack he's referencing is just the jQuery stack of 5-10 years ago.


> And so was born PHP, it feels like a natural extension to HTML itself.

My ageing memory may be failing me, but wasn't Perl the first widespread language to be used for CGI?


Well, C was there first in that sense, but yes, you're broadly correct. But that misses the point. PHP was developed and released as a templating language - something anybody with a bit of basic HTML knowledge (and, frankly, all HTML knowledge was basic HTML knowledge back then) could use to build an interactive site by using HTML-like tags to employ code written by somebody else as widgets in an HTML page. Perl meant writing the whole thing, including emitting HTML from the code.


The leap from static HTML to PHP misses a big chunk of dynamic web site growth that started to produce a lot of the bad patterns that lived on.

As simple and nice as static HTML is, it's inefficient. The reason we care about this now is mostly:

A) A lot of people just hate JavaScript. And a lot of that hate is understandable.

B) People still worry about the crawlability of their SPAs.

JS has a lot of problems but it's also fixing some of these by growing really fast (paradoxically, also one of its problems).

Bots will get more comprehensively better at rendering client-side only apps.


I’m by no stretch of the imagination a modern web developer. I know plain JS well, HTML and enough CSS/Bootstrap to throw together an internal website.

Even then most of my experience is with server side rendering.

However, for any large project, I would much rather be forced into using a modern framework than not. Any web app that is large enough is going to have some type of bespoke framework that was created by the “architect” who has been there forever.

It’s just like dealing with a bespoke ORM, logging, or authentication/authorization framework.


Plus, shifting processing from backend servers to end-user browsers can be a huge plus if you have a decent amount of visitors. Also, simply pushing my code to S3 is so much easier for me than building a Docker image and putting it into a container orchestrator.


We are not going back full circle. We are facing new challenges and realizing how complex web development is.

1- There is no simple CSS starting point; unless you don't want to be responsive, support mobiles/tablets, or add a print stylesheet, you end up with thousands of lines. Time to add a pre-processor like Sass, because CSS was not designed for such complex use cases.

2- There is no simple HTML either. Are you going to memorize how you coded that widget? Maybe use a framework (like Bootstrap) to standardize things.

3- JS is not simple either, and the struggle is real. I remember CoffeeScript, and then how EcmaScript 2015 tried to implement some of that nice stuff. Then you have TypeScript, because if your JavaScript is complex enough you might need stronger types.

4- Then you have JS/HTML/CSS; all of the three, interacting with your DOM.

5- Then you have JS interacting with your server; because who wants to reload the page multiple times. And there are different ways to do that too: Ajax, REST, WebSocket, GraphQL.

6- Now that things are getting too complicated, maybe add a build tool in the process (like Grunt) and a package manager too (npm). They might not be that good, so time to build new ones (gulp/webpack/yarn).

tl;dr: Developing for the Web is complicated. Any attempt to make it simple is going to add to the already complex ecosystem which brought us here in the first place.
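
To illustrate point 5: a minimal sketch of the request/parse shape that all of those options (Ajax, fetch, GraphQL clients) dress up. The endpoint and the stub below are made up so the sketch runs anywhere without a server:

```javascript
// Generic "get JSON from the server" helper (fetchImpl is injectable for testing)
async function getJSON(url, fetchImpl = fetch) {
  const res = await fetchImpl(url);
  if (!res.ok) throw new Error(`GET ${url} failed: ${res.status}`);
  return res.json();
}

// Stub standing in for a real server, so the sketch is self-contained
const fakeFetch = async () => ({ ok: true, json: async () => ({ hours: "9-17" }) });

getJSON("/api/trading-hours", fakeFetch).then(data => console.log(data.hours)); // "9-17"
```

Every layer on top of this (REST clients, GraphQL, etc.) is essentially adding conventions around the URL and the shape of the JSON.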


Well, I completely disagree. Compared to developing sites 15-20 years ago, I think it's a walk in the park now. I feel like I can achieve far more with less code, the tools are more powerful, the code is easier to read, and there is much more consistency across browsers.

Did you want to make a box with rounded corners?

We can do that now... <div style="border-radius: 5px; border: 2px solid black;"></div>

It's intuitive and easy to understand. Try that 20 years ago. You're designing that in Photoshop, slicing up images for each corner, placing them into countless HTML table cells, messing around with all the table and cell heights, widths, paddings, margins, and borders, and struggling to get IE to render it correctly. That could have easily been an hour of work, and it would have resulted in a mess of images and inflexible code. Now, it's one line of code and 10 seconds of work. Did you want to fade the color of the border to red when the user hovers over the box? We can do that with two lines of CSS now. Good luck trying to achieve that in the past.
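
As a sketch, that hover fade really is just a transition plus a `:hover` rule (the `.box` class name here is made up):

```css
/* Hypothetical .box element: fade the border to red on hover */
.box {
  border: 2px solid black;
  border-radius: 5px;
  transition: border-color 0.3s;
}
.box:hover {
  border-color: red;
}
```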

What do you mean thousands of lines for a responsive/mobile CSS layout, and no simple HTML?

https://www.w3schools.com/html/tryit.asp?filename=tryhtml_re...

There, it's a site with a header, footer, navigation, three columns, and it's responsive. 36 lines of CSS and 36 lines of HTML. Add 5-10 lines of CSS to reset spacing at the beginning, and that could literally be an entire responsive website.


The basics are easier yes, but the projects have become more complex. So yes, we can easily round corners instead of silly image slices, but like anything, making things easier just bumps up the expectations.


> because who wants to reload the page multiple times

happily, please, yes.

The browser caches pretty much everything but the HTML, so the payload is minimal. A fast SSR site is a dream compared to the myriad poorly implemented SPAs out there.
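
As a rough sketch, that caching split usually comes down to response headers like these (the values are illustrative, not prescriptive):

```
# Fingerprinted assets (e.g. app.3f9c.js): safe to cache "forever"
Cache-Control: public, max-age=31536000, immutable

# The HTML document itself: revalidate on every navigation
Cache-Control: no-cache
```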

It's ridiculous going to, for example, an airline site; trying to book a flight and discovering that the damn reservation process is borked due to an uncaught exception in the latest update to the SPA (in which case the house of cards collapses, fun).

With an SSR the JS footprint is much smaller, which can lead to more stable/reliable apps.

I find the modern web to be a complete mess, but the loading spinners are nice, provided that the loading actually finishes (if it doesn't, look in the console and you'll likely find your answer in red).


>> because who wants to reload the page multiple times

> happily, please, yes.

A yes please from me, too. I like knowing, for a fact, that the input I gave your website made it to your server, your server processed it, and then I got back a response.

What I don't like is providing some data in a form, submitting it, being told, "Got it, buddy!", moving to another page on the site and then getting an error saying, "Acccctually something fell over. Sorry guy. Try again?"

Give me that slower but guaranteed feedback.


Off-topic: there is also the HTML Hell Page, which is older than many of today's web developers but still contains the most relevant information about web design.


This is a strange way to define static. To me, a static site is one which doesn't make dynamic requests. Sure, it might load in static images (even static JavaScript!), but it would run fine if all resources were burned to a CD and loaded without networking.

A static site may be archived. It may be used with any browser at all. It will probably be useful even without a browser.


I think we just all have to agree that plugging a complex, dynamic component into a web site should be done with care and attention to the details.

If you need to offer some dynamic element to your website, such as a form that does validation, then just add a form that does validation with the most minimal of code, bloat and everything else.
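
For instance, a lot of form validation needs no JS at all; a minimal sketch using the browser's built-in constraint validation (the action URL and field names are made up):

```html
<form action="/contact" method="post">
  <!-- The browser blocks submission and shows messages until these are valid -->
  <input type="email" name="email" required>
  <input type="text" name="name" minlength="2" required>
  <button type="submit">Send</button>
</form>
```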

Why is this hard?


We've been building Levels.fyi as a static website for the past 3 years! To create a new page, we simply hit new file in our text editor and start to fill the page. We have a simple build system + templating with Gulp and Nunjucks, but it's really all we need.

Scaling a static site is incredibly nice & cheap too :P


> Scaling a static site is incredibly nice & cheap too :P

And secure. The only back end systems to attack are load balancers and web servers -- if you're self hosting those then just make the file systems read-only and now you're talking seriously tough security barriers between you and an attacker.


I have a repo of static HTML pages with my personal documentation (I develop firmware). When I started it some years ago, I almost chose WordPress, since I know it from running a small store.

I didn't want the hassle of a DB, theming, etc. Why can't I edit webpages with Dreamweaver like 25 years ago?

Then I've found Hugo (https://gohugo.io/), set a template in like 5 minutes and now I edit my docs directly in VSCode and see them change in realtime. Also, the docs can be easily versioned (git in my case).

For a dumb like me (in the web-developing sense), static pages are a blessing. I am glad we are "evolving back".


The best part is that this is from an email newsletter and has broken HTML at the end.


Hello!

Article author here - I wrote this after realising that buzzwords have entered our industry again in the form of the JAMstack terminology. Hence I took the approach of looking at what is actually going on under the hood.


I am trying to bootstrap a SaaS business by myself and chose to split between an SPA + REST API for the product itself and static HTML + CSS + vanilla JS for the landing page and documentation. I chose this because I've got limited resources and am not a designer, but with React I can get a nice polished interface quite quickly using things like Material UI. I really tried hard to keep the tech as simple as possible while providing the quality I want, and this was the best option.


I'm sure that https://svelte.dev/ is not the end goal of web development, but it is the next step.

What I want to see is more support for serving svelte and integration between front end components and backends. (I.e., the backend has a list of root components generated by the front end and a route can return one of those components in a statically typed way)


Everyone looks at tooling like Gatsby+netlify to support their dreams of a JAM stack, but vanilla html/JS/CSS can do a lot written by hand too. I recently replaced a whole WordPress blog by saving it from the browser and cleaning it to the point where non-devs have an easier time updating it on GitHub than struggling with the mass of config pages and plugin shortcodes they needed before.


As stated in the article, these ‘static’ files are being programmatically generated by backend services on remote servers rather than client side, but I don’t see how this could be construed as a paradigm shift back towards static site design. How can you call dynamic, per-request content generation static design?


You serve the same HTML/JavaScript/CSS bundle to anyone who visits your site, and the JavaScript renders the request specific details on the client (by making more calls to the backend)


..and Rails, billed as the anti-enterprise framework in its early days, just added sharding support.


> Playing football is very simple, but playing simple football is the hardest thing there is

This quote is from Johan Cruyff, a football legend. I feel it's also true of programming.

Nothing wrong with the technical stacks, in my opinion: Do the right work, at the right time, with the right tool.


I don't even mean for this to be a plug but it's really easy to start up a VueJS site. I have done it multiple times.

```
npm i -g @vue/cli
vue create my-app
cd my-app
npm run build
aws s3 cp --recursive dist/ s3://my-site/
```


> But the web is about presenting content first and foremost

Surely this just translates to user first ..? Devs moving from "it's about me" to "it's about you", perhaps with a bit of a shove from google, et al.


Static site generators work so well for a lot of sites. Why anyone ever thought they needed ASP.NET MVC or PHP or whatever else and a database all for the sake of a simple blog/article site is beyond me.


My personal site is still a bunch of files served up by a cache in front of Wordpress. It'll be cool again in a couple years, I guess.


The really cool kids are just using something like Jekyll that does all the templating at publish time and you just have static files you dump onto a server.


Ironically for those of us on SSR frameworks, it was fun to just watch the pendulum go back and forth, while we just kept delivering.


Question... I keep hearing people say that when you use something like React, you need some sort of SSR in order to do any sort of SEO. Is that really necessary, since Google and other search engines can parse JS nowadays?


All of which makes the idea of some sort of JSON-driven API very, very attractive.


Would have been apt to use 'devolved'. 'Are We Not Men?'


If SPAs are the dark ages then what do you call the IE6/7 jQuery days?


The Wild Wild West!!? I remember doing jQuery/Ajax calling an API, and while it was easy to churn out spaghetti or long JS files, it was still easier to read and pick up a few years later.


> it was still easier to read and pick up a few years later.

Compared to what?


Does that mean a new version of Microsoft Frontpage is imminent?


Honestly, a CMS like WordPress is overkill. A simple client-side templating language and a JSON representation of your dynamic content is both sufficient and highly productive.
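
That approach can be tiny; a minimal sketch (the `render` helper and `{{key}}` placeholder syntax here are made up, not any particular library):

```javascript
// Fill {{key}} placeholders in a template string from a JSON object
function render(template, data) {
  return template.replace(/\{\{(\w+)\}\}/g, (_, key) =>
    key in data ? String(data[key]) : "");
}

const html = render("<h1>{{title}}</h1><p>{{body}}</p>",
                    { title: "Hours", body: "Mon-Fri 9-17" });
console.log(html); // "<h1>Hours</h1><p>Mon-Fri 9-17</p>"
```

The JSON can itself be a static file regenerated whenever the content changes, so the whole site still serves from plain file hosting.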



