Static Website Generators Are the Next Big Thing (smashingmagazine.com)
311 points by jimsteinhart on Nov 3, 2015 | 181 comments



When I was heading up reliability at Netflix, we considered, and even began evaluating, turing the whole thing into one big static site. Each user had a custom listing, but generating 60+ million static sites is a very parallelizable problem.

At the time, the recommendations only updated once a day, but an active user would have to dynamically load that content repeatedly, and at the same time, the recs were getting updated for users who hadn't visited that day. By switching to static, we could generate a new static site for you every time you watched something (which could change your recommendations), and increase reliability at the same time, so it would have been a much better customer experience. Unfortunately we couldn't get enough of the frontend engineers to buy into the idea to get it off the ground, and also they were already well along the path to having a data pipeline fast enough to update recs in real time.


I might be wrong, but I believe that 4chan does something similar to this: every time a post is made, the board is updated and new static pages are generated. All the server does then is serve these static pages.

(I can't find any official reference to this, but another user mentioned it some time ago: https://news.ycombinator.com/item?id=8060200)


That's also how the most popular guestbooks and forums in the 90s worked, e.g. WWWBoard, which seemed to be used almost everywhere for quite a while. A Perl script would generate a new HTML file and update the index HTML for each post.


The only trouble was they didn’t do it atomically, so if two people posted at the same time everything would get horribly mangled.

There was a reason we moved to database-backed sites.


I surely hope that wasn't the reason.


I've seen several startups using this technique not only for their content but also for their API responses — they would just store them in S3 with the right headers and serve them through CloudFront. My guess is that this will only get more common with AWS Lambda and other "serverless" technologies.


This is a neat story. I'd love to hear more stuff like this from other well known companies. Stories of ideas that never got off the ground but could have worked.


How is what you're saying different from "normal" caching?


The question is whether you pre-warm the cache or cache a response after it goes through. If every single response is personalized, caching doesn't buy you a huge amount because the first request will be a cache miss, and the user may never call back a 2nd time. If you prewarm every single possible personalized result then there is almost always going to be a cache hit, which in effect is the same thing.

I guess there's also a question of whether you are caching data from the backend that powers a front-end app, or actually caching the full front end itself.
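To make the distinction concrete, here's a minimal Python sketch of the prewarming side (the data source, paths and page markup are invented for illustration):

    from pathlib import Path

    OUT = Path("static")  # hypothetical output root, served by a CDN or plain web server

    def render_page(recs):
        """Render one user's personalized page as plain HTML."""
        items = "".join(f"<li>{title}</li>" for title in recs)
        return f"<html><body><h1>Your recommendations</h1><ul>{items}</ul></body></html>"

    def prewarm(all_recs):
        """Regenerate every user's page up front, so every request is a cache hit.
        With on-demand caching you'd instead render inside the request handler on a
        miss - which, for fully personalized pages, is nearly every request."""
        for user_id, recs in all_recs.items():
            page_dir = OUT / str(user_id)
            page_dir.mkdir(parents=True, exist_ok=True)
            (page_dir / "index.html").write_text(render_page(recs))

    prewarm({42: ["House of Cards", "Daredevil"], 43: ["Narcos"]})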


This sounds crazy enough to work.


What did these frontend engineers dislike about the idea?


"turing the whole thing", Alan should be flattered.


I'm not entirely sure they're the next big thing. More likely, 15 years ago, people who were used to static sites began moving to dynamically generated ones as sites became more complex. Those were the new thing then. Now we have a load of people who have grown up with dynamically generated sites and are suddenly discovering the benefits of static sites - thanks in part to the proliferation of tools that are easy to use.

It's the usual boomerang cycle of discovery and adoption.

Both types of sites have their benefits and it's a balancing act to use the right tool for the job. It's getting this right that comes with experience and an understanding of the current pitfalls of each. It's the rough edges that push people in the other direction and without the experience of the pitfalls of each, it's inevitable that people start predicting that one solves all the problems facing the other.

I fully expect the usual over-reliance on the wrong type of tech for the sake of it being the current hotness, and then an overcorrection in the other direction the moment we have a new generation of developers.


15 years ago the hot new general purpose CMS was Movable Type (https://movabletype.org/) which was... a collection of Perl scripts that ground out static HTML pages. In other words, a static site generator.

Then everybody got tired of waiting for their sites to rebuild every time they changed something and switched to WordPress, which wasn't static. Suddenly your changes showed up right away! Hooray! Then everybody got tired of WordPress falling over under any load stronger than a stiff breeze, so suddenly static site generators were in fashion again.

If you think of approaches to building a content management system as a continuum, with purely static at one end and purely dynamic at the other, you can see the entire history of the segment as a series of oscillations along that continuum. Each approach has drawbacks, but the drawbacks of the approach you aren't using always seem minor while those of the approach you are using seem painful, so the market just bounces back and forth between them ad infinitum as people rush to discover if the grass on the other side of the fence is really as green as it looks.


15 years ago this made even more sense. I wrote a static site generator in good ol' Perl 5 in the 90s – and I'm sure I wasn't the only one to do this. (Everything was based on file hierarchies, but not much other structure or setup was required. Directories were mangled into intermediate data files in pseudo-XML, only to be reassembled when there was an update local to that hierarchy. XML wasn't the big thing then, but as this had to mangle HTML anyway, it was somewhat obvious to embed data structures in tags.)

Back then, this really had some advantages: updates were rare but views comparably frequent, while databases were either not that performant or quite expensive. This way, you could serve everything from cache (remember Squid?), and, compared to a dynamic site, it was really quick, even in admin mode. Given modern machines and the lots of memory they come with, it's quite ironic to see this come back after we saw the triumph of the LAMP stack on comparably modest machines. Nevertheless, if you have only a few updates and lots of views, it's a good idea towards green computing. (Save some mountaintops! [1])

On the other hand, there is some "magical" limit regarding flexibility and complexity, beyond which things soon tend towards unmaintainable code. So, the judgement is left to you, with respect to the purpose.

Edit: [1] "Mountaintop mining" at Google-images: https://www.google.com/search?tbm=isch&hl=en&q=mountaintop+m...


15 years ago we spent $250K for an off-the-shelf CMS that was a static site generator with workflow. Worked perfectly fine even if it was way overengineered.


You're totally right. I think there's a rediscovery of static sites going on, and an improvement in the tools, but it can't possibly solve every use case, or even the majority of them.


Every time static generation rears its head, I'm reminded of Yahoo!'s ... unique... take.

Back in 2006 when I worked for Yahoo!, they had a CMS / template management system called Jake that statically generated templates for the PHP-based frontend servers to evaluate at request time. The idea was that you put as much of your logic as possible into the template generation layer, leaving the request-time logic to handle the stuff that changed request by request.

Now, that all sounds quite reasonable, but the two layers were written in different languages. The pre-template-generation logic was written as inline Perl (plus a little custom syntax, because why not), while the dynamic frontend logic was written in PHP. Perl was frequently used to generate chunks of PHP code to be executed by the frontend servers, and sometimes this PHP code wrote chunks of inline JavaScript. To say that debugging said JS was fun would be an understatement.


Jake was built for news articles and prerendered all pages. News staff were able to quickly localize templates (10+ languages, 20+ products). That was 1999, when we didn't use Javascript or CSS, tested pages on Netscape Navigator 2, and WML (for mobile phones) was becoming a hot topic. Later Jake was misused for all kinds of other products, mainly because of locali[sz]ation, permission management, etc. Yeah, debugging was hard. It was the right tool at that point in time to handle high requests-per-second. Yahoo hired Rasmus Lerdorf and switched to PHP starting in 2002.


> Yahoo hired Rasmus Lerdorf and switched to PHP starting 2002.

Well, that's a bit of an exaggeration. When I left Yahoo Europe in 2005, there was still Perl all over the place both in Europe and the US at least. I managed the Yahoo Europe billing system, and that was mostly Perl on the backend, for example.

[small world, btw., courtesy of some minor profile-stalking: I interviewed with Ed about a position a few years back; your service looks interesting - I have a client that might be interested]


Yahoo Europe's progress was from Perl generated static html, to Perl generated PHP.

What Rasmus' hire did was push Yahoo to allow server-side scripting languages on the web server. And that's where PHP was the blindingly obvious choice. (though, I would not be surprised if there was a bun-fight with mod_perl...)


In the teams I worked with, there was thankfully no Perl-generated anything (but I know my boss had to deal with that with a couple of the other teams reporting in to him). We had plenty of Perl on the backend, though, and the US billing team constantly breathing down our neck to do their OneRewriteToBindThemAll in PHP (they had been working on that for a while when I joined in 2003, and were still working on it when I left beginning of 2006; it was a laudable goal - there were something like 8 billing systems worldwide at that point - but it took them a lot of time to figure out how to unify all the international requirements).

The funny thing was that on the instances I heard Rasmus talk, he complained we were taking it too far - he wanted simple PHP templates, not the kind of large PHP applications the US billing team and others were doing. He sounded quite exasperated about it last time I was at one of his talks.


Yup, sounds a bit like a project I did a while back - I had a C/C++ bindings generator that took an XML file and generated the C/C++ code to expose a C/C++ class to Javascript. So of course the logical thing for me to do was write the thing in Ruby.

That said, all code-generation tools - straight up generators, compilers, whatever - are a mindbending experience. Keeping track of whether a variable is available at generation time or runtime is trickier than I initially expected.
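A toy Python example of that split (the emitted language is JavaScript purely for illustration; all names are made up):

    def make_handler(api_version):
        """Emit JavaScript source. api_version exists only at generation time;
        payload will exist only when the generated code eventually runs."""
        return f"""
    function handle(payload) {{
      // "{api_version}" was frozen in by the generator; payload is a runtime value.
      return fetch("/api/{api_version}/submit", {{ method: "POST", body: payload }});
    }}
    """

    # Mixing up which of the two "variables" is live at which stage is
    # exactly the classic mistake.
    print(make_handler("v2"))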


My version of that was parsing generated C++ code headers through Doxygen (XML format) and then writing up a Ruby script that would spit out generated C++, C#, and Java to expose the SOAP APIs we needed to do. It was fun, and I think the company is still using it, four years after I left.


FWIW, towards the tail end of my time at Yahoo! I played a bit part in a team working on a more modern, open-source, PHP-based version of the template localisation system that was at the heart of Jake. Remarkably this still exists on the Internet, though it hasn't been touched since 2009 when most of us left.

http://sourceforge.net/projects/rthree/

Looking at the source now, I think we might have been trying to use every pattern in the GoF book in one project.


This is the first story I've heard that explains why Yahoo's navigation is so horrible and why your logged-in state on your Yahoo account doesn't follow you to different Yahoo sites (business, mail, etc.).


That doesn't really explain it. Stuff that really needed to be dynamic would be. But we (I was at Yahoo 2003-2005) took privacy seriously and a lot of seeming quirks of login state etc. were well thought through and had good reasons. E.g. we were regularly reminded not to mingle data that identified browsers (which could "follow" a logged-out user across sites) with data that identified users (as that would let us tie a userid to data the user explicitly had not knowingly shared with us).

And many particularly personal properties such as mail, or the billing system (my area), would take extra precautions about what information could be made available on other properties (e.g. what info from mail could be shown on the homepage) even if the user was logged in, to prevent leaking information that shouldn't leak. This would lead to extra logins that I'm sure seemed unnecessary, and logins where the user probably thought they were already logged in, but where non-personal information was keyed to the browser rather than their user id.

I'm sure there are bugs and unintentional quirks too - the system was crazy complex already in 2005, but I really hope they've stuck to their guns when it comes to how carefully they treated personal data back then.


Well, stuff like putting the "logged in Yahoo account" indicator in the same place on each page, and indicating whether a password prompt was a secondary verification (since you were already logged in) or perhaps a request for a different password.

Yes there were such horrible, glaring usability bugs it's quite amazing anyone had the patience to deal with it. I still cringe when I have to navigate anywhere on Yahoo at all, and actively avoid using anything associated with it.


The reason I like static site generators so much is that they allow me to treat my website as a program that outputs a website. Take some posts in whichever format you prefer, write a program to parse them into a tree of HTML nodes, insert them into templates for HTML pages, Atom feeds, etc. It's all just plain text code and markup, no stateful database with a dynamic web application in front doing everything.
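The whole pipeline can be tiny. A minimal sketch in Python (the template and directory names are arbitrary):

    import html
    from pathlib import Path

    # A made-up page shell; a real generator would use a proper template engine.
    TEMPLATE = ("<html><head><title>{title}</title></head>"
                "<body><article><h1>{title}</h1>{body}</article></body></html>")

    def build(src="posts", dst="site"):
        """Compile a directory of plain-text posts into a directory of HTML pages."""
        Path(dst).mkdir(exist_ok=True)
        for post in sorted(Path(src).glob("*.txt")):
            title, _, body = post.read_text().partition("\n\n")  # first block is the title
            paragraphs = "".join(f"<p>{html.escape(p)}</p>"
                                 for p in body.split("\n\n") if p)
            page = TEMPLATE.format(title=html.escape(title), body=paragraphs)
            (Path(dst) / f"{post.stem}.html").write_text(page)

    build()  # the website is literally this program's output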


Exactly.

There's a spectrum of evaluation strategies from eager to lazy. Programs may mix various flavors.

Lazy evaluation combined with memoization is a sexy and elegant approach to some problems, as are dynamic web sites. On the other hand, whenever possible to do so, it's hard to beat the speed and simplicity of handing over a precomputed answer like "42".

A (very loose) analogy:

Static generation of HTML is roughly like using Lisp macros to evaluate some things before runtime (at "expand time" or "compile time").

The resulting transformed code could be a simple literal like "<html> ... </html>". Or the code might need further evaluation at runtime -- which is roughly like the precomputed HTML containing JavaScript to do things at runtime.


Funny that you mention Lisp. I am writing my own static site generator (because there are so few of them and everyone needs their own) in Scheme.

http://haunt.dthompson.us


Cool. Mine's in Racket:

https://github.com/greghendershott/frog#frog

> (because there are so few of them and everyone needs their own)

A million monkeys.... :)


Oh, didn't know you were the frog author. Awesome!


The downside to static site generators is that you recreate the entire site every time you run them, even for pages that won't change (which is most of them, most of the time). It's a function of:

   template(data) -> html
I'm a fan of the inverse of this. The template and the output (html) are the same thing; I don't separate them. Instead I just update the html when something changes. The function is:

  update(html, data)
Where update is a function you write that modifies the html in place.

The upside to this is that generating new content is cheap, you only ever update the things that need updating.

The downside is that your html output and updating function are tightly coupled; you can't change the structure of your html without also changing your update function.

I think that's ok though. Template changes are infrequent enough to warrant the increased cost of fixing everything when you do change them.
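For the record, here's roughly what that looks like as a Python sketch using lxml (the ul with id "posts" is just whatever structure your update function happens to be coupled to):

    import lxml.html  # third-party: pip install lxml

    def update(path, new_posts):
        """Mutate the existing HTML file instead of regenerating it from a template.
        Tightly coupled to the markup: it assumes a <ul id="posts"> is present."""
        doc = lxml.html.parse(path)
        (post_list,) = doc.xpath('//ul[@id="posts"]')
        for title, url in new_posts:
            # titles/urls assumed already escaped, this being a sketch
            item = lxml.html.fragment_fromstring(
                f'<li><a href="{url}">{title}</a></li>')
            post_list.insert(0, item)  # newest first; all other markup stays untouched
        doc.write(path, method="html")

    update("index.html", [("A new post", "/a-new-post.html")])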


Have you had a look at Pollen, by Matthew Butterick?


Yes. I like Butterick's "Practical Typography" a lot, and the rationale behind using Racket. That said, I don't actually like Pollen because it's just a string-based templating system. Instead, I want to use something that is based around s-expressions. My static site generator, Haunt, works with SXML trees instead of raw strings which is much nicer to work with.


> just a string-based templating system

Not sure what you mean. Pollen is derived from Racket's text-based DSL, Scribble. Like Scribble, Pollen lets you embed Racket-style S-expressions in your documents. Moreover, Pollen documents compile into X-expressions (= the Racket equivalent of SXML).


It's an interesting idea, and I see the appeal if the site is fairly basic, but I think there's one thing people are forgetting here.

You're outsourcing half your site to third parties, and basically letting them do whatever the hell they like with it. Disqus comments? Better hope the people behind that system don't decide to outlaw comments about the thing your website is about. Javascript embedded shop system? Good, so long as you don't need to modify the look very much and don't mind all your data being hosted in a different part of the world (like, the US for people in other regions).

And if they decide that all your data needs to be shared with the NSA or some other government organisation, then tough luck. If they get hacked... well, tough luck again.

Without hosting such systems yourself, you're relying on a lot of third parties to be transparent, honest and respectful of your privacy (and that of your visitors). It's basically like a return to the days of free hosting and services like Bravenet.


Personally, I generally trust a third-party's shop system significantly more than I do something homegrown for a typical small business. Credit card security is hard. Why force everyone to reinvent the wheel?


Most card details are stored by payment processors, not your own server or a third party shop provider's servers. Paypal and Stripe and various others are probably more trustworthy than anything that stores the details with the shop itself.

But that doesn't really apply to a lot of things. Comments for example, do you really trust a third party more with those? Because if your site is in a grey area, then it's very possible their terms/country/whatever might require them to ban discussion of the topic. Self hosted means your rules, not a large corporation's.

Besides, any middleman is the weakest part of the chain if someone wanted to shut down a site or significantly cripple it without going through a court case. You may like controversy, but a large company would rather see the back of anyone that might potentially hurt its public image. We already see issues where internet mobs go after hosting companies and providers based on something someone said on Twitter. Every third party service is yet another potential target for them, and one that could buckle even more easily than the hosting company (especially if you're not paying for their services).


I've been a fan of static websites for a long while now.

In addition to page load issues, they also more or less completely solve the Slashdot effect (aka the Reddit Hug of Death, these days). A competently configured Nginx server on a 512 MB VPS, serving static-only content, will handle ridiculous amounts of traffic without flinching.

Ever since a front-page mention on BoingBoing took down my feature film BloodSpell's site immediately after release in 2007, avoiding server load on a site has been high-priority for anything I'm launching that is likely to have bursty traffic.

It's nice to see usable tools for managing larger sites with a static generator developing and becoming popular.


But setting cache headers and letting nginx cache your site would also take care of load, right?


Do you - or anyone - have stats about the possible traffic demands of Slashdot/Reddit/HN front page stardom?

I'm very much in the Nginx/static camp, but it would be useful to know how bad the spikes can get.


I spoke with Alan Bellows from 'Damn Interesting'[0] on Reddit after his site hit the front page via a TIL post and he revealed a little about the traffic load.[1]

It was very interesting to see and was quite a lot of traffic for sure.

Bear in mind that the screenshot he posted is for concurrent visitors too.

[0] http://www.damninteresting.com/

[1] https://www.reddit.com/r/todayilearned/comments/3d3vct/til_a...


I had no expectation of encountering my own name in this context. If there is a German word to describe the thing my neurons just did, I'll bet it has a lot of umlauts.


Ha! Sorry! I know the feeling too, there's a guy who keeps a mighty big interest in a website I run and I stumbled on his catalogue of its/my goings on.

Hopefully this doesn't reach that level of awkward and your info was really cool to see in that Reddit thread.


No apologies necessary and no awkwardness implied--it was merely a surprise. A bit like being far from home and hearing someone shout my name.


Good lord. That's a lot of traffic.


I got around 50k hits to my "stupid cert tricks"[0] page from a combination of twitter, reddit and hackernews over a period of 48 hours. My site is static content generated with pelican[1], served by nginx and hosted on a $10/mo VPS. The page has one image, one stylesheet and a favicon. Peak rate was around 1.5 views per second (loads of the actual page - this does not include requests for assets). CPU usage was negligible, I'm sure it could have handled at least 25x that no problem.

0. https://rya.nc/cert-tricks.html

1. http://blog.getpelican.com/


Here's one for HN, though Reddit can get much worse: https://levels.io/hacker-news-number-one/


HN throws about 20k - 40k visitors over a day or so, in my experience (off the top of my head, without looking at stats). My stuff tends to be tangential to the primary interests of the userbase, though, so if you were very codey / techie you might see a lot more.

Reddit depends wildly on the size and activity of the subreddit you're featured on. Imgur shows view stats, so you can get some idea from that.

For scale on whether nginx can handle that sort of load, I've had tiny VPSes sitting there happily handling 200 SIMULTANEOUS users serving static files, which translates to between 500,000 and 3 million uniques a day, maybe more depending on your site design.


A TechCrunch article used to get you about 15-20K in a day. Might be less these days.


People are finally starting to realize that CMSes are horrible. They got us through a time period where the front-end developer was a scarce resource, but now there are enough of them everywhere that we don't need WYSIWYG web content anymore. CMSes are eventually going to be relegated to the sole proprietor who doesn't want to learn HTML but still needs to maintain his / her own content-oriented website.

My company is handcuffed to a legacy custom CMS that's still using Rails 2.3.8. We have no need for it, as we have a front-end guy who would be perfectly comfortable using git, as he has me to ask whenever there's a problem. When you use the database to store content, you lose a great number of useful properties. I had to build a custom tool to search through the entire database to find encoding errors. And I had to keep re-adjusting it every time I found some hiding somewhere. It was annoying and painful. Content is code, not data, it needs to be managed like code.


Couldn't disagree more. CMSs enable non-developers to publish good-looking content, which is basically what the web is. You can have a small army of 10-20 writers and 1-2 front end devs styling templates for them. It scales, and it makes sense.

Content is certainly not code; it does not get compiled, it contains no logic. At best content is a property of an object with its structure defined in/as code.


> CMSs enable non-developers to publish good-looking content, which is basically what the web is. You can have a small army of 10-20 writers and 1-2 front end devs styling templates for them. It scales, and it makes sense.

Assuming that the content you're publishing is primarily writing, then yes, that makes sense. But most companies on the web are not primarily blogs or news. Hell, even the way news is going, a lot of these Snowcrash-type (or what was that famous NYT article called?) articles are too complicated for a template-driven workflow.

The company I work for is a marketing company, it primarily uses the Internet for e-commerce. Our marketing team gives their ideas to our front-end guy, including mock-ups and copy, and he goes ahead and makes all the HTML/CSS.

I think most companies on the web wouldn't be able to tell the difference between a static site and a CMS, because non-technical people won't even touch an admin back-end anymore. There's just too much to screw up to leave in the hands of a non-professional.


The article you're thinking of was "Snow Fall".

http://www.nytimes.com/projects/2012/snow-fall/


> But most companies on the web are not primarily blogs or news.

Sure, I agree. I've built product sites like magento.com and e-learning sites on CMSs. It's easy to knock out, say, 15 different page templates for different use cases. On bigger sites I build it out so you can customize your page by dragging sub-templates around to build a page out of it.

We've solved/attempted to solve the "don't screw up our backend" problem by exposing fields to fill out on a page. You want a headline, fill out this field and drag it to the top. You want a video background, paste the youtube link here, etc.


Actually, the moment we finally stop treating computers as glorified typewriters and stop using paper and desktop metaphors, content will indeed be code. Good web content in the future, I believe, should have interactive visualisations/simulations and a multitude of possible user interactions (see for example http://worrydream.com/Tangle/). Publications like the New York Times are already implementing this, and I also expect it to happen for educational content (see Khan Academy for some examples).



However that only works at that scale.


It works at small scale too when you download a free theme on wordpress or whatever.


As a front-end developer I still prefer to think of ourselves as a scarce resource.

For selfish reasons I admit.


The thing that always pulls me back to WordPress is the ease of creation and hosting. Drag an image into your text, and pop, it's uploaded, hosted, and linked to. Write something in the interface and bam, it's online and reliable. Every static site generator I've played with (and there have been TONS) solves 80% of the problem.

I'd love an app that gave me that ease of hosting and creation and generated a static site from there (hook it up to an S3 bucket or something). I'd pay for sure.

Currently I'm using Nginx w/ SSIs for lightly dynamic sites. Works well enough and is very very simple.


Have you tried Webhook?

http://www.webhook.com/


Lest you close it after seeing the dollar sign like I did:

Keep reading, it's just the optional hosting that's paid, the software itself is open source.


FYI: I've put together a showcase of the world's greatest static sites [1] (using the Jekyll machinery). Examples include: Bootstrap, Google Polymer, Facebook React, Open Data Handbook v2, PHP: The Right Way and many more. Cheers. [1]: http://planetjekyll.github.io/showcase


I love Pelican as a generator, it's great.

The one static generator I wish there was (unless there is one and I just haven't found it) is one that would take a tree of code files and display it kind of like GitHub does: a browsable file hierarchy with syntax highlighting when drilling down to individual files.

Kind of like a precompiled static file browser. There are several dynamic file browsers around, but they all require server-side code (usually PHP) to do the directory listing and so on. I think it should be possible to precompute all the directory display pages with symlinks to the individual files, and do the highlighting in JS/CSS.

I might end up writing this at some point, as it's definitely an itch I'd like to scratch - unless it already exists and somebody kindly points me to it.


There are a few projects that will take a tree of source code and generate HTML, like LXR:

https://en.wikipedia.org/wiki/LXR_Cross_Referencer

And I think there is one for GNU GLOBAL. The point of these is usually the cross references, not necessarily making it pretty though.


The only problem I feel this really solves is caching, and for that specific problem, generating static pages may be a (work-aroundy) solution to consider, but in general I don't think static website generators are going to be the next big thing...

When you look at page speed, not much of the slow-down comes from servers delivering pages, but from browsers having to digest HTML/CSS/IMG/JS.

I guess it all depends on what you are trying to achieve. Thanks as always for the article, interesting to see someone taking this with both hands.


The more important problem it solves is security. Caching can easily be handled by adding something like Cloudflare in front.

What local static site generation, with HTML, JS and CSS pushed to a remote server, solves is the giant problem of insecure code.

> more than 70% of today’s WordPress installations are vulnerable to known exploits (and WordPress powers more than 23% of the web).

I did some consulting for a company that was being destroyed by wordpress installations being hacked. When they first started offering wordpress for a blog option (blogs were not their main offering) the threat profile was different. Fast forward six years and they were getting hacked daily.

I recommended that they move their wordpress blogs to flywheel, and have flywheel manage wordpress for them. It worked and they were able to focus on their main offering, the real reason customers were paying them.

Services like flywheel are one answer to the problem of insecure CMS code. The other, and better in my opinion for a lot of sites is to run the CMS locally, keep the database local, and push the rendered code to the server.


> When you look at page speed, not much of the slow-down is from servers delivering pages, but browsers having to digest HTML/CSS/IMG/JS

I think both can be to blame, although you're right that JS load is becoming more and more of an issue. I used to work on a site that used Joomla and literally hundreds of separate database queries were executed for each page request. That is going to put a huge strain on server-delivery time, and it's unsurprising that this site couldn't handle much of a load before falling over.

Static pages are really so much faster than any dynamically-generated site I've seen. Still, I agree about JS - it's possibly time for a 'back to basics' approach on client-side scripting akin to the static page revolution that has occurred on the server-side. At the very least, I think the 'many scripts, many sources' issue needs to be resolved.


I think you're vastly overestimating how well engineered most SME web-platforms are when you say that the client-side is slower than the server side. There are a huge number of server-side asp.NET webforms/JSP/php sites out there which are far slower server side than client side.

And it's not just legacy code, new applications in 2015 are still using those mostly server side technologies without too much done on the front-end.


And thank god; these new JS-heavy frontends really hammer browser performance and are often buggy.


Given that the browser should be about interactive documents, that is how it should be.


I can't wait for Ghost's API.

I've been following their repo for a while; hopefully by next year they'll be far enough along for it to be usable.

Ghost has an excellent editor and it would be awesome to have a static site built in any way you like that links straight to your blog posts via API calls.

They have a trello card that mentions it https://trello.com/c/QEdjRlgK/67-open-public-api-via-oauth-a...

and are working on it as we speak :) https://github.com/TryGhost/Ghost/issues/4004

I know there's a wordpress API as well but I find wordpress too bloated IMO.


I've only done static websites for a long while now. I created ThinCMS as a browser-based tool for building and publishing static web sites. It came about after I learned XSLT well enough to bend it to my will. I used it to build several public and private web sites, including two iterations of longwoodgardens.org (they've since moved to Drupal) and pittsburghtoday.org (where it is still used). XSLT (and probably other static templating engines) is perfectly capable of generating complex nested navigation. The templates themselves are nested three layers deep so as to keep things DRY.

The PittsburghToday site is representative of the idea that a static web site is only static in the technical sense of the back-end content serving. The front-end is still dynamic since the data for the charts is being obtained from Google Docs and the Twitter feed from Twitter, etc.

I always felt like the odd man out, so I am glad to see strong interest in static web sites nowadays.


Nowadays I'm experimenting with client-side nested composition with HTML includes. But I'm also giving ASP.NET MVC 6 a spin.

However you go about generating the HTML that a visitor sees, you still have to contend with how to create tooling that satisfies content authors. My position has always been "use whatever tools you want - I'll figure out an automated scheme to convert it". I think that with static site generators it is easier to have that separation of authoring and publishing.


As the primary maintainer of Pelican, a Python-based static site generator, I've given several talks on the resurgence of static site generators and in which cases they are (and are not) a good fit.

My most recent talk was two weeks ago at PyCon Japan. Following are my slides for that talk, in case that's useful for anyone who wants to get a better understanding of SSG history and advantages/disadvantages: http://justinmayer.com/talks/

Also happy to answer any questions here, of course. (^_^)


I've been using static site generators a lot for client work over the last two years. I started off with Jekyll, but unless you have under 100 pages, the build process gets painfully slow (trust me, I have micro-optimised).

I've since started using http://gohugo.io

It's lightning fast with 1000s of pages, and quite easy to pick up.


I tried both Hugo and node.js-based Metalsmith[0] for a new version of my personal website that I launched just about a month ago. Ended up choosing Metalsmith because of its super minimalist everything is a package approach, but Hugo looked super nice too.

What I like about Metalsmith is you build your own workflow. A basic installation does nothing but copy files from the source directory to the destination directory.

I'm working on a beginner's guide to Metalsmith right now, in fact!

[0]: http://www.metalsmith.io/


Go templates are not nestable. Hugo lacks support for Pongo nested templates, and refuses to include it.


... and easier to deploy.


I would be equally interested if the article was titled "Why Static Website Generators Are Awesome". They are not the "next big thing", they've been around for years.


The next big thing doesn't mean they are new though, just that they will be the next thing à la mode.


I am working on a project, http://blogful.me, that combines a hosted static blog generator with a solid admin backend including post syndication, an embedded analytics dashboard, authoring tools, and an API if you don't want to use the frontend.

There definitely appears to be a lot of interest in this space because you get the best of all worlds. Static site generators definitely seem like the way to go for all but actual web applications.


I'm still looking for a static site generator that's as easy to use as Wordpress (so with a UI, not markdown). Anyone know of something that fits the bill?


I was looking for the same thing, so I built it. It's a WordPress plugin that outputs a static copy of your site to a ZIP or a directory of your choice. The thought being that you put WordPress on a subdomain (e.g. wp.example.com) and have the main site (www.example.com) served by the static copy.

https://wordpress.org/plugins/simply-static/

It's still a work in progress (it launched a little over a month ago) but the feedback so far has been positive.


I did something very much like this for a while. I'd use WordPress to create content on my laptop, then I'd crawl my own site using wget to collect the results and push those up to the real website. Turns out there are a few gotchas that simply-static didn't address, but I wrote a blog post about my experiences here.

http://pl.atyp.us/wordpress/index.php/2013/04/using-wordpres...

The older posts on my site were all converted this way, though the newer ones are done directly in Pelican.


Crawling your own site to output a static site seems so simple it is genius. Set up a cron job every X minutes. Just... so... simple. I will do this!


When Healthcare.gov was first built, before the "enroll in insurance" feature launched and failed, it was a static site managed by a Github-based editor called Prose.io.

https://developmentseed.org/blog/new-healthcare-gov-is-open-...

https://developmentseed.org/blog/2012/june/25/prose-a-conten...

I have not used prose.io, but the company that built it, Development Seed, is strong. You might have heard of one of their other projects: Mapbox.

I have no idea if prose.io is being currently maintained. Development Seed was in the middle of transitioning to doing Mapbox full time, and Healthcare.gov was their "one last" consulting project. By all accounts their work on the static part of the site was great, but was totally overshadowed by the failure of the enrollment feature.


You might want to look into CloudCannon (http://cloudcannon.com/). It's still an early product, but the idea is pretty cool. You auth CloudCannon to your Jekyll GitHub project and it exposes a Web GUI that allows content editors to do CRUD operations. CloudCannon then commits those changes directly to the repo. Then a GitHub webhook with something like Travis CI can build the site and deploy the changes.

This is great because it allows engineers to maintain a simple static website and content editors can use a web based GUI that is kind of like WordPress (though CloudCannon has a long way to go on the usability of its web GUI). We used CloudCannon at Hillary for America for a small project. It ended up not being the right tool for the job, but I definitely think there is a use case for CloudCannon and the team behind it is super open to feedback and iteration.


If you're looking for a static site generator that's as easy to use as Wordpress, you could consider using Wordpress. Wordpress caching plugins can generate static HTML which then gets served to visitors directly by Apache or Nginx, without hitting PHP at all.


What I'm looking for is a plugin that can make an entire WordPress site static, to freeze sites in time so they don't require WordPress, PHP or MySQL.

Another workflow could be to use WordPress locally in a VM to edit, then generate a static site you upload through git or whatever to a bare-minimum server.

Any plugin that fit the bill? I remember searching but never found something perfect.


...and this is why we're not moving to SSGs even though this almost exact article gets posted monthly here.

It's trivial to put in a cache layer that generates and stores static HTML in front of WP, Drupal, etc. So you get the best of both worlds: the tools that dynamic CMSes give you and the performance of a static site.

I think it took me 5 minutes to install Varnish on a WP server I have. Varnish delivers these pages straight from RAM. My page-load performance is fairly absurd. If that's too technically daunting or your webhost doesn't support Varnish, Total Cache is also good. Boost for Drupal is good too.


It's far from optimal for deployment, though, as you still occasionally depend on that WP site, and then have the hassle of handling availability etc.

I wish more CMSs would support exporting all changes directly, so that you could e.g. rsync the sites out to edge servers or a CDN and be able to use it for full failover too without suddenly having stuff expire from the caches at the most inconvenient times.

Movable Type used to be able to export everything to flat files, for example (though, I guess more out of necessity back in a world where not everyone would have a host that could/would provide PHP).

I just started redoing my blog to iterate over all the pages, wrapping my Sinatra app to fake requests and generate a full static copy that way. It lets me optionally serve the site dynamically while writing, and then generate a static copy. It's just a few hours' dirty hack for now, but I'll be doing this more often - it's so nice to have most of the content completely stateless.


I believe WordPress is bundled with a plugin that does that, right?

I think the main problem is that it requires an extra step when you're running Apache (modifying .htaccess) to become truly static.


We're very close to launching a product that serves this exact purpose. [1]

Our primary motivation was being able to build a static website as easily as using WordPress - so we figured, why not turn WordPress into a static site generator?

As notacoward mentioned, there are a few gotchas, so we decided to launch it as a SaaS in order to abstract those things away and make it fully hosted on a CDN out of the box. We still give you full access to the WP install though, so no vendor lock-in or shackles for the customer.

Happy to give anyone a demo if you're interested. You can contact me at mathias AT dotsqua.re

[1] http://spudpress.com


I was toying with the idea of building an electron based UI frontend for Jekyll, where you'd start it up and point it at a jekyll project dir. It would essentially work like a stripped down cms admin running locally, and update your jekyll project files. Does something like this exist?


Dreamweaver ;P


Last I used MovableType (a few years ago), it seemed to fit the bill.

edit: apparently it's mentioned in the TFA


I wrote a maven plugin to put all the markdown files together. I use iA Writer as the UI.


Bricolage, in theory, but I never got it running.


When I worked on a video search engine for UK football (soccer) clubs, we made the whole thing static.

We figured that the inputs to the search were from a static range, i.e. these players, those games, that league, this type of incident (foul, goal, celebration, etc.).

Then we pre-calculated all possible combinations and fired them through what we called a "cache cannon".

It was highly parallelizable, simple to store on disk (we stored JavaScript files whose names were the form inputs), and worked extremely well.

Even for something like a search engine, unless you're doing full text search over a very wide corpus, you can look at pre-populating a cache and that cache actually being stored on your web servers and being directly addressable.

The design above allowed that search engine to work over the weekend peak of 2 million users. That's where it shone... we just did not have to worry about the thundering herd with a pre-populated cache.
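In spirit the cannon was no more complicated than this Python sketch (the input ranges and file naming here are invented for illustration):

    import itertools
    import json
    from pathlib import Path

    # Invented input ranges; in practice these came from the league data.
    PLAYERS = ["rooney", "gerrard", "lampard"]
    INCIDENTS = ["goal", "foul", "celebration"]
    LEAGUES = ["premier-league", "championship"]

    def search(player, incident, league):
        """Stand-in for the real (expensive) search backend."""
        return [{"player": player, "incident": incident, "league": league}]

    def fire_cache_cannon(out="cache"):
        """Precompute a response for every possible combination of form inputs.
        The file name is the query, so the web server can address it directly."""
        Path(out).mkdir(exist_ok=True)
        for player, incident, league in itertools.product(PLAYERS, INCIDENTS, LEAGUES):
            name = f"{player}-{incident}-{league}.js"
            body = "var results = " + json.dumps(search(player, incident, league)) + ";"
            Path(out, name).write_text(body)

    fire_cache_cannon()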


... because Javascript and external APIs can now do so much.

If I can do everything with JS in the visitor's browser, why not host some shell HTML on S3 and never worry about a server? Maybe hit AWS Lambda if need be for one specific thing? Dunno. The age of the do-everything server seems to be coming to a close.


Progressive enhancement truly is dead. :(


FYI: Join the "movement" and start a static site user group. For example, I've started the 1st one in Europe, that is, Vienna.html [1]; others in the U.S. include Static Web Tech in San Francisco [2] and {Static is} The New Dynamic in New York City [3]. Cheers. [1]:http://viennahtml.github.io [2]:http://www.staticwebtech.com [3]:http://www.meetup.com/The-New-Dynamic


Ok, "static" here means no RDBMS-backed website. But you can still use statically generated JSON resources, produced from a DB once off. These resources can then be "filtered" and "combined" without the need for databases (carefully avoiding the words JOIN or WHERE).

Sounds like a great idea to overcome the need to obsess about connection multiplexing.
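As a sketch of that idea in Python: the JOINs happen once at build time (against a hypothetical SQLite schema), and what gets deployed is just files:

    import json
    import sqlite3
    from pathlib import Path

    def export(db_path="site.db", out="api"):
        """Run the relational queries once, offline, and emit flat JSON resources.
        At request time the server only hands out files: no connections, no pool."""
        conn = sqlite3.connect(db_path)
        Path(out).mkdir(exist_ok=True)
        rows = conn.execute(
            "SELECT p.id, p.title, a.name"
            " FROM posts p JOIN authors a ON a.id = p.author_id")
        for post_id, title, author in rows:
            resource = {"id": post_id, "title": title, "author": author}
            Path(out, f"post-{post_id}.json").write_text(json.dumps(resource))
        conn.close()

    export()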


"Statically generated JSON resources" implies client-side rendering, and I suspect you'd find some disagreement over whether that's really static. If there's only one such resource per page, then maybe, though you'd still be losing some advantages wrt caching. If the JS running on the client has to fetch many such resources and stitch them together, then it's functionally equivalent to a standard dynamic website and shares many of its drawbacks. Sure, you distribute some of the top-level logic to clients, but the server still has to find each of those individual resources and that's basically the same load as a DB (if not worse as files).

The reason many people (including me) have gone to static generators is not just the static part but the generator part as well. It's not just static but preprocessed, no further operations necessary except to deliver dead bits. That captures all of the caching, security, and other advantages in a way that hybrid approaches tend not to.


> "Statically generated JSON resources" implies client-side rendering, and I suspect you'd find some disagreement over whether that's really static.

> If the JS running on the client has to fetch many such resources and stitch them together, then it's functionally equivalent to a standard dynamic website and shares many of its drawbacks.

Whilst you're talking about extremes (e.g. rendering a page using Javascript), it seems to me that the most abundant use for AJAX-style approaches on static sites is when there are a few "value added" parts which rely on a DB. For example, a static blog with Disqus comments.

As other commenters have noted, the ability to push bits and pieces of functionality into JS has tipped the balance in many cases, e.g. from "if you want comments, it'll all have to be done in PHP" to "there's no point rendering this on demand, we can do the dynamic bits in JS".


How do you figure that loading a static file is the same as a DB query? Even if the actual finding of the bits on the hard drive and sending them across the wire is close to the same, you don't need a database running. You get to shut down an entire process, or depending on your architecture, an entire server.

Heck, once CDNs and caching get involved, I wonder how many requests will even hit your HTTP process at all.


Loading the set of files needed to satisfy a request is equivalent to a database query. Databases and file systems have to deal with similar issues when looking up a single item, so there's a clear correspondence there. The difference is that a database can do a JOIN but a file system can't (directory traversals really aren't equivalent). In that case any such logic would have to be in the client-side JS, fetching objects and quite likely imposing even more load than an SQL query would have. Being able to shut down that database process is good in a lot of ways (especially security) but reduced total load isn't necessarily one of them. You might just be shifting load to the the web server and file system. That's why I said the generator part is as important as the static part. That solves the result-aggregation problem at edit/compile time instead of request/run time, which is good regardless of whether the back end is a database or a file system.


The loading of static files for HTTP is significantly different from accessing the same data via databases. The biggest difference is the need for a CLI (call-level interface) connection object in DB access. And those are resource-hungry operations (authentication, protocol management, etc.). Most times you have to pool connection objects, which is itself a resource cost.

The DB is truly the scalability bottleneck.


The overhead you're talking about doesn't seem to be in the database itself, but in crappy (possibly language- or framework-specific) interfaces to them. That's an application issue, and I was talking about system issues.


That's the cool tool they used to test the performance of the site:

https://performance.sucuri.net/


You know what I've always wanted, but my searches have led me to believe that it (inexplicably) doesn't exist?

I want a simple site generator. I don't want markdown, I don't want a fancy templating engine. I want some simple templating system that takes in normal HTML and generates pages from simple templates I define. I want to shove in some arbitrary HTML and have it spit out a site using some base templates.

To the best of my knowledge, that doesn't exist. It would be perfect for someone like me who wants to keep a website updated, but doesn't always want to run PHP on the server for something as simple as that.

I implemented a shoddy version of it on my own, but it's far from ideal. I'm pretty astounded there's not a well thought out version of it out there, considering how useful it seems it would be.
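To be concrete about what I mean by "simple", the core of it is roughly this Python sketch (the {{content}} marker and file layout are my own invention):

    from pathlib import Path

    def build(layout="layout.html", src="pages", dst="out"):
        """Wrap each plain-HTML fragment in a base layout. No markdown, no DSL:
        the layout is ordinary HTML containing a single {{content}} marker."""
        shell = Path(layout).read_text()
        Path(dst).mkdir(exist_ok=True)
        for page in Path(src).glob("*.html"):
            Path(dst, page.name).write_text(
                shell.replace("{{content}}", page.read_text()))

    build()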


Have you tried Middleman (mentioned in the article)? I've used it for a few years and found it incredibly simple.

https://middlemanapp.com/

I built a very simple admin-interface with it,

input: https://github.com/lms-io/scormfu-admin/blob/master/source/i...

output: https://github.com/lms-io/scormfu-admin/blob/master/build/in...


That looks exactly like what I was envisioning. Thank you so much!


Jekyll and Middleman have already been mentioned but another is nanoc [http://nanoc.ws/] which I like because it is not so blog-centric. You can get a good overview of how it works relative to what you want just by reading the Rules doc page [http://nanoc.ws/docs/reference/rules/], but essentially it does nothing by default except dump your HTML input into the body of the default layout. I use it for moderately complex as well as dumb I-just-want-a-few-variables-to-stay-updated things, the only downside being multisecond compilation times with many pages.


Jekyll (and probably most other generators) works fine with HTML, you don't have to use markdown.


Can you clarify what you mean by "simple templates"?

It may not be along the lines of what you're hoping for (it may not be simple enough), but I've found Stasis[1] to be a powerful tool for static site generation.

You can write plain old HTML pages and/or fragments, with your desired level of genericity, and then run them through a set of transformations[2] to fill in content, set attributes (e.g. classes, styles, whatever), un/wrap elements, and so on.

[1] https://github.com/magnars/stasis

[2] https://github.com/cgrand/enlive


I wrote something that works like that, called templer:

https://github.com/skx/templer

In the past I wrote all my HTML pages by hand. Then I started just writing the "body"-area, and concatenating a header, footer, and the body together to generate output. After that I started to make more changes.

In the end it grew, as these things do, but at the core templer allows you to put "stuff" into a "layout", and generates the output. It might suit you, or it might not. But there are many similar tools out there.


This is exactly what Dreamweaver templates do. You define the template files with editable areas, then you create individual pages based on the template files. When you change a template, Dreamweaver regenerates all the page files to reflect the change. It's all done in HTML with no weird compilers or parsers needed. SFTP the files into your public webroot and you're done.

This was one of the original "static site generators". And it's still used today, for example by the Smithsonian for some of their museum sites.


A fancy templating engine can be a simple templating engine by just not using all the features.

Static site generation is one of those things where I think Not Invented Here is a legitimate point of view. Your exact use case is never going to match the exact use case of other people, so you should roll your own rather than trying to customize an existing solution.

Plus, writing an SSG is super fun; it's like the most 1960s thing you can do (Input Docs -> Processing -> Output Docs).


Dreamweaver?

People can hate on it all day, but that is exactly what it does. There is a reason it was such a huge tool 15 years ago.


Yep, Dreamweaver has an excellent template engine.


Anyone remember Noah Grey's "Grey Matter"?

It was an inspiration to what became WordPress

https://en.wikipedia.org/wiki/Greymatter_(software)


I do remember Greymatter from when blogging was young; it was Perl-based, and it was mostly replaced by Movable Type, also Perl-based, before WP finally came to dominate the scene. Textpattern was also popular for a while. I believe WordPress derived from ExpressionEngine?


WordPress derived from B2.


Most of the static site generators have serious usability problems. They are simply not usable for many of the users on WordPress today.


Usability problems like what?


> Usability problems like what?

Understanding the command line, a delayed WYSIWYG feedback loop, FTPing/synchronizing files. Some generators don't even tackle "pretty, easy-to-install templates à la WordPress" either.

All surmountable, but I haven't seen an "all in one, easy to use" fix yet.

I'd love it if I could recommend a single client-side app for people to use that did it all. Something like Coda but tailored for beginners?


Frankly, I'm not sure it matters. Give the average user a CMS with a WYSIWYG editor and you soon end up with a hodgepodge of fonts, sizes, eye-jarring colors, general inconsistency, and 2MB images rendered as 48x48 icons.


Yeah, and the point is, they were able to accomplish something...

I'd be surprised if those same users could even manage to create a single, text-only post in most (pre-setup!) static generators.

Never underestimate agency. A working, ugly solution beats the hell out of a better solution that they can't understand.


Like George from sales making a new page or editing a call to action.


I think that static website generators aren't by themselves the next big thing. Mostly it's about how the codebase influences the content of the website, and using the power of existing versioning tools is indeed an advantage. By themselves they are quite limited.

IMO, they are the next big thing if they are contextualised in a microservice structure, which is why I built Monera - http://github.com/Ideabile/monera

What do you think?


Haven't done any benchmarks myself, but I'd be keen to find out if a static site loaded up with JS design elements, a product store, comments and analytics code would load any quicker than a CMS with PHP caching, system caching and microcaching to handle bursts on a lightweight webserver such as Nginx.

The article is frustratingly biased in this regard. Static sites should just play to their strengths, otherwise you probably want a CMS that will act like a static site when it needs to.


It's not so much about performance as it is about security. Imagine having to install only a basic web server, with no database server or X language interpreter. No need to worry about SQL injection and such. And you get the performance boost for free.


Accepted that SQL security is pretty optimal with statics. But how about the security of your data when buying through 3rd-party JS stores, commenting through 3rd-party add-ons, being tracked by 3rd-party snippets, etc.?

Hard to deny that the additional functionality most people end up wanting beyond a simple online journal brings additional security risks, whatever the framework.


I agree with the article and think static web sites are the way to go if the read/write ratio is high and the view is not unique to the user.

However, quote > "The static version is more than six times as fast on average!"

This must be an engineering problem, especially for easily cached content. Serving static web sites does require computation. But the current tools are very well made and optimized for it, which is not the case with most CMS systems.


I've done lots of these kinds of performance tests against all kinds of dynamic sites and higher-end managed hosting services like WPEngine, etc.

Once in a while someone manages to get CDN hosting just right, but it's really rare, and it's not something you can simply automate with a dynamic site (like we can for static sites with netlify). Typically the result is identical to the Smashing Magazine site, often a lot worse. Smashing does a good job of caching at their origin datacenter, but their HTML doesn't get cached at edge nodes. Many other sites do a far worse job of caching at their origin.

It might be true that to some degree it's an engineering issue, but if it's one that hits 95%+ of all sites built with a dynamic approach and can be completely eliminated with a static approach, then obviously it might be better to shift the balance and default to doing things statically instead of reaching for WordPress/Rails/Drupal/whatever for each new site...


I think CDNs are overkill/hype. If you do everything right, all you get is better latency.

If your dynamic site loads slower than a static site, you are probably doing needless database round-trips, redirects, synchronized writes, or HTML rendering.


Yeah, caching HTML directly on a CDN basically only gives you better latency.

..Which in turn gives you better page rank. ..Which gives you more traffic.

But that's it.

..Well besides that it also gives you lower bounce-rate. ..Which means higher conversion. ..Which means higher ROI.

So there's that.

:-)


That is the best-case scenario... but probably a premature optimization. You also have to look at time to first byte, total time, and client rendering time.

If it takes, say, ten seconds to render the site on the client, then a 10ms gain on connection time won't help much.

Test tool: http://www.webpagetest.org/video/

When all CSS, fonts etc. are cached by the browser, there's almost no gain from a CDN.


I am working on an open-source static website generator/manager for college professors to create and manage course webpages. It's actually a desktop app which runs on Windows, Linux & Mac and can push the webpages it creates directly to GitHub Pages or an SFTP server. Do check out the project at https://github.com/navya/Kalam .


"With the maturation of browsers, many features that used to require dynamic code running on a server can be moved entirely to the client. Want comments on your website? Add Disqus, Isso or Facebook comments. Want social integration? Add Twitter or Facebook’s JavaScript widget to your website. Want real-time data updating live on your website? Add a squirt of Firebase. Want search? Add Swiftype. Want to add live chat support? Olark is there. Heck, you can even add an entire store to a static website with Snipcart.

The list goes on and on, as a whole ecosystem of purely browser-based add-ons to websites is emerging. Apart from that, modern web apps built with Ember.js, AngularJS or React are often deployed entirely as static websites and served directly from a CDN with a pure API back end that’s shared between the website’s UI and the mobile client."

--

I'm not sure I understand. It doesn't seem to me that a fully single-page, AJAX web site is truly "static". If much of the utility and content must be paged in via client-side JS calls, that too will contribute to load time and the same problems that are attributed to dynamic document generation. It may be all asynchronous and fancy, but from a UX point of view, the content isn't there until the data's retrieved. How's this any different than arguing for a grid of IFRAMEs?

After all, if your page is a minimal HTML DOM harness for a bunch of JS, can one really be said to have "loaded" the page simply in virtue of having loaded the stub HTML?

Or is this argument based mainly on either the implicit premises that (1) not all the functionality and components are used at once? or (2) that much of any given site's functionality can be off-loaded to third-party components (e.g. Disqus) which can all be loaded in parallel from different network sources?


I host my company's website on AWS CloudFront using my homemade static website publisher, which minifies my JavaScript, CSS and HTML and gzips them before pushing to CloudFront. The pages load fast, but the downside is that making changes to a page, like fixing typos, is not as simple; that's probably an issue with my generator as well as my CloudFront cache configuration. I don't use Jekyll or Hyde or other static publishers because I wanted to write one in Clojure, and I figured I could write one in Clojure faster than it would take me to learn Jekyll etc. You can check out (and critique) my website by visiting https://www.videocloudmanager.com. I run a business video hosting service.
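
The gzip-and-upload step is small in any language; here's a rough Python sketch of the idea using boto3 (not the actual Clojure code, just an illustration; the bucket name and build folder are made up):

    import gzip
    import mimetypes
    import pathlib

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "my-site-bucket"  # hypothetical bucket fronted by CloudFront

    # Gzip every generated file and push it to S3 with headers that let
    # CloudFront serve the compressed body directly.
    for path in pathlib.Path("build").rglob("*"):
        if path.is_dir():
            continue
        key = str(path.relative_to("build"))
        content_type = mimetypes.guess_type(key)[0] or "application/octet-stream"
        s3.put_object(
            Bucket=BUCKET,
            Key=key,
            Body=gzip.compress(path.read_bytes()),
            ContentType=content_type,
            ContentEncoding="gzip",      # browsers decompress transparently
            CacheControl="max-age=300",  # short TTL so typo fixes propagate faster
        )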


That page needs a proofread! Hope it's not causing you to lose conversions there. Nice and snappy though, seems like you did a good job with the generator.


One static site CMS that I know of is: http://www.webhook.com/

It's open source. Design and content are edited collaboratively, and it deploys a static site.

Are there any other CMS systems designed to deploy static sites?


I think you guys are thinking small when thinking about static sites. Changing the content dynamically with JavaScript alone still counts as a static website; that doesn't make the frontend static at all, it's very dynamic! But there's no more need for PHP, Perl, etc.

Best regards


I do something I call Dyna-Static.

I can run a static page off an Apache (or any web server) instance. Just chuck files in /var/www/ where you want them.

Now where it gets interesting is that I use Node-RED to generate the pages, content and all. I want headers? It's a variable. I want ads? It's another variable Google provides. I want chat? Easy (I can do it with Node-RED or a 3rd party). I can bridge that webchat with my own or someone else's IRC room.

Now, I can script it so the pages are updated from the Node-RED server to the web server. They can easily sit on the same box, as Node-RED takes few resources.

And the kicker is that I could get all that done in an hour or so. Check out Node-RED. It really is that amazing.


I think static site generation will simply be a feature of a build tool like Webpack or Gulp, maybe as a plugin, and either way there will be an api for developers. Or it will simply be part of a larger build chain / automation system somehow.

A static site generator doesn't mean there's no backend. A website is called "dynamic" when its operation depends on communicating with a server. The JS logic delivered to the client can range from animation to async HTTP requests.

The distinction between "static site generator" and Webpack / Gulp is very gray. It all depends on what you want to do with your client-side JS logic.


Sadly, I think it will never work for end users. I myself also don't update my blog very often, partly because writing new posts is still too cumbersome (with Jekyll). I am a developer, but I don't use Markdown and Liquid (or whatever it's called) on a daily basis, so I still have to look things up when writing.

Now I tend to think that the best way would be to use a regular CMS (like WordPress) and mirror its output as a static site. That way, the dynamic part the end user uses for creating content could sit behind a secure login wall, and the public site would just be static.

As for comments, I guess they just won't be supported.


I think the simplest static site generator is a command that visits all of your routes and saves the response html into a `build/` folder.

That way you can use whichever framework/stack/templating/database you're already familiar and productive with, and in the end you're just deploying a static build folder from localhost.

I started doing this when it came down to hacking Jekyll to implement something that's trivial to do in a microframework, so I went with microframework + page caching. I do the build and deploy with a gulp task that I'd like to generalize into a gulp module.
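
With Flask, for instance, the whole "visit every route and save it" step can look something like this (just a sketch; it assumes your app object is importable and skips parameterized routes):

    import pathlib

    from myapp import app  # hypothetical module exposing your Flask app

    client = app.test_client()
    for rule in app.url_map.iter_rules():
        # Only plain GET routes; parameterized routes would need a URL list.
        if "GET" not in rule.methods or rule.arguments:
            continue
        html = client.get(rule.rule).data
        out = pathlib.Path("build") / rule.rule.lstrip("/") / "index.html"
        out.parent.mkdir(parents=True, exist_ok=True)
        out.write_bytes(html)

Saving each page as route/index.html keeps the URLs clean when the build folder is served by any plain web server.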


I would suggest keeping it simple and scripting wget instead: https://www.gnu.org/software/wget/manual/html_node/Following...

Often all that is needed is -r --no-parent
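
A slightly fuller invocation that tends to work against a locally running site (the port is an assumption):

    # -r: recurse; --no-parent: stay below the start URL
    # --page-requisites: also fetch CSS/JS/images
    # --adjust-extension: save pages with an .html extension
    # --convert-links: rewrite links so the mirror works standalone
    wget -r --no-parent --page-requisites --adjust-extension \
         --convert-links -P build http://localhost:8000/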


To an extent I agree. I think we are just getting better at recognizing the security and performance optimizations that are staring us in the face when content doesn't change, rather than making complex websites completely static the way static site generators do. I think we will see static elements mixed with dynamic elements in a more comprehensive way.


I host tens of more or less static sites (contact forms being the most dynamic elements) which are generated on the spot from one PHP (Laravel) installation.

Anyone know the best way to cache the HTML/CSS statically to serve?


Partition your template parameters into request-dependent and request-independent sets, then memoize, with persistence, the instantiation of the request-independent templates. "Static" is then just the special case of no request-dependent template parameters.
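
A toy illustration in Python (the template strings and names are hypothetical stand-ins for a real engine):

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def render_static_part(page_id: str) -> str:
        # Expensive, request-independent work: DB queries, markdown, layout.
        # Memoized, so it runs once per page until the cache is cleared.
        return (f"<html><body><article>content of {page_id}</article>"
                "{user_box}</body></html>")

    def render_page(page_id: str, user_name: str) -> str:
        # Cheap per-request step: substitute the request-dependent parameters.
        return render_static_part(page_id).replace(
            "{user_box}", f"<p>Hi, {user_name}</p>")

    # Fully "static" is the special case with no request-dependent
    # parameters: render_static_part(page_id) is already the final HTML.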

I think I'll write an article: "Pure functions with memoization are the next big thing!" Except they're not new; they've been around for decades. The only difference is that the Web 2.0 uberkids haven't "discovered" the concepts yet.


... Next Big Thing For Minimally Dynamic Sites

If your content doesn't change frequently and/or the cost of regenerating the static content is low for you, great.

At what point do we see static sites take a fair share of the top-X-trafficked sites? Top 100? 1000? 1,000,000?

This is probably great for a small corp's info site... but then the client asks for a contact form or members/admin secured area, and there we go down the rabbit hole again.


> At what point do we see static sites take a fair share of the top-X-trafficked sites? Top 100? 1000? 1,000,000?

Honestly, most media sites could (and probably should) be static. Think of Time, or Cracked, or CNN: a lot of content, which could be regenerated once and viewed by millions of people per regeneration. Comments could be grafted in with JavaScript (which would suit me just fine, since I don't read such sites for the comments anyway).

> This is probably great for a small corp's info site... but then the client asks for a contact form or members/admin secured area, and there we go down the rabbit hole again.

It's not an all-or-nothing thing; a web server can serve both static and dynamic content, after all.
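
For instance, a small app can serve the prebuilt pages and keep one dynamic endpoint for the form (a sketch using Flask; the folder name and mail helper are made up):

    from flask import Flask, request, send_from_directory

    app = Flask(__name__)

    def send_email(message: str) -> None:
        # Hypothetical stand-in for a real mailer.
        print("would send:", message)

    @app.route("/contact", methods=["POST"])
    def contact():
        # The single dynamic bit; everything else is plain files.
        send_email(request.form["message"])
        return "Thanks!"

    @app.route("/", defaults={"path": "index.html"})
    @app.route("/<path:path>")
    def static_site(path):
        return send_from_directory("build", path)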


I tend to think of it in terms of ratio. Of course, very few sites are 100% static, but a site with thousands of static pages and a contact form that posts to a php script is still a 'static site' in my book. You could also define it on a request method basis, since it doesn't make sense to POST to a plain html file.


At netlify we're seeing more and more large projects being built with this approach.

Some have more than 10k pages, search functionality, internationalization and large content teams behind them. Expect some interesting case studies :)


Think of web app architectures like data structures. Most websites should be optimized for reading, not writing. Static site generators are heavily optimized for reading; WordPress sites, on the other hand, are more optimized for writing. It makes sense that most websites should choose static site generators - it's just a better "data structure" for the need.


My favorite kind of static website generator is lazy, just-in-time, and supports TTL-based regeneration at the per-page level.
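
In sketch form (Python; the render function is a hypothetical stand-in for the expensive generation step):

    import time

    cache = {}  # path -> (html, expiry timestamp)
    TTLS = {"/": 60, "/news": 30}  # per-page TTLs in seconds
    DEFAULT_TTL = 300

    def render_page(path: str) -> str:
        # Stand-in for the expensive part: templates, DB queries, etc.
        return f"<html><body>page for {path}</body></html>"

    def get_page(path: str) -> str:
        # Lazy and just-in-time: a page is only (re)generated when it is
        # requested and its cached copy has expired.
        cached = cache.get(path)
        now = time.monotonic()
        if cached is None or now >= cached[1]:
            html = render_page(path)
            cache[path] = (html, now + TTLS.get(path, DEFAULT_TTL))
            return html
        return cached[0]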


I'd love to use static site generators for my client work, but they usually ask for features like e-commerce that completely rule out their use.

Usability is also an issue. WordPress is a far friendlier environment for them to make changes or create a new post in than creating a text file with specific formatting and running a script.


There are services such as Snipcart and FoxyCart that you can trigger via HTML and JavaScript. Like most non-locally-hosted e-commerce solutions, they send you to a subdomain for the final checkout. I have tested both on a static site and both work very well. Snipcart is easier to implement but lacks customization. If you are looking for heavy customization (like custom products beyond just sizes, colors and options) without having to build out your own system, check out FoxyCart.


I'm currently (slowly) working on a static blog that uses only HTML, JS, and CSS. I liked Jekyll, but wanted something with no backend technology requirement:

https://github.com/milge/lilblog


Does this mean that developers and admins will get paid less, because their jobs just got a lot simpler?


I use Middleman as a static site generator for a page hosted on my Raspberry Pi: http://pi.tafkas.net

As I am more of a Python guy, I wonder if there is a similar (as in not primarily for blogging) generator for Python?


Have you tried Pelican? Staticgen.com lists all of them, and it tells you what language each one uses.


I have looked at Pelican, but isn't that more for a blog (chronological updates vs. static, non-changing content)?


I find that Pelican itself can do the non-changing stuff, but it is really hard to find themes or examples that aren't blog-oriented.

I'd be keen to hear about the kinds of things Middleman etc. do better than Pelican for non-blog sites, though...


I have been doing this since 1998 with Perl and cron for my own systems, and I can say it works great. Combined with a RAM disk, I can handle the load with a couple of small machines that would typically require a big farm if everything were dynamic.


I'm currently building my own static site CMS: basically a WordPress-like interface (at some point) that spits out a static site. My blog http://adnanissadeen.com runs on it. I'm building the CMS over at https://github.com/spartakode/static-cms . It's been inactive for a while because I'm just a horrible procrastinator. I'll hopefully finish it up (read: make it usable by the public) by the end of this year, and will try to offer a hosted service a little later.


Considering that a lot of content seems to be increasingly generated and handled via JavaScript, what are some of the downsides to statically generating a website or web app?


I remember a custom CMS written in Perl Mason that we used at Jupitermedia (formerly internet.com) back in 2004. It worked well but had some complexity.


How many more articles about static sites being the "Next Big Thing" before they just become "The Big Thing"?


A bit of web history, a mention of some static site generators, and an ending that says their tool is the Babel fish of static site generation.


Movable Type, anyone? It's older than WordPress.


Movable Type was a self-hosted static site generator. Blogger too, although not self-hosted.


just an ad for Netlify


I'd be happy with a WordPress plugin that rendered static pages. 95% of the client sites I've ever done have little to no reason to be dynamic. I know there are valid reasons for a dynamic site, but IMHO those are rare.


Next to geosites.com, I suppose.



