Since discovering Netlify, I'm kind of thinking "why bother". It's free, I just push to my repo and they take care of all the building/publishing/hosting/CDNs, and they're very responsive for support and have high availability. I'm a very happy customer (or rather leech, as I don't pay anything).
I use Firebase with Cloud Functions for form handling and other data related tasks (like caching API calls). I'm loving the integration between the two.
Firebase looks really neat but it’s a shame you can’t modify the strict-transport-security header.
I have a preloaded domain that would no longer be eligible for the HSTS preload list if I were to use Firebase, since there's no way to include the includeSubDomains and preload tokens (Chrome would keep my entry around for now, but Firefox and other list consumers would prune it). :/
I totally agree. I just started hosting all of my static sites on Netlify, and it's so much better than the custom deploy script I had written. I registered a domain with Namecheap, and was hosting a simple index.html on Netlify within 5 minutes (including SSL).
I am a recent convert to Netlify and I love it. A++. I have never had SSL on my personal website because doing so is such a hassle; not only does everything Just Work, but getting that set up was trivial.
I have a shell script in my Jekyll repo that does an s3 sync and invalidates the CloudFront distribution. Why do the extra step with a Lambda script that also needs to be maintained?
Please know that GitLab also comes with GitLab Pages. That being said, if you're comfortable using a SaaS service, Netlify does have a smoother user experience.
Checking my monitoring logs I have 26 minutes of downtime attributable to Netlify since 03 March 2017. That's about 99.994% uptime.
If I attribute every single outage, that's 43 minutes since 03 March 2017. Some of those were not Netlify errors, but even if we are uncharitable/inaccurate and attribute them all to Netlify, that works out to 99.98% uptime.
Not bad for a free service.
They're also very transparent about service issues.
And support is very responsive. I recently talked to them about deprecating TLS 1.0 and 1.1 for instance, or providing the option to force TLS 1.2 if users desired. They were quick to respond and helpful.
So if you are seeing unusual issues, get in touch with them, even if you're a free user they'll still talk to you.
Another happy Netlify customer here. I work as a web dev so not only do I run several of my own sites there but I manage production clients' websites on Netlify as well. Couldn't be more pleased with the service!
Yeah, it's pretty good, and they give you free concatenation of assets. Not really necessary with HTTP/2, but it's nice that I don't have to bother configuring my site generator with webpack.
What a nightmare. I'm sure there are use cases for a setup like this, but this is not the system I'd like to maintain. I use Jekyll because of its simplicity. I edit my site in my favorite text editor and rsync to shared hosting.
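The whole deploy is something like this (the host and paths are placeholders for my shared-hosting account):

    # build the site, then mirror the output to the web root
    jekyll build
    rsync -avz --delete _site/ user@sharedhost:~/public_html/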
Couldn't agree more. I love tinkering around with things and building setups just to learn, but for something like a personal webpage I just want simple.
I use hugo and then set up my output folder (where the generated output goes) as a git repo. Generate the site -> git commit/push -> Caddy. Caddy has a feature to pull in content from a git repo, so when you couple that with the built-in Let's Encrypt support it makes it dead simple.
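The publish loop ends up being roughly this (paths and branch are placeholders, and it assumes Caddy was built with the git plugin):

    # regenerate the site, then push the output repo; Caddy's git
    # plugin pulls it on the server on its next poll
    hugo -s ~/blog -d ~/blog/public
    cd ~/blog/public
    git add -A
    git commit -m "publish $(date -u +%F)"
    git push origin master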
I also use hugo. In my setup, though, I have everything in one git repo - config files, content files, theme, output, etc. Nginx exposes its output directory.
I stopped using Caddy when I discovered that it didn't support one of the unusual TLDs I had. Maybe I should give it another go, though. That was over a year ago.
What do you mean it didn't support a TLD? It's a server, how does it need to "support" TLDs? In my experience, you just specify the hostname and it works.
I use a very similar system for static sites that I maintain. I edit my site in my favorite text editor and run 'git push'. :)
It was a bit of work to get everything running, but there's very little to actually maintain afterward. I'm definitely going to start replicating the setup elsewhere (including my own homepage, which is currently down due to its VPS having failed hard and me not having enough time to rebuild it).
Agreed on this front. GitHub Pages is super simple with Jekyll. Literally push to your branch and it's deployed. This seems a bit overkill. Still interesting, though.
I deploy my websites on my own VPS at DigitalOcean, but with a simpler setup than the one in TFA.
I simply have a Travis-ci.org setup that does an "rsync" after "jekyll build". And in case I'm not in the mood for waiting on Travis, I simply "rsync" from my localhost after build.
Having an automated build system has advantages though - if you get a PR on your website repository with typos and so on, you just have to merge it and the content will get published, so you can do it from your phone. And yes, I have had PRs, since I publish 2 project documentation websites this way.
Also folks, you don't need a CDN or Cloudflare, or any of that; you just need a healthy Nginx setup (sketched below) hosted at a decent VPS provider. I've had my websites withstand Reddit- and HN-level traffic just fine, paying $5 per month for hosting about 4 static websites, plus other stuff.
I also hate Medium, Blogger, WordPress and any of that crap, I hate their bloat and trackers, and I do think having your own website published from a Git repository is worth it. Yes, there is a cost in maintaining my websites, but I do so willingly, because they are mine.
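For reference, the healthy Nginx setup I mean is nothing fancy. A minimal sketch, with placeholder domain and paths:

    # drop in a vhost that just serves files with gzip and cache headers
    sudo tee /etc/nginx/conf.d/example.com.conf >/dev/null <<'EOF'
    server {
        listen 80;
        server_name example.com;
        root /var/www/example.com;
        index index.html;
        gzip on;
        expires 1h;
    }
    EOF
    sudo nginx -t && sudo systemctl reload nginx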
That's pretty much what I ended up doing as well, albeit with my own static site builder scripts. Each time I build my site[1], the deployment script is just a directory rsync. No mess, no fuss.
You are also right about not needing a CDN. My site has occasionally become momentarily popular and my $5 hosting VM hasn't even blinked on my completely static site. A database is a fine thing, but you don't want to be serving web pages out of one. That's why I finally ditched WordPress.
I use a git hook on the server to rebuild on push. I do the same for a few configuration files (the hook takes care of copying the file and reloading the daemon). If you're the only user, it's perhaps the simplest (and fastest) option, although it does mean you need a git client to update stuff. Thankfully, there's SGit for Android.
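The hook itself is only a few lines. A rough sketch, assuming a bare repo on the server with Jekyll installed, and placeholder paths:

    #!/bin/sh
    # post-receive: check out the pushed tree, then rebuild into the web root
    GIT_WORK_TREE=/srv/site-src git checkout -f master
    cd /srv/site-src || exit 1
    jekyll build --destination /var/www/site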
I build an nginx Docker image that serves the Jekyll output and push that. I have an Ansible docker_container task that pulls and runs it. Done, couldn't be happier.
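For anyone curious, the push side is only a sketch like this (the image name is a placeholder, and it assumes a Dockerfile that just copies _site/ into nginx's web root):

    # build the static output, bake it into the image, ship it
    jekyll build
    docker build -t registry.example.com/blog:latest .
    docker push registry.example.com/blog:latest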
Cool! I could see why this would work well for teams and collaboration.
Here's my hugo+aws pipeline for my personal blog.
I do my editing with hugo in server mode so I can see the rendered pages as I edit. Then I run a bash script:
    #!/bin/bash
    # build site from markdown + template
    hugo -s ~/sitedir
    # sync the output to the S3 bucket (a file storage service);
    # s3 sync is recursive by default, no flag needed
    aws s3 sync ~/sitedir/public s3://sitedirbucket
    # invalidate the CDN distribution so content delivery is nice and fresh!
    # (quote the path so the shell doesn't expand the glob)
    aws cloudfront create-invalidation --distribution-id XXXXXXXXXX --paths "/*"
    echo "All done"
I like the CodeBuild solution for the times when I'm editing on my phone or a shared computer. I push to GitHub, and CodeBuild handles:
* build (as above, plus asset processing and minification)
* deploy (s3 sync, plus some fiddling to add 301 redirects; see the sketch below)
* ping search engines
I keep a lot of drafts and temporary notes in my local checkouts and doing build/deploy on a fresh checkout helps to ensure they don't slip onto the public website.
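The deploy step boils down to something like this (bucket, keys, and sitemap URL are all placeholders; the 301s use S3's website-redirect object metadata):

    # mirror the built site, dropping anything deleted locally
    aws s3 sync public/ s3://example-bucket --delete
    # a 301: an empty object whose metadata points at the new location
    aws s3api put-object --bucket example-bucket --key old-post \
        --website-redirect-location /new-post/
    # tell the search engines the sitemap changed
    curl -s "https://www.google.com/ping?sitemap=https://example.com/sitemap.xml"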
To refresh the crawled cache the search engines keep. I've never given it much thought; I guess it would be refreshed when queried anyway. I will add it to my bash script for funzies.
Interesting setup. I found myself looking for a static site deployer a few months back and decided to build my static website with hugo [1] and deploy it from my private GitHub repository with the help of github-bucket [2]. So far it is working fine, and I could move away from GitHub, or even AWS, with just a few modifications.
The Cloud9 setup is really nice for online page editing and for compiling the static pages. I am not quite sure if I would really use it in my current workflow, though.
Can someone help explain to me the appeal of static site generators? Functionally it's still a dynamically generated site; it's just that cache population/invalidation has turned into a user-controlled, manual step.
Is this because of people wanting to host on static-asset-only servers (GitHub Pages, S3 Website, etc.), or is there some other benefit over simply using any standard blogging software? If it's a question of speed, that's what caching does.
There are quite a few reasons, if we compare 'self-hosted' setups.
It's simpler on the server, as you say: you can serve the files from pretty much anywhere, and you need fewer resources to do so. I appreciate the cache idea. I used to do it myself with WordPress, and still do with MyBB, but it's imperfect and there are always misses, especially if someone is actively cache-busting you to DoS your site.
Much less hassle to set up. If you're going to do it 'properly' you're going to want to run your dynamic site in a chroot jail, run the PHP process under a unique user per site (especially with nginx), set up unique database users and databases per site, secure your credentials, have version control and update functionality, and on and on. There's a lot to do. You can automate it (I have) but it's still annoying and requires monitoring.
You can move almost anywhere almost instantly, with just a git push/rsync and a DNS change.
Hugely reduced attack surface. It's literally a collection of text files.
The benefits are somewhat reduced if you get someone else to manage your hosting for you, but it remains simpler to move and usually cheaper to host as you need no database service.
Regarding the cache-busting comment: that's purely a configuration thing. You'd just make your cache objects never expire and fully warm the cache yourself (again, via some sort of plugin). WP itself wouldn't be visible to the client; it'd effectively be a static site.
Anyway, that aside: So it's literally just the desire to have a zero-footprint blog. I can appreciate that notion but I'm surprised this trend came about with brand new tools as opposed to just packaging up the output of existing blog generators.
Sounds like you should build a WP plugin that generates a static/exportable site. You know the space and clearly understand the market dynamics.
> In terms of cache busting comment, that's purely a configuration thing
As long as you don't have functionality that relies on it, though, like search. I've seen that used to DoS a site before. And pingbacks. Sometimes even comments (though to be fair, you can turn those off for parity with a static site).
Plus you'd probably want to do the caching up a layer and not rely on a plugin (maybe Varnish or nginx's FastCGI cache), which adds complexity, cache invalidation, etc. W3TC and its ilk are good, but to get the most from them you need good control over the server environment, especially for the object store, and you probably want to integrate them with a Varnish server or something anyway. And before that, even just to run the site you'll need to tune PHP and MySQL. Not to mention that 'fully' warming the cache on a large dynamic site can take quite a while, if it can be done at all.
Something like Hugo can build a page in around a millisecond, so even thousands of pages take only seconds. That kind of performance just can't be found in a dynamic site, so warming the cache will always take longer than generating a static site like that.
I guess there's just more to be aware of.
> So it's literally just the desire to have a zero-footprint blog.
I don't disagree but I'd say it probably goes further than that. It's just so simple to go static. If you really dig into hosting a dynamic site there is a lot to do to make it work well under most conditions.
> Sounds like you should build a WP plugin that generates a static/exportable site.
There are actually a few good ones out there. I just used one to archive a site. Wget was flaking out converting srcset URLs (even when I compiled the 1.19 branch, which was supposed to fix it), so I used a plugin to export the site.
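For context, this is roughly the kind of wget mirror invocation that was flaking out on srcset (the URL is a placeholder):

    # mirror the site, rewriting links for offline/static browsing
    wget --mirror --page-requisites --convert-links --adjust-extension \
        https://example.com/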
Overall, I don't really promote one over the other; they're tools at the end of the day, and if one works for your workflow then it's the best one!
Version control matters, too. Ever seen a whole bunch of WordPress content lost forever due to some sort of site failure? I have. Editors matter, too: I'd rather use a good professional editor like Sublime for large-scale writing than some wonky web-page thing.
Web sites are software products. If you think like a coder, you want to run them like a coder. If I were just a writer, I'd probably think very differently.
I just created a website for a friend's construction business. Gatsby on S3 with CloudFront.
A new thing for me was using GitLab CI/CD. I taught the 'customer' how to edit on the GitLab website and do merges. Now changes are deployed automagically without needing me to get involved.
Best part: no WordPress databases I need to worry about!
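The CI job itself only runs a handful of commands; roughly this, with the bucket and distribution ID as placeholders:

    # build the Gatsby site and ship it
    npm ci
    npx gatsby build
    aws s3 sync public/ s3://example-bucket --delete
    aws cloudfront create-invalidation --distribution-id EXXXXXXXXXX --paths "/*"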
Love publishing (static) websites with Jekyll. If you're looking for more free (open source) ready-to-use (forkable) Jekyll themes, may I highlight the Dr. Jekyll themes listing/directory [1] - a Jekyll site itself ;-). 200+ themes and counting.
[1] https://drjekyllthemes.github.io
Does anyone here have an objection to using GitHub Pages? You can just point your domain to your GitHub repo, it's HTTPS by default, and from what I can tell it's free to host on public repos.
Maybe I’m unaware of potential issues for a static site?
GitHub Pages is missing a couple of features, namely:
* Build systems besides Jekyll
* HTTPS for custom domains
...but the second problem can easily be solved with Cloudflare, and the first one with a git subtree push. Beyond this it's pretty fully featured.
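For the build-systems part, the trick is just to build locally with whatever generator you like and push the output as a subtree; roughly (dist/ is a placeholder for your output directory):

    # publish the built output directory as the gh-pages branch
    git subtree push --prefix dist origin gh-pages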
Standard disclaimer: with the 'Flexible SSL' feature you're talking about, Cloudflare only secures the communication between the visitor and Cloudflare's network, not between Cloudflare and GitHub Pages on a custom domain.
So an attacker can still alter/intercept content between GitHub Pages and Cloudflare before it gets to the visitor.
To some, the illusion of security might be considered more harmful than knowing you have none at all.
For an alternative, GitLab Pages offers HTTPS on custom domains, provisioned by Let's Encrypt I believe.
Netlify does something similar.
Both alternatives also allow any build system you configure.
I would like to know why GitHub doesn't do something similar to Netlify and GitLab. Surely it can't be that hard if somewhat smaller companies have already implemented it?
Also, if you're using Cloudflare on a static site that collects no form data and only has links to external websites, would it matter so much, or is it just as important to remember the potential harms?
I'm not sure why GitHub doesn't offer TLS on custom domains. They may have some limitation in how the Pages system is built that means reworking it to introduce this function would be prohibitively costly at the moment.
The answer to the second question is "I guess it depends".
In one way I think it's more about perception: https should be https, and https should be secure. Not https sort of halfway along the connection, then clear and insecure for the rest of the trip over the public internet.
That's what a lot of people have trouble with over Cloudflare's particular popularisation of this 'broken' https model.
Also, all the data hoovered up by the NSA et al. puts a picture together, maybe about a person, their habits, what sites and content they read, etc. Thanks to SNI, HTTPS will likely leak the domain anyway, but other than that it'll secure the rest of the info.
And what if (like I do on my site) I share a PGP key fingerprint? What if that's modified over the insecure portion of the connection? Now any communication by that route might be compromised.
I get that it can be seen as pedantic, but all steps in the connection as a whole need to be secure if https is to remain trusted.
I suppose overall the push is (and should be) towards default encryption and privacy for the visitor. That's something I'd support at least.
I've recently set up a similar system for my personal site (Middleman) but opted to use CodePipeline; this way I don't need to pay for the EC2 instance.
Looks like a private SSH key that allows pushing to GitHub (and triggering the automation that eventually publishes everything) needs to live on Cloud9 to make this happen.
Sounds "just fine" for your own personal blog, sure…