Google Page Speed sometimes gives bad advice that can actually slow down page load. I just focus on load time and the experience of the page loading. I sort of wish Google would kill the tool off or give it more context. Instead they just have a disclaimer at the bottom that says it's a recommendation and doesn't take the experience into account... but try explaining that to a client. Any static site is going to load fast though; it's when you have to make database and API calls that things get hairy.
If a client wants a good page speed score, you can explain why optimizing for that may negatively impact other things. If they still want it, why not just do it and charge them for the hours? You could also give them a different tool to try, like webpagetest or gtmetrix.
Clients have heard that load time affects SEO ranking. Since this is Google's official tool, the client believes the score here affects their rankings, so they need to get 100 by any means.
Technically it doesn't. For SEO ranking it's one of a hundred factors, and even if you max it out, it's a rather small one:
https://searchengineland.com/seotable
Talk about diminishing returns here.
Source: We've got an Angular 1 SPA as a website that, despite maxing out in the 70s for page speed (because we can't get rid of 700k of blocking JS), still ranks extremely well, because we do all the rest right.
Google actually doesn't use their PageSpeed score -at all- in search rankings. They may factor in speed, but it's not the PageSpeed score. Moz did a study on this a year or two ago and could find no correlation. There was a correlation with TTFB, though.
A very basic example: if you have a large site, you can raise your Google Page Speed score by inlining all CSS and JS, but that would be foolish because it doesn't take into account the next page load and caching. I've seen some render-blocking "optimizations" slow the overall page down too. Getting rid of query strings is nice, but sometimes you need them for cache busting and other things, and keeping them penalizes your score. Even Google's own Tag Manager/Analytics scripts lower your page speed score when they are on the site.
>> Even Google's own Tag Manager/Analytics scripts lower your page speed score when they are on the site.
I never understood why Google doesn't fix this. I want to use Google Analytics, but their own tool penalizes me for it because it's improperly cached on Google's own servers? That's just stupid.
It's just a tool with an arbitrary score - why does it matter? Most of the recommendations are outdated by now. Focus on the overall experience, and there are plenty of RUM monitoring scripts you can use (that take advantage of the very fine-grained performance timings available in browsers today).
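You don't even need a third-party script to get started; here's a minimal RUM sketch in TypeScript using the Navigation Timing Level 2 API (the /rum collector endpoint is hypothetical, point it at whatever you actually use):

    // Collect a few coarse timings after load and beacon them to a collector.
    window.addEventListener("load", () => {
      const entries = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
      const nav = entries[0];
      if (!nav) return;
      const metrics = {
        ttfb: nav.responseStart - nav.requestStart, // time to first byte
        domInteractive: nav.domInteractive,         // parser finished, DOM usable
        loadEventEnd: nav.loadEventEnd,             // full load event
        transferSize: nav.transferSize,             // bytes over the wire
      };
      // sendBeacon is fire-and-forget and never blocks the UI thread.
      navigator.sendBeacon("/rum", JSON.stringify(metrics));
    });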
Inlining assets is even less important now, and more of an over-optimization, as HTTP/2 multiplexing takes over. It's like threading: it used to be expensive and a sign of bad code, whereas now, since everyone has cores to spare, it's a sign of good algorithm use.
No, I understand HTTP2 and the advantages server push brings (on paper). When you consider its inconsistent implementation across browser vendors, and that it really only works best in scenarios with optimal connection quality, I'd say it's bad advice to tell people to stop bundling JavaScript, especially in today's world where an average codebase is likely to have a ton of modules.
>> Any static site is going to load fast though; it's when you have to make database and API calls that things get hairy
This is incorrect. Static assets are often the source of many problems: poor caching headers, missing gzip encoding, unoptimized images, etc. The time it takes the backend to return a response will usually yield larger overall savings in load time, but it's astounding just how inefficient frontend choices can be. It's not "micro-optimizing" to focus on improving the frontend instead of only the backend.
That doesn't really relate to what I just said. I'm assuming you're using compression and have optimized your images. But if you took a page, saved it as a static HTML document and compared it to the "live" version, the static HTML document would outperform it virtually every time, even on a slower server. That is essentially what this script is doing, and what people do with tools like Jekyll.
Not all types of static files slow down the loading of a site. JS scripts block parsing of the page. Web fonts block text rendering (and they are usually heavy). CSS files sometimes block rendering and sometimes don't. But images don't block rendering: even before the images load, the page is readable and usable.
Nothing stopping you from explaining your opinion as an expert on the tech, and if it's extra time (==money) also communicating that. If that is ignored and you're getting paid for your time, that's the best you can do.
This is usually what happens, though in my case my boss (Lead Dev) is already telling them what isn't possible and what doesn't make sense. Some clients just insist for months until we finally give them what they want and they see we're not talking out of our rears. But yeah I understand all that, it's just silly to me to waste time and money doing things that will not be useful for the end-product. I guess I'll be numb to the silliness eventually...
Some of those optimizations become irrelevant (or downright bad) with HTTP/2. Inlining CSS and JS is no longer best practice when you can server-push them instead and avoid sending the extra bytes once the files have been cached by the browser.
Some optimizations can backfire. Adding async to your external JS is a great tip... unless you have dependencies between your scripts (i.e. you need one to be loaded before the other gets executed).
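If you do have that kind of dependency, one workaround is to load the scripts programmatically and chain them yourself; a rough sketch (the file names are made up):

    // Inject a script tag and resolve once the script has executed.
    function loadScript(src: string): Promise<void> {
      return new Promise((resolve, reject) => {
        const s = document.createElement("script");
        s.src = src;
        s.async = true; // doesn't block HTML parsing
        s.onload = () => resolve();
        s.onerror = () => reject(new Error(`failed to load ${src}`));
        document.head.appendChild(s);
      });
    }

    // vendor.js must have executed before app.js runs.
    loadScript("/js/vendor.js")
      .then(() => loadScript("/js/app.js"))
      .catch(console.error);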
Relevant to HTTP/2 push but slightly off topic: I love how I can add a single line of code to my Django project (to include a middleware library I wrote) and get automatic HTTP/2 push. The middleware parses the static media on the page and pushes them to the client; it's much faster with no effort.
That should take approximately as long as a client<->server round trip, right? So shouldn't it be irrelevant for any JS/CSS that loads within that time (say, at 50 Mbps and a 15 ms ping, 50 Mbit/s × 15 ms ≈ 93 KB gzipped)?
Yes, you'd download some things you don't need to. Last I heard, they're working on an extension so the browser can tell the server what media it has when it requests the initial page so they won't get sent at all.
Didn't we all have this conversation years ago already with nocache query parameters etc.? Reminds me of number 6 in RFC 1925: "It's easier to move a problem around [to another protocol layer] than to solve it."
I still don't see how HTTP/2 Push is solving any real-world problems [1] besides Google being able to push ads so that it counts as an impression even if the client blocks ads.
[1] I don't count the slightly faster initial page load since I'm getting that junk on every page load via push even if I already have it, so I'm paying with my data allowance. If you want your site to load fast, just fix your goddamn bloat.
Unfortunately it doesn't. There is no mechanism right now for the browser to cancel the pushed resources.
H2 Push's Cache Digest will introduce this ability, but it's still being designed.
You might be better off working on caching, so that the resources are only loaded the first time. Otherwise you need some way of knowing whether the user already has the resources cached or not (a cookie, maybe?).
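Something along these lines is what I have in mind (Node's built-in http2 module; the cert paths, file name and cookie name are placeholders, and a cookie is of course only a rough proxy for "already cached"):

    import { createSecureServer } from "node:http2";
    import { readFileSync } from "node:fs";

    const server = createSecureServer({
      key: readFileSync("server.key"),   // placeholder cert/key
      cert: readFileSync("server.crt"),
    });

    server.on("stream", (stream, headers) => {
      if (headers[":path"] !== "/") {
        stream.respond({ ":status": 404 });
        stream.end();
        return;
      }

      // Only push the stylesheet if this looks like a first visit.
      const seenBefore = String(headers["cookie"] ?? "").includes("assets=v1");
      if (!seenBefore && stream.pushAllowed) {
        stream.pushStream({ ":path": "/style.css" }, (err, push) => {
          if (err) return;
          push.respondWithFile("./public/style.css", { "content-type": "text/css" });
        });
      }

      stream.respond({
        ":status": 200,
        "content-type": "text/html",
        "set-cookie": "assets=v1; Max-Age=604800", // remember for a week
      });
      stream.end("<html><link rel=\"stylesheet\" href=\"/style.css\"><p>hello</p></html>");
    });

    server.listen(8443);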
Asynchronous JS scripts are a great choice, but you'd better make that choice at an early stage, because changing scripts from sync to async on a large legacy website can require a lot of time and introduce many bugs. You cannot just add the "async" attribute.
That's something that really astounds me about this async thing.
Why on earth, when they introduced it, didn't they add a way to tell the browser the order of execution? Or maybe it exists and I just haven't heard about it?
There is a way: ES6 modules. If you can account for the order of execution (with modules or custom loaders) async is, in almost all cases, the best way to load your scripts. Defer (or body-end) scripts also specify the execution order, but only in a strictly linear fashion which usually has a performance hit.
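Roughly what that looks like in practice (the module and function names here are made up for illustration):

    // main.ts, loaded via <script type="module" src="main.js" async>.
    // Module scripts don't block the parser, and the loader guarantees the
    // imported modules are fetched and evaluated before this code runs.
    import { setupCarousel } from "./carousel.js";
    import { trackPageview } from "./analytics.js";

    setupCarousel(document.querySelector(".carousel"));
    trackPageview(location.pathname);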
We love Middleman for static sites; curious why you chose to go with a server instead of S3+Cloudfront or Netlify. One of the biggest reasons we use static site generators is the ability to CDN the whole thing, making it dirt cheap and blazing fast.
I've actually used Netlify numerous times and love it. The simple answer to why I chose to manage my own server was mostly to learn how and have full control.
For what you get out of the box with this setup (or Google Cloud Storage + Cloud Load Balancing/CDN) it actually can be "dirt cheap", depending on your business goals/traffic. Just like anything, it depends on what you value/what your time is worth. The pricing difference would need to be several (several) orders of magnitude larger than it is for me to worry about the few dollars a month I'm spending on each static site sitting in S3/Cloud Storage.
Also, if there is an issue with either of these, it's far easier for me to justify downtime to a client by saying "Amazon us-east is down, here's Amazon's status page" than trying to explain who OVH is.
You will need to spend time setting up your site whether it's on Amazon or on a cheap DO instance, so I don't see what you save here. You can have your own virtual server with root privileges, iptables and custom software, whereas with Amazon you will have to pay for every new feature.
As I said, it really just comes down to business goals. Setting up a shared DO box with iptables isn't worth my time when it's still just a single DO box at the end of the day.
S3 + Cloudfront + ad hoc Lambdas has a far, far better cost-benefit ratio when it comes to what my time is worth. To each their own :)
Ha. Well, it's more that there are things I can justify charging for and things I cannot. Standing up a sub-par solution for a larger investment of time doesn't make a ton of fiscal sense for me or my clients.
Yeah, indeed; for contract customers it's better to go with something you're confident everybody will already know, since other people will have to manage it when you're no longer there.
At a previous startup I founded and was CTO of, I used Gentoo and managed everything possible myself. It wasn't especially costly in time because I was already used to it and efficient, and that way we managed to stay at 70€/mo in infrastructure costs for 3 years. This has its advantages too :)
It's a shame that you don't get into the technical details. As it stands, the article is basically the same as reading the recommendations of Google Page Speed itself.
Thanks for getting back to my comment. Reading back my comment I might have been a bit harsh; I am sorry about that. The compression and caching bits were very detailed, thanks for that.
What I was most interested in technically is the part about 'Prioritize visible content'. You talk about "loading third party scripts later, and generally keeping the above-the-fold content small and fast". From this I take it that you do not load images below the fold, etc. I was wondering what you do technically to make this happen. Also, how would you load fonts after the first paint? And, from what I remember, when you inline your CSS you make it impossible to cache the CSS separately from the HTML. What would you do when you want a quick first paint, but also want to leverage having the CSS in its own file?
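For context, the kind of thing I have in mind for below-the-fold images is an IntersectionObserver-based lazy loader; a sketch (the data-src attribute and margin are just examples):

    // Swap in the real image source only when the placeholder nears the viewport.
    const lazyImages = document.querySelectorAll<HTMLImageElement>("img[data-src]");

    const observer = new IntersectionObserver((entries, obs) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        const img = entry.target as HTMLImageElement;
        img.src = img.dataset.src!;   // data-src holds the deferred URL
        obs.unobserve(img);
      }
    }, { rootMargin: "200px" });      // start loading shortly before it scrolls in

    lazyImages.forEach((img) => observer.observe(img));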
One odd thing I bumped into: the Google speed suggestions "complaining" about leveraging browser caching, but reporting Google Analytics as the offender :-/
This has been the case for what feels like forever now, and Google have on several occasions clarified they have no intention of changing the cache time on these files to "fix" the Page Speed score result for sites that use Google Analytics.
I personally find that most of the workarounds for this are usually more technically offensive than living with a marginally lower score on this often somewhat arbitrary test.
Has anyone else had trouble using pagespeed insights recently? I think they changed something about their API, every site I try to test gives me "The referrer https://developers.google.com/ does not match the referrer restrictions configured on your API key. Please use the API Console to update your key restrictions."
But loading external fonts does take additional time. So saying "oh yeah, I have GA and fonts and I'm still at 100" sounds disingenuous, because it could still be faster.
Context is key. Nobody should be shooting for a perfect 100 in any situation where a barebones static HTML document won't suffice. So if you have to make DB calls to build the HTML document, or are reasonably delivering a nice user experience (e.g., a custom font), then you shouldn't expect to score 100.
With that being said, you are still doing great for what you are delivering if it's non-static content with fonts and can get to at least 80 / 100.
Who really cares about scoring 100/100? Is there a top 1000 site that's even in the low 90s? I love optimizing pages for speed, but hitting 100/100 isn't going to produce some magical jump in organic rankings just because of a decrease in page load time either.
People care when they have paid you to optimize their site and they only get a score of 86/100.
Trying to explain that the page speed tool is just a 'rough guide', and that a website always has to be some kind of compromise between speed, functionality and design, falls on deaf ears because 'google' is, well, 'google'.
I agree, but like everything with optimization and refactoring, the cost of moving up towards 100 offers less ROI in the end. I think if you're at 90 that's solid; I have no desire to be in the 100 club, nor would I tell a client they need to be there.
I was at a top 1000 site and the senior product manager offered a free dinner to the first team to hit certain metrics (like PageSpeed). He knew 100/100 wasn't useful, but the difference between 50 and 90 was fairly significant for us. Are you chasing a number, or what the number means? The latter is a good thing.
Yeah, my casual survey/experience indicates that optimizing beyond the high 80s isn't going to affect search ranking much. I'm obsessive about optimizing page speed too, but maxing out their score requires some gimmicky (IMO) techniques that don't really affect speed/UX.
Does anybody know how much the actual load time factors into the "PageSpeed"? In my experience, you can have a site that loads quickly but gets a mediocre score because it violates Google's guidelines.
It really depends on the specific site. The gold standard is to open up the network tab of your browser's debugger and dig into the runPagespeed API call that gets used behind the scenes. Specifically, dig down to .formattedResults.ruleResults and look at the ruleImpact of each category. It's not an exact correspondence to points taken off a score out of 100, but it's roughly proportional. That is, if you have one rule with a ruleImpact of 20 and another with a ruleImpact of 10, fixing the former will have roughly twice as much positive effect on your overall score.
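If you'd rather not dig through the network tab by hand, you can call the API directly and sort the rules yourself; a rough sketch (the v2 endpoint is from memory, so double-check it, and the response shape is as described above):

    // Fetch the PageSpeed result for a URL and rank rules by their impact.
    async function ruleImpacts(url: string) {
      const endpoint =
        "https://www.googleapis.com/pagespeedonline/v2/runPagespeed?url=" +
        encodeURIComponent(url);
      const res = await fetch(endpoint);
      const data = await res.json();
      const rules = data.formattedResults.ruleResults as Record<string, { ruleImpact: number }>;
      return Object.entries(rules)
        .map(([name, rule]) => ({ name, impact: rule.ruleImpact }))
        .sort((a, b) => b.impact - a.impact); // biggest wins first
    }

    ruleImpacts("https://example.com").then(console.table);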
In my experience the most important factor is file size: HTML + CSS + JS + images (web fonts are not counted), so you get the most benefit by optimizing these files or by tricking the tool.
Gzip compression benefits taper off after the "7" level, IIRC. Brotli at higher settings compresses better, but you'd want to compress statically rather than on the fly (at the higher/highest settings).
I've tried that as well, but you need a brotli module for nginx. No problem if you're using Docker, but I just used the mainline distribution of nginx on Ubuntu 16.
I'm curious if brotli's decompression is still fast on high compression levels. This is the case for gzip IIRC.
> I'm curious if brotli's decompression is still fast on high compression levels. This is the case for gzip IIRC.
Yes, it's optimized for fast decompression. Compression, on the other hand, is comparatively expensive. This trade-off makes Brotli very interesting for static assets.
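One way to take advantage of that without compressing on the fly is to pre-compress at build time and let the server serve the .br/.gz files (e.g. via nginx's gzip_static, or brotli_static from the brotli module mentioned above); a sketch using Node's built-in zlib, assuming a ./dist directory of text assets:

    import { readdirSync, readFileSync, writeFileSync } from "node:fs";
    import { join } from "node:path";
    import { brotliCompressSync, gzipSync, constants } from "node:zlib";

    const dir = "./dist"; // assumed build output directory

    for (const name of readdirSync(dir)) {
      if (!/\.(html|css|js|svg|json)$/.test(name)) continue; // text assets only
      const file = join(dir, name);
      const buf = readFileSync(file);
      // Max-quality Brotli is slow, but that doesn't matter at build time.
      writeFileSync(file + ".br", brotliCompressSync(buf, {
        params: { [constants.BROTLI_PARAM_QUALITY]: 11 },
      }));
      // Gzip fallback for clients that don't accept br.
      writeFileSync(file + ".gz", gzipSync(buf, { level: 9 }));
    }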
Google Page Speed is an awful tool. For example, it often advises compressing images and even suggests downloading "optimally" compressed versions. Those images have awful quality, with visible compression artifacts. The person who chose this quality level is probably blind.
It also often gives crazy advice, for example something like "embed the CSS code necessary to display the top of the page right into the page". Not only is it difficult to implement (what exactly is "the top of the page"? how do you work out which part of the code is necessary? what if the user scrolls down while the page is loading?), it also makes the pages larger and doesn't allow the browser to cache the embedded CSS. So you have to invest a lot of time and your page will probably load slower as a result.
But at the same time this tool doesn't notice obvious things. For example, it never advises removing or replacing web fonts, although web fonts are heavy, can cause problems rendering text (they can lack glyphs, for example Cyrillic or CJK characters) and block text rendering. And they are really unnecessary in most cases, because built-in system fonts usually look better at small sizes (at least on Windows).
It also never advises removing ads, although ads often cause high CPU and memory consumption. Why is that, I wonder.
The tool always tells you to move JS scripts to the bottom of the page. But is that always the best idea? I think in most cases it is not. For example, if you have a button with an onclick handler, you will get a JS error if the button is clicked before the scripts at the bottom of the page have loaded (the problem of dead JS controls). If you put a small JS script at the top of the page instead, there will be no error. And what if you have an SPA? In that case, the earlier you start loading the scripts, the sooner the user will be able to see the data.
It also always recommends using a CDN, which is not always a good idea. For example, if you have a site in Russia or China and move your static files to a CDN, the CDN might be farther from the user than your own server, and even worse, it can get blocked so your site won't load at all. Adding a CDN often doesn't make performance any better, but it reduces reliability and increases your hosting bill.
So Google's tool might help you identify problems with your site, but you should never blindly trust any of the recommendations. Please leave client-side optimization to people who have expertise in this area.
By the way, I had to lower the image quality to get a better score (and follow other useless or even harmful recommendations), because our management believes that the score given by this tool (it is Google's official tool, after all) affects SEO rankings, and obviously those are more important than image quality.
It also recommends that you "leverage browser caching", i.e. set unconditional caching of every static resource for a week. Remember that this means you will have to implement a way to reset the cache when you deploy new versions of static files, which can be quite costly to do correctly on a legacy site. By the way, you can trick Google by enabling caching for a week and then appending the current time to every link (http://example.com/file.css?t=12:00:00). That way you get a good score AND your users always get the latest versions of the files.
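A sketch of that last trick as a template helper (the names are made up), along with the content-hash variant you would use if you actually wanted the cache to keep working between deploys:

    import { createHash } from "node:crypto";
    import { readFileSync } from "node:fs";

    // The trick: a fresh query string on every render, so the browser
    // re-downloads each time but always sees the latest file.
    export function bustByTime(path: string): string {
      return `${path}?t=${Date.now()}`;          // e.g. /file.css?t=1513238400000
    }

    // The real fix: key the URL on the file's contents, so it only changes
    // (and only busts the cache) when you actually deploy a new version.
    export function bustByHash(urlPath: string, filePath: string): string {
      const hash = createHash("md5")
        .update(readFileSync(filePath))
        .digest("hex")
        .slice(0, 8);
      return `${urlPath}?v=${hash}`;
    }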
The article says "Then, inline your CSS and JS instead of making external resource calls to them." and this is actually what makes your page heavier.
> Finally, I was having a lot of trouble getting the Google Analytics script on my site to not be render-blocking
It is easy; I don't understand what the trouble was.
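For reference, the stock analytics.js snippet already does this by injecting the library with an async script element; roughly, in TypeScript terms (UA-XXXXX-Y is a placeholder property ID):

    // Queue ga() calls locally until analytics.js arrives, then load it
    // asynchronously so it never blocks rendering.
    const w = window as any;
    w.ga = w.ga || function () {
      (w.ga.q = w.ga.q || []).push(arguments);
    };
    w.ga.l = Date.now();

    w.ga("create", "UA-XXXXX-Y", "auto");
    w.ga("send", "pageview");

    const s = document.createElement("script");
    s.async = true;
    s.src = "https://www.google-analytics.com/analytics.js";
    document.head.appendChild(s);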
> it often advises compressing images and even suggests downloading "optimally" compressed versions
Compressing images is useful, and this advice can be beneficial – though I'm sure it's more effective to optimise them yourself rather than relying on the Google-generated versions.
"embed CSS code necessary to display the top of the page, right into the page".
This is good advice. For larger sites with more extensive stylesheets, this approach allows the initial 'above-the-fold' content to be rendered sooner, before the full stylesheet has loaded. Just because it doesn't cover every case – like when users scroll during load – does not render it ineffective. It's also neither difficult to implement, nor does it add enough weight to pages to be a practical issue (unless you're inlining 100kB of CSS, in which case – rethink your stylesheet!)
> For example, it never advises removing or replacing web fonts
It would be a bit silly to do so; web fonts are added on purpose, and advising that they be removed is pretty useless, along the lines of "you can speed up your website by removing images". It would be nice if the tool supported analysis of web font use though, such as testing whether particular styles are actually used, or whether the font could be effectively subsetted.
> It also never advises removing ads, although ads often cause high CPU and memory consumption. Why is that, I wonder.
Again, it would be useless to tell someone who has deliberately added advertising to their website to remove it.
> The tool always tells you to move JS scripts to the bottom of the page. But is that always the best idea? I think in most cases it is not.
Yes, it is – either that, or loading the scripts with defer or async.
> you will get a JS error if the button is clicked before the scripts at the bottom of the page have loaded
This is only the case if you write bad Javascript – binding to DOM elements and their events should be done once the DOM is ready, not with inline handlers that assume the script has already executed.
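In other words, something along these lines instead of an inline onclick (the element id is just an example):

    // Attach the handler once the DOM exists. If the user clicks before the
    // script has run, nothing happens - but nothing throws either, unlike an
    // inline onclick referencing a function that hasn't been defined yet.
    document.addEventListener("DOMContentLoaded", () => {
      const button = document.getElementById("save-button");
      button?.addEventListener("click", () => {
        console.log("save clicked"); // real handler goes here
      });
    });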
> And what if you have an SPA? In that case, the earlier you start loading the scripts, the sooner the user will be able to see the data.
If you have an SPA, then either there is no real markup on the page (and the difference will be minimal) or you are pre-rendering it, and the content will be visible anyway.
> It also always recommends using a CDN, which is not always a good idea.
Using a CDN is almost always a good idea unless you have a very specific reason not to do so.
> So Google's tool might help you identify problems with your site, but you should never blindly trust any of the recommendations. Please leave client-side optimization to people who have expertise in this area.
I would say this much is obvious – it's a tool, and like any other, experienced professionals can use it to make appropriate decisions. It can help with that.
> SEO rankings, and obviously those are more important than image quality.