I wonder if anyone has studied the impact of latency on user behavior while considering the impact of user expectations from their typical connection speed. Whenever I see an article about page speed optimization, the assumption is that a user will give up if a page takes too long to load, and that everyone gives up after X seconds. Usually X is about 7s based on a Nielsen article from years and years ago.
The thing is though, a user who frequently uses satellite Internet or a 2G mobile connection will learn that pages take a while to download over that connection, and they will adjust their expectations accordingly. Satellite Internet users aren't giving up on pages after 7 seconds. They're waiting because they know the page will load slowly.
I suspect most users wait for a page if most pages are slow. So long as your website is no slower than the average then you're probably not losing many visitors.
Obviously that's not to say you shouldn't make your website as fast as you can. You should. Everyone will appreciate the effort. But don't assume that knocking a few seconds off the time to interactive (TTI) will actually impact your conversion metrics. It probably won't (but only a proper study can prove it either way).
I don’t have a citation handy, but this is something that Google has famously studied. They claimed years ago that a tiny increase in the loading time of the Google home page led to a measurable decrease in the number of searches (like, a few percent).
Not everyone is operating “at Google scale”, of course, but in aggregate the effect is real and faster-loading pages have better metrics.
So, most satellite internet users won’t give up just because your page is slow, but some of them will, and even the ones that persevere will likely appreciate a faster page.
>most satellite internet users won’t give up just because your page is slow
My point is that 'slow' is a relative measure against other websites, not an objective value. So long as your website is faster than a typical website then users won't give up. Trying to be faster than average is worthwhile, but trying to be as fast as possible might not bring any additional value.
It's also possible (likely, in my opinion) that the website itself is important too. People expect Google to be fast because it's made by the biggest internet company there is, and perhaps because the page itself is very simple. It's just a textbox on a page. Surely it should be fast. That doesn't mean people have the same expectations for grannys-discount-wool.com.
This is a complex problem that probably doesn't have a straightforward answer. Using the mantra 'make your website as fast as possible' means you won't get it wrong, so it's worth doing, but you almost certainly get diminishing returns for that effort as you optimize for things like shaving off milliseconds and bytes.
Unless you're Google.
Although... that said... if Google really thought sites should be as fast as possible, wouldn't they make a smaller GoogleTagManager.js script?
>This is a complex problem that probably doesn't have a straightforward answer. Using the mantra 'make your website as fast as possible' means you won't get it wrong, so it's worth doing, but you almost certainly get diminishing returns for that effort as you optimize for things like shaving off milliseconds and bytes.
Yeah, that's fair. My point (really just repeating hearsay -- I should check that Google research still stands up!) is just that load time optimization apparently has a bigger benefit than you might expect, and the diminishing returns don't kick in until much later than you'd expect. Edit to add: and the post being discussed has a good point that there's a performance shelf at ~14KB, so a small improvement can have a big impact.
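For anyone wondering where the ~14KB shelf comes from: it falls out of TCP slow start. A rough back-of-envelope, assuming the common default initial congestion window of 10 segments (RFC 6928) and a typical MSS of 1460 bytes:

```python
# Back-of-envelope for the ~14KB "shelf": with TCP slow start, the
# server can send only the initial congestion window's worth of data
# before it must wait a full round trip for ACKs.
INITCWND_SEGMENTS = 10   # common Linux default per RFC 6928
MSS_BYTES = 1460         # typical MSS for a 1500-byte MTU

first_flight = INITCWND_SEGMENTS * MSS_BYTES
print(f"First round trip can carry about {first_flight} bytes "
      f"(~{first_flight / 1024:.1f} KiB)")
# A response that fits in this budget costs one RTT; one byte more
# costs a second RTT, which is why the shelf is so sharp on
# high-latency links like satellite.
```

So a small size reduction that crosses that boundary saves a whole round trip, which on a 600ms-RTT satellite link is very noticeable.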
>Although... that said... if Google really thought sites should be as fast as possible, wouldn't they make a smaller GoogleTagManager.js script?
A good question! They're definitely not the most consistent. The left hand says one thing, some of their thousand right hands do the exact opposite...
Might be a competition thing. It would take a big delta in time-to-load for me to switch from Google to Bing. But if wsj.com takes 1s more than cnn.com maybe next time I pull out my phone on the train I'll switch publications.
>My point is that 'slow' is a relative measure against other websites, not an objective value.
No, it is an objective measure that is rooted in human biology and psychology. Our perception of HCI is based on various response time limits that define whether we consider interactions seamless or attention-retaining. For websites, those response time limits are often exceeded in various circumstances, even in urban areas of developed countries (e.g. poor cellular network coverage inside a building results in reduced speeds and higher bounce rates).
>My point is that 'slow' is a relative measure against other websites, not an objective value. So long as your website is faster than a typical website then users won't give up.
You're competing against more than other web sites, and data shows this isn't true at all in any case. There are certain "thresholds" beyond which users will stop visiting much more frequently, but yeah.
We ran some A/B tests where we intentionally slowed our site down (at the webserver page generation level) by set percentages (0%, 10%, 25%, and 33% IIRC). Those would be against a baseline page generation time of around 2000ms, so the slowdowns were not small.
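(For the curious, a minimal sketch of how that kind of injected slowdown can work; the bucket names and baseline here are illustrative, not our actual stack:)

```python
import random
import time

# Illustrative A/B slowdown: delay each response by a fixed percentage
# of the measured page generation time, so the slowdown scales with
# the real work done. Bucket names and percentages are assumptions.
SLOWDOWN_BUCKETS = {"control": 0.0, "b": 0.10, "c": 0.25, "d": 0.33}

def generate_page():
    # Stand-in for real page generation (~2000ms baseline in our test).
    time.sleep(0.01)
    return "<html>...</html>"

def handle_request(bucket):
    start = time.monotonic()
    body = generate_page()
    elapsed = time.monotonic() - start
    # Sleep for the bucket's percentage of actual generation time.
    time.sleep(elapsed * SLOWDOWN_BUCKETS[bucket])
    return body

# Each visitor is (sticky-)assigned a bucket; sketched here as random.
bucket = random.choice(list(SLOWDOWN_BUCKETS))
page = handle_request(bucket)
```

In practice you'd pin the bucket to the visitor (cookie or user ID hash) so each session sees a consistent speed.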
All tests showed a drop in traffic, conversion, and average session value indistinguishable from zero. Our hypothesis is that our site (where a converting visitor is on average designing and ordering a custom product) is one where the difference between 30 minutes and 31 minutes is not meaningful and a lot of the time is spent in the web-based document editor. It would not generalize to a "give me a quick search result", "sell me an item from a warehouse shelf", or "let me argue with someone on a forum".
This was of course disappointing, because we ran this test as a sanity check after having done an extensive project to improve the page speed across the main conversion funnel. The project was technically successful, but a business zero (probably for the same hypothesis as above), so we decided to test in the other direction, because that's easier to implement than to force faster pages.
Was it linear though? I understand how you can lose 50% of revenue with a 5-second delay per click. But thinking that this has no “breakpoints” is naive.
Also, personally when shopping for e.g. jeans and tshirts I’d rather just wait for all items (json) and thumbnails to load (they can do that in the background, except for the first page) and then filtering/search/pagination would work instantly. You can lose half an hour in total dealing with these shitty filtering sidebars which reload everything every time. Why aren’t e-shops just semi-local apps over an on-demand synchronizable full-blown localstorage DB? Nobody does this sort of precaching. Do they know how much revenue they are losing? It doesn’t add up.
That's a claim that they made in 2006. Could there be a reason why Amazon said that a slow website would have a massive impact on ecommerce sales back then? Was there some other Amazon product that would greatly benefit from people believing they had to use the fastest hosting service possible for their online store?
Well then, since Google these days is slow as hell on slow internet, I guess users with slow internet are no longer important.
When trying to find something on Google while visiting smaller towns in Germany, it's often down right unusable. Then again, I've been used to broadband and 4G for many years now.
Page / ad impressions is another metric for Google; that's one reason why they have made such big improvements in the speed of the internet and browsing with Chrome, their own DNS, HTTP/2 and 3, etc.; the reasoning being that the faster people can browse, the more ads they will see, and the more revenue Google gets. And it bought them a lot of goodwill.
But this may apply on a smaller scale too. For Amazon, it's giving people less time to rethink their impulsive decision.
At one of my recent jobs (EU-wide consumer healthtech) we managed to reduce bounce rate by addressing a number of issues indicated by DevTools, mostly focusing on time to render and time to interact. It indeed positively impacted conversion, and I guess it's common knowledge among marketing teams by now that this works.
I think it's fairly obvious that a faster website is better. I'm not disputing that. What I'm saying is that there is a point where improving your website when it's already better than competing websites stops giving you enough value to be worthwhile. If you're in the bottom half of websites then you're giving up money and you need to improve. If you're in the top half, or top quarter, or whatever then maybe that engineering effort could be used to generate more revenue in more effective ways than just cranking up perf.
You could probably improve speed forever and always see a measurable improvement, but if that's costing more than you're seeing in added conversions, or you're doing it instead of improving features that would net a greater return, then you're doing the wrong thing.
>What I'm saying is that there is a point where improving your website when it's already better than competing websites stops giving you enough value to be worthwhile.
There's indeed a point where optimizations stop yielding meaningful results, but it is certainly not connected to the performance of competitor websites. As soon as a user lands on your website, only its own performance and accessibility matter, and positive ROI can still be achieved well beyond the point when you start doing better than the competition.
Where did you get this idea about competing websites from?
Not competing websites, but other, average websites in general. Say the average HN page loads in 1000ms and that puts it around the top 15% fastest. If your website loads in 1100–1200ms, it will be okay. OP's point is that if you're able to speed it up to the 800–900ms range, it may not result in much higher conversion.
However, if your site is in the 1800ms range, it's worthwhile to improve it to the 1200ms range.
I don't think that it works this way. If bounce rate matters for you, because your website is part of a sales conversion funnel, then you do not benchmark against a random "average" website like HN: you do not build your site the same way, your audience is likely different from many different angles, and you cannot really measure the response time under representative conditions (because you likely do not know them). The only sensible benchmark you can have is the performance of your own website. It makes sense to collect some low-hanging fruit, but further optimizations generally make sense when you have enough traffic to reach statistical significance in A/B tests. And those tests will not always favor the more optimized version, because content matters too, and better content may outperform a faster version in terms of conversion, so it will often be two steps forward and one step back.
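To illustrate why low-traffic sites struggle to reach significance, here's a standard two-proportion z-test sketch on made-up conversion numbers (all figures hypothetical):

```python
import math

# Two-proportion z-test for an A/B conversion experiment.
# All visitor/conversion counts below are invented for illustration.
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 3.0% vs 3.3% conversion on 10,000 visitors per arm: a 10% relative
# lift that sounds worthwhile, yet |z| < 1.96, i.e. not significant
# at p < 0.05 with this much traffic.
z = two_proportion_z(300, 10_000, 330, 10_000)
print(f"z = {z:.2f}")
```

A real 10% relative lift needs several times more traffic per arm before it reliably clears the significance bar, which is exactly the problem for smaller sites.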
> I wonder if anyone has studied the impact of latency on user behavior while considering the impact of user expectations from their typical connection speed. Whenever I see an article about page speed optimization, the assumption is that a user will give up if a page takes too long to load, and that everyone gives up after X seconds. Usually X is about 7s based on a Nielsen article from years and years ago.
N=1, but I think I am quicker to close a tab on a website that lazy-loads and/or hides content for a second or three right after showing me the thing I came for than on a tab that is simply slow to load.
Seeing things like "SportsShoes.com found that faster-than-average mobile visits were 41% more likely to convert than slower-than-average visits." suggests I might be right...
Back in 2006, Amazon found that every 100ms of delay cost them 1% of sales. Google similarly found that a 500ms increase caused search traffic to drop 20%.
These are older anecdotes, but there's no reason to think that people have gotten more patient as internet connections have become high-speed.