
>most satellite internet users won't give up just because your page is slow

My point is that 'slow' is a relative measure against other websites, not an objective value. As long as your website is faster than the typical website, users won't give up. Trying to be faster than average is worthwhile, but trying to be as fast as possible might not bring any additional value.

It's also possible (likely, in my opinion) that the website itself is important too. People expect Google to be fast because it's made by the biggest internet company there is, and perhaps because the page itself is very simple. It's just a textbox on a page. Surely it should be fast. That doesn't mean people have the same expectations for grannys-discount-wool.com.

This is a complex problem that probably doesn't have a straightforward answer. Using the mantra 'make your website as fast as possible' means you won't get it wrong, so it's worth doing, but you almost certainly get diminishing returns for that effort as you optimize for things like shaving off milliseconds and bytes.

Unless you're Google.

Although... that said... if Google really thought sites should be as fast as possible, wouldn't they make a smaller GoogleTagManager.js script?



>This is a complex problem that probably doesn't have a straightforward answer. Using the mantra 'make your website as fast as possible' means you won't get it wrong, so it's worth doing, but you almost certainly get diminishing returns for that effort as you optimize for things like shaving off milliseconds and bytes.

Yeah, that's fair. My point (really just repeating hearsay -- I should check that Google research still stands up!) is just that load time optimization apparently has a bigger benefit than you might expect, and the diminishing returns don't kick in until much later than you'd expect. Edit to add: and the post being discussed has a good point that there's a performance shelf at ~14KB, so a small improvement can have a big impact.
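
For context on that ~14KB shelf: it comes from TCP slow start. With the RFC 6928 initial congestion window of 10 segments and a typical ~1460-byte MSS, roughly 14.6KB fits in the first round trip, and each subsequent round trip roughly doubles the window. Here's a back-of-the-envelope sketch in Python (the constants and the idealized doubling are simplifying assumptions; real stacks and networks vary):

    # Idealized TCP slow start: how many round trips to deliver a payload?
    INIT_CWND_SEGMENTS = 10   # RFC 6928 initial congestion window
    MSS_BYTES = 1460          # typical MSS on an Ethernet path

    def round_trips(payload_bytes: int) -> int:
        """Round trips needed, with the window doubling each round trip."""
        window = INIT_CWND_SEGMENTS * MSS_BYTES  # ~14.6KB in the first RTT
        trips, sent = 1, window
        while sent < payload_bytes:
            window *= 2
            sent += window
            trips += 1
        return trips

    print(round_trips(14_000))  # 1 -- fits in the first round trip
    print(round_trips(15_000))  # 2 -- just over the shelf costs a whole extra RTT

So a page that squeezes under the shelf saves an entire round trip, which on satellite or poor cellular links can be hundreds of milliseconds.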

>Although... that said... if Google really thought sites should be as fast as possible, wouldn't they make a smaller GoogleTagManager.js script?

A good question! They're definitely not the most consistent. The left hand says one thing, some of their thousand right hands do the exact opposite...


Might be a competition thing. It would take a big delta in time-to-load for me to switch from Google to Bing. But if wsj.com takes 1s longer than cnn.com, maybe next time I pull out my phone on the train I'll switch publications.


>My point is that 'slow' is a relative measure against other websites, not an objective value.

No, it is an objective measure, rooted in human biology and psychology. Our perception of HCI is based on various response-time limits that determine whether we consider an interaction seamless and whether it retains our attention. For websites, those limits are often exceeded, even in urban areas of developed countries (e.g. poor cellular coverage inside a building reduces speeds and raises bounce rates).
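
The limits being referred to are presumably the classic ones from the HCI literature (Nielsen's 0.1s / 1s / 10s response-time thresholds). A small illustrative sketch, purely to make the bands concrete (the wording of each band is mine, not from the thread):

    # Nielsen's classic response-time limits (Usability Engineering, 1993)
    def perceived(latency_s: float) -> str:
        if latency_s <= 0.1:
            return "instantaneous -- feels like direct manipulation"
        if latency_s <= 1.0:
            return "noticeable delay, but flow of thought is unbroken"
        if latency_s <= 10.0:
            return "attention wanders; progress feedback is needed"
        return "attention lost; users start abandoning the task"

    for t in (0.05, 0.5, 3.0, 12.0):
        print(f"{t}s: {perceived(t)}")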


>My point is that 'slow' is a relative measure against other websites, not an objective value. So long as your website is faster than a typical website then users won't give up.

You're competing against more than other websites, and in any case the data shows this isn't true at all. There are certain "thresholds" beyond which users stop visiting much more frequently.


https://danluu.com/ loads super-fast for some reason. I definitely noticed it. I bookmarked this blog partly because it loads so quickly.


Actually, I was thinking of prog21.dadgum.com



