
Ooof, 115KB gzipped. That's not exactly featherweight.


Tell me what I'm missing here - my project requires downloading N resources over N separate requests. Say I host all my dependencies myself: CSS files, font files, images, and JS files. Assuming I turn on gzip at my web server (nginx, let's say) and set cache headers so that every resource downloaded doesn't expire for a year, then the files download once, on the initial page load. That first load is a hit, but not unlike downloading a decent-sized image file, which we all do without complaint on a daily basis.
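For concreteness, a minimal nginx sketch of that setup could look like the following; the /static/ path, the compressed MIME types, and the one-year lifetime are illustrative assumptions, not anything from the comment above:

    # Compress text assets on the way out
    gzip on;
    gzip_types text/css application/javascript image/svg+xml;

    # Far-future caching for static assets; "immutable" is only safe
    # if the URL changes whenever the content changes (see below)
    location /static/ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }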

Sure, I get it - a whole bunch of stuff has to come down before the load event fires, and during that time the user is looking at nothing. What I like to do is set up the default screen as a fixed-position blocking layer with my company logo on it, and the last thing my load handler does is remove it. On the initial load they see it for about 1.4 seconds. Subsequent reloads from browser cache it's up barely long enough to notice.
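That overlay amounts to a few lines of HTML and JS. A minimal sketch, assuming it sits at the top of <body>; the element id and logo path are made up for illustration, and note the load event actually fires on window rather than document:

    <div id="boot-screen" style="position:fixed; inset:0; z-index:9999;
        background:#fff; display:flex; align-items:center; justify-content:center;">
      <img src="/logo.svg" alt="Loading">
    </div>
    <script>
      // Remove the blocking layer once all resources have finished loading
      window.addEventListener('load', function () {
        document.getElementById('boot-screen').remove();
      });
    </script>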

So the problem of initial load is easily managed, and honestly not that big a problem in the first place. Simply set up caching on the web server to persist the files as long as possible (I think a year is the max in most setups now) and problem solved, right?

Now, occasionally I run across some browser on some platform that's greedy about caching and won't pay proper attention to cache headers or replace files with newer versions when they come down. It's a little more of a pain, but still well worth it, to simply add a fingerprint to each remote resource fetched:

<link rel="stylesheet" href="/foo/bar.css">

becomes:

<link rel="stylesheet" href="/foo/bar.css?id=FINGERPRINT">

Now at build or deploy time I run a little script that grabs the current time in seconds and uses sed to replace FINGERPRINT with it (e.g. 123888238823482834), so that the browser sees:

<link rel="stylesheet" href="/foo/bar.css?id=123888238823482834">
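That build step can be a couple of lines of shell. A sketch, assuming GNU sed and that the built HTML containing the literal FINGERPRINT placeholder lives under a hypothetical ./dist directory:

    #!/bin/sh
    # Stamp every HTML file with the current Unix time (seconds),
    # producing a unique URL per deploy. (GNU sed; on macOS use sed -i '')
    STAMP=$(date +%s)
    find ./dist -name '*.html' -exec sed -i "s/FINGERPRINT/$STAMP/g" {} +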

A URL like that is unique, so it forces the browser to pull down the new asset. This is easy to do for resources referenced in the <head> section and more awkward for images, but there you just rename the image from "image" to "image_v2" on edits or changes and the problem goes away. If you really want to get tricky about it, it's easy enough to iterate over the image directory, bumping the "*.jpg" files to a new version number and making the same replacements in the HTML and JS files.
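A rough sketch of that image pass, assuming a flat ./images directory, unversioned file names, and a version number passed as the first argument (all hypothetical details, not the commenter's actual tooling):

    #!/bin/sh
    # Usage: ./bump_images.sh 2   ->   photo.jpg becomes photo_v2.jpg
    V="$1"
    for f in ./images/*.jpg; do
        base=$(basename "$f" .jpg)
        mv "$f" "./images/${base}_v${V}.jpg"
        # Rewrite references in HTML and JS files (GNU sed; naive match,
        # so names must not contain regex metacharacters)
        find . \( -name '*.html' -o -name '*.js' \) \
            -exec sed -i "s/${base}\.jpg/${base}_v${V}.jpg/g" {} +
    done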

Now new files are fetched once and only once per device, and the page weight of a particular JS file becomes practically irrelevant.


Loading cached JS can still be very slow. https://www.webperf.tips/tip/cached-js-misconceptions/ has a good explanation of the bottlenecks involved.

> Subsequent reloads from browser cache it's up barely long enough to notice.

Are you measuring the time on your personal machine, or on a machine that represents what your typical visitor is using? If you're using a recent MacBook, that's going to have very different performance characteristics than, say, an old Android phone. Something that's instantaneous on a MacBook could take ages on an old Android.


Thanks for the link - I'll read up on that!


> Now new files are fetched once and only once per device, and the page weight of a particular JS file becomes practically irrelevant.

It's not just download speed -- it's the parsing and execution of the script that takes CPU and memory. 115KB of JS is much heavier than e.g. 115KB of JPEG.



