Don't agree with this take. Sacrificing performance for all my users so a minuscule percentage of them can poke around a little more easily? All the JavaScript is likely transpiled anyway, and they can use dev tools to unminify most of it. Deploying source maps for production might be a better ask?
How big is your JS that minification has a measurable performance impact? In my experience, minification doesn't add much (if anything) on top of gzip + cache for most applications.
Yeah, the way compression works, I wouldn't expect minification to reduce transfer sizes to any significant degree, as it's basically layering a good compression algorithm on top of a bad one. In general, compression-algorithm matryoshka dolls tend to be inferior in size and speed compared to just using one good algorithm.
All of the comments on this post that include numbers show that using both is better than just one or the other, especially when the code is sufficiently large.
I agree with the recommendation of one good compression algorithm over combining multiple in general, but is it possible you hadn't considered that this is a case where a very good lossless compressor (gzip) still benefits from being combined with a lossy one (minification)? As wonderful as Huffman coding can be, it's hard to compete with outright throwing away data that's "unnecessary" (to many users, maybe not OP and some of us).
It does still sort of come off as majoring in the minors.
As illustrated by the examples in this thread, the actual benefit of this isn't particularly large, and it comes at the expense of a worse developer experience. At the point where this would actually have a real, measurable performance impact, your application is so hideously bloated that there are, without a shred of doubt, other things you can do to improve its performance that are more impactful than minification.
Yep. This is basically like saying "don't distribute compiled binaries".
I'm sure some people here will agree with that sentiment, but for the majority of users it's the better option. And if the goal of the project is for the source to be open then they can host that source somewhere for those who want to tinker, inspect, or build their own binaries.
At some point most web apps will probably be blobs of WASM anyway.
"Sacrificing the performance so a miniscule percentage of them can poke around a little easier?"
Wondering how to reconcile this with the never-ending didactic missives from software developers about writing code that others can easily understand, and, more generally, with the affection for verbose languages and disdain for terse ones.
There is also a trust problem: how can we be sure that minification is only for the purpose of performance and not also for the purpose of obfuscation? For example, from the user's perspective, performance is routinely sacrificed to allow for advertising. Users must wait and allow their computer's resources and network bandwidth to be usurped for telemetry, ad auctions and ads they really do not want to look at. This is almost always orchestrated using JavaScript.
My HR web service, used to check if colleagues are on paid leave, has a paginated view of an HTML table of paid leaves, 30 people per page. A single page contains about 2 million space characters, which HTML rendering ignores in the vast majority of cases (the exceptions being when they actually separate symbols and when they're present in raw text rendered as-is).
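For scale, here's roughly what those spaces cost over the wire once gzip is involved; a rough Node sketch, assuming the server actually compresses its responses (the waste that remains is mostly parse time and memory):

    // how much do ~2 million space characters cost after gzip?
    const zlib = require('zlib');
    const spaces = ' '.repeat(2_000_000);
    console.log(spaces.length);                 // 2000000 bytes raw
    console.log(zlib.gzipSync(spaces).length);  // on the order of a couple of KB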
There are so many more ways to improve performance without minifying code.
Do you have any strong example websites? The website you linked is very simple; I can't see why most pages on it would benefit from using JS at all, except for analytics or similar. It's a good website, I don't mean to make it sound otherwise, but it's not the type that should consider minifying code in the first place.
Minification's more useful for web applications, along the lines of MS Office (e.g. Excel), Zapier, UberEats, Photoshop, and the reporting side of analytics (e.g. Google Analytics), where you get all the interactive graphs and maps.
I think many of the disagreements on here would dissolve if they were only considering blogs and similarly small websites.
Self-reply: I see that the website links to tldraw under Job, which might be a good example to weigh the shipped size vs the same thing minified, except that it's already minified.
Low footprint by default; anyone who wants to inspect then downloads the extra, and for bonus points can even see the original TypeScript version instead of the JS.
Preempting because I know we tend to be a little out of touch here: yes, there are still plenty of cell plans without unlimited data, mostly among the pay-per-month options.
Very few websites have optimised their assets and number of connections enough that the size of the code is the bottleneck. Moreover, code tends to be highly compressible, so I doubt it really saves much bandwidth. All the JavaScript build systems out there already do it, though; not doing it would require people to go out of their way. I think that's ultimately the main reason people do it.
There are different aspects of minification though, right?
I don't know if you count tree shaking but I do know my main project at work has 60,000 files in node_modules and if all of those were incorporated in my bundle that would be a big problem. The "modern web stack" just wouldn't work without it.
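To make the tree-shaking point concrete with a toy example (made-up file names; assuming an ESM-aware bundler like esbuild or Rollup):

    // utils.js - only the export that's actually imported survives bundling
    export function formatDate(d) { return d.toISOString().slice(0, 10); }
    export function hugeUnusedHelper() { /* imagine thousands of lines here */ }

    // main.js - the entry point only pulls in formatDate
    import { formatDate } from './utils.js';
    console.log(formatDate(new Date()));

    // with tree shaking, hugeUnusedHelper is dropped from the output bundle;
    // without it, everything reachable from node_modules gets shipped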
I've done a lot of photo projects where asset optimization is pretty important because it is easy to get in a place where your site costs 4x as much to run as it should and just as easy to get in a place where you compressed files too much and your images suck.
I am getting into WebXR and facing even tougher problems of asset optimization since I do want to support 90 fps on the not-too-terribly-fast generic ARM Meta Quest 3. I'm sure there is more than one reason you can't upload images to Horizon Worlds but avoiding too many textures with too much detail has to be one of them.
Using the largest, unminified, single file I could find in my node_modules folder (which happened to be tsserver.js from TypeScript, 11 MB and 185,000 lines long)
Most people aren't making webpages that need a quarter million lines of JS. For a normal-sized webpage, the difference between raw gzipped and minified gzipped is negligible.
Or maybe even "modern front end" hello world apps now need a quarter of a million lines because "modern" JS devs use super mega react typescript which installs 1000 npm dependencies and requires a 30 second "build" to generate a huge monolithic minified tree-shook js blob. I wouldn't know, I just use vanilla JS for my web pages.
For a small file (github.com/runk/node-chardet, release version 0.7.0, index.js, 154 lines, 4 KiB):
Raw - 3.30 KiB
Minified - 2.05 KiB
Gz - 1.01 KiB
Minified+Gz - 0.81 KiB
That's roughly a 20% reduction over gzip alone (0.81 KiB vs 1.01 KiB)
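For anyone who wants to reproduce this kind of comparison on their own files, a rough Node sketch (assuming terser is installed; any minifier will do):

    // usage: node measure.js path/to/file.js
    const fs = require('fs');
    const zlib = require('zlib');
    const { minify } = require('terser');

    (async () => {
      const raw = fs.readFileSync(process.argv[2], 'utf8');
      const min = (await minify(raw)).code;
      const kib = (s) => (Buffer.byteLength(s) / 1024).toFixed(2) + ' KiB';
      console.log('raw         ', kib(raw));
      console.log('minified    ', kib(min));
      console.log('gz          ', kib(zlib.gzipSync(raw)));
      console.log('minified+gz ', kib(zlib.gzipSync(min)));
    })();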
Still huge. If minifying before gzipping multiplies the size of my transfers by a factor between 0.75 and 0.40, I'm not skipping it for the one oddball that wants the sources in a human-readable format on the production environment.
They can use the source maps, or find the files on GitHub.
My point is that for most people, the size savings of minification are going to be minuscule, because most people are dealing with small JS files. Not only that, but it makes debugging a much bigger pain: you start needing to generate and distribute map files, you can no longer easily observe or modify state from the dev console, etc.
I'm not completely opposed to minification. Distributing some self-contained library like OpenCV.js? Minify away. But minifying everything, even application files under active development, just impedes developer velocity IMO for very little benefit. You shave off a couple hundred bytes in exchange for worse development/debugging and ostensibly faster page loads, but realistically the same page loads.
Source maps are a thing, which has the added benefit of letting you ship the comments if you actually care to educate your users on how your website works.
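For anyone who hasn't set this up: with most bundlers it's one flag. A minimal esbuild sketch (the paths are placeholders):

    // build.js - emit a minified bundle plus a source map pointing back at the originals
    require('esbuild').build({
      entryPoints: ['src/app.ts'],   // placeholder entry point
      bundle: true,
      minify: true,
      sourcemap: true,               // writes dist/app.js.map next to the bundle
      outfile: 'dist/app.js',
    }).catch(() => process.exit(1));

By default the map embeds the original sources (comments included) via sourcesContent, so the unminified code is one click away in dev tools.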
It seems you've stirred up the minification cargo cult, who all seem to be ignoring the actual performance data vs compression. Thank you for the article!
Doesn't really matter after compression unless the bulk of your code can be culled from the end product. Tbh the largest business value this has is slowing down people trying to use your internal APIs.
It would be nice to have experimentally-derived numbers to point to in order to help people visualize whether or not minification actually improves compression to any significant degree.
Edit: something important to note: some frameworks (e.g. React) have lots of comments on their un-minified versions, that are removed when minified. That affects their size greatly.
> Edit: something important to note: some frameworks (e.g. React) have lots of comments on their un-minified versions, that are removed when minified. That affects their size greatly.
Nah, on a more serious note, to properly compare the impact of minification, I should remove the comments from the unminified (maxified?) version first. :)
Nah, that wouldn't be fair: the code I serve my users initially has comments too, and the argument that people bring up to criticize minification is that it adds complexity if you want to read the code in the browser. The comments should stay.
Often I avoid comments by using long names for functions and variables (e.g. a test whose name is a statement of the postulate behind it), and minification squashes many of those.
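E.g. (made-up names, but roughly what any minifier's mangler does):

    // before: the name carries the documentation
    function employeeIsOnPaidLeaveDuringRange(employee, rangeStart, rangeEnd) {
      return employee.leaves.some(l => l.start <= rangeEnd && l.end >= rangeStart);
    }

    // after mangling: the "comment" is gone
    function a(n, t, r) { return n.leaves.some(e => e.start <= r && e.end >= t); }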
Compression is waaay more effective, so if you have to pick one, go with compressing. Is having readable source code worth the other 2 KB? That's up to you. Source maps can do the same thing with less, though. Also, modern dev tools have ways to de-minify (if you don't mind all the mangled variable names).
IMO, if minification is the difference between a slow and fast page load, it means you aren't utilizing compression. Minification is more obfuscation than anything, it doesn't really bring much to the compression table.
I don't understand; I'm not going to increase the size of my assets just to please the very small minority of hackers who want to play around. I'd rather open-source my code.
First of all, you're probably bundling your code, stripping comments, and letting tree shaking eliminate parts of it. That's already a form of obfuscation. Then, minification will affect parsing performance; it can even affect whether a function gets inlined by the JIT compiler, because character length is a very cheap heuristic. Whether I care about that or not is my judgement call.
The existence of interpreter opcodes implies you have already compiled the code for the interpreter, but with this heuristic you can send the function straight to the JIT.
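A contrived illustration of the character-length point (whether a given engine version keys its inlining cutoff on raw source length or on bytecode size varies, so treat this as a sketch rather than a guarantee):

    // semantically identical, but the second one's source is far longer
    function add(a, b) { return a + b; }

    function addDocumented(a, b) {
      // ...imagine a few KB of doc comments here...
      return a + b;
    }

    // an engine that caps inlining by source length may happily inline add()
    // into a hot loop but refuse addDocumented(); after minification both look
    // the same to that heuristic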
It's a large web app, sure, but gzip doesn't do 60x compression. 9-10x at best. But it's not a problem. It's for a small-ish group of people, and it's better to do the editing and some processing in the browser than on the server. The app uses relatively few components in that build at the same time, so it's not sluggish. There are also some "assets" in there. I don't like the practice, but my colleague does, so he occasionally puts things like images and fonts in there.
OTOH: it shows data. A normal-size data set is 2,000-3,000 rows by 300-500 columns. If you output that as a simple table, the browser gets super slow, so another component has to be added. It adds up.
Sadly, not minifying source code can often lead to your idiot customer discovering that their web app is not a magic binary blob and complaining that it's "insecure! anyone can see how it works!", which then in turn leads to your idiot PM agreeing to a new "obfuscate all code" feature without bothering to involve anyone technical.
Deliberately obfuscated JS is far more evil, take it from someone who's had to debug 3rd-party obfuscated JS libs.
Minification always struck me as a cargo-cult sort of way for "badass rockstar programmers" to get every last ounce of performance out of their code.
As noted below, minification is much less impactful than compression, but going even further: if you really care about performance, the solution is as simple as not shoving 20 metric boatloads of JavaScript into your website in the first place.
If your argument is "muh rural customers", then maybe you should actually look at how your website is designed and ask yourself if you need to ship 3 different frameworks to accomplish what you need to do, or maybe start by stripping out the bloated analytics and ad nonsense in your code.
At the end of the day the author is right: the argument about whether minification is useful is a technical one, but the business will always want it for the purposes of obfuscating their code, and so we need to push back against the idea of it being an industry norm, otherwise managers will push to make it so.
Soooo long gone are the days of the really compact website
There used to be a contest for best website that would fit into 5k bytes. [0]
Now, the bloat is so bad that there is a serious suggestion to never mind the website, just make people download the entire cargo load and use it locally, merely to get acceptable performance.
If this does not tell us that the entire framework thing has gone too far, abstracted everything from actual speedy code, larded everything up with so much 'just in case' baggage code, etc... IDK what will.
Lots of folks outing themselves as never adding comments with remarks like "compression works nearly as well as minification".
In some of my more logically involved code, the bulk of the bytes are in comments - and they barely compress at all.
Just minify. Nobody cares about your source code; if they do and you want them to, publishing on GitHub is far better as it allows for viewing the TS/ESNext version and separate files too (don't tell me you aren't bundling!!); and working with minified code is not hard, especially now that dev tools have built-in maxifiers.
I publish to github too! It's nice to let people download my stuff and tinker though, straight from the website itself :) They can also tinker there and then in the browser.
I'm not trying to convince you either way – it's clear we have a different set of values.
In the modern era of minifying JS to squeeze out bytes, in certain cases, such as smaller non-commercial sites, it might be better for devs to be able to learn from reading each other's source code.
Why does everything on the web have to serve some random tinkerer who doesn't benefit the creator of the website in any way? How about we apply this thinking to other kinds of products? Does an app have to be designed for tinkerers too? Cars? Rockets? Does everything have to be open source? Why?
I have my personal website on GitHub. It gives much more than "unminified code". And actually, I got one interesting email about a widget I created (weighted sorting of blog posts, so it is up to the user to weigh novelty, popularity, my opinion, etc.).
I think this kind of optimization should be done at another layer: HTTP compression, HTTP parallelization, and that kind of stuff. And just don't make a complex web app, so it will be fast.
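If "another layer" means the server, it's usually a one-liner; a minimal Express sketch using the compression middleware (assuming that's the stack):

    // server.js - gzip every response, no build step involved
    const express = require('express');
    const compression = require('compression');

    const app = express();
    app.use(compression());              // negotiates gzip with the client
    app.use(express.static('public'));   // serves the unminified JS as-is
    app.listen(3000);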
> just don't make a complex web app, so it will be fast
I never thought about that, I will just tell my customers that I removed all non-trivial features to make the app load faster. Also, there won't be any new features. I'll report back on how it goes.
which of course eliminates structures like "X is Y" but I realized I could still write stuff like "With utmost evil, evil webmasters minify their evil web sites for evil reasons that make sense only in the evil world view of evil people"
Minification is the "feel good" optimization for frontend (ahem, full-stack nowadays) developers who want to feel like they are doing FAANG-level optimizations.
For a small/medium site?
Optimizing image sizes/resolutions, keeping the JS dependencies in check so you only ship code that's actually used, or optimizing to get the most important data rendered ASAP are much more important than adding a minifier script to the release pipeline (and I'm assuming gzipping the content is basically done for free everywhere nowadays).
All of this especially in the spirit of TFA, which encourages people to poke around JS code while browsing websites (as people did in the past with pure HTML).