I wonder: should one ever use minified JavaScript code on a server? Assuming you're running it on your own server and not distributing the code to clients.
Well, in theory, yes. When determining whether a specific function can be inlined into its call site, V8 looks at the length of the function's source code to guess whether it's worth it. Functions whose source is longer than 600 characters (including comments) won't be inlined, so calls to them will typically be slower.
Whether that makes any meaningful difference to your application performance really depends on the application. In most cases it won't.
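Here's a rough sketch of how you could observe this on Node.js. All the names are made up, and in a real test the comment in the second function would need to be padded to well over the 600-character limit; if your build exposes V8's --trace_inlining flag, running with it should show which call sites actually get inlined.

    // Hypothetical micro-benchmark: the two callees are identical except
    // that the second one carries a large comment, which (on older V8s)
    // can push its source length past the inlining threshold.
    function addShort(a, b) {
      return a + b;
    }

    function addLong(a, b) {
      // Imagine this comment padded to well over 600 characters so the
      // whole function source exceeds V8's inlining limit.
      return a + b;
    }

    function benchShort() {
      var sum = 0;
      for (var i = 0; i < 1e7; i++) sum += addShort(i, i + 1);
      return sum;
    }

    function benchLong() {
      var sum = 0;
      for (var i = 0; i < 1e7; i++) sum += addLong(i, i + 1);
      return sum;
    }

    console.time('short');
    benchShort();
    console.timeEnd('short');

    console.time('long');
    benchLong();
    console.timeEnd('long');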
That's really odd. In my eyes, source code length is not a good way to measure the semantic size of a function. But what you're saying is very interesting; I learned something new.
This slide deck has a nice example of using a comment to fix a performance bug in a js implementation of the chacha20 cipher: http://mrale.ph/talks/goto2015/#/14
Some libraries use annotations in comments, so those comments can't be stripped. Off the top of my head, I know of some JavaScript testing and coverage tools that do this.
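For example, the istanbul coverage tool (if I'm remembering the syntax right) reads directives out of comments, so stripping them changes its behaviour:

    // Tells istanbul to exclude the next statement from coverage reporting;
    // a minifier that strips comments silently drops the directive.
    /* istanbul ignore next */
    function onlyRunsInProduction() {
      return process.env.NODE_ENV === 'production';
    }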
V8 doesn't use an AST; it uses a CFG. But I believe it comes down to efficiency: it's far cheaper to look at the length of a string than to traverse a graph, and by its nature JS needs very fast compilation times.
This is probably one of those heuristics that works well enough on enough real-world code, even though everyone knows it's suboptimal. I've heard that the TurboFan compiler will remove this limitation, but that's still very much a work in progress.
The only reason I brought it up was the assertion that V8 didn't have an AST. Only the most trivial of compilers avoid building an AST; V8 may not use an AST for interpretation, but interpretation/compilation will be downstream of the AST and could use info from the AST for optimization heuristics. A non-IDE AST would not normally include comments.
I remember thinking it was odd when I first read about it. It must have been in relation to one of the other compilers and I got my wires crossed, but I can't find the source now. Thanks for the correction anyway.
Almost certainly not, if only because it's another layer of indirection when debugging, and for this use you're not limited by distribution size.
There is a potential benefit if the minifier can apply some performance optimisations, but one would hope V8 et al. are already doing most of these and more.
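For what it's worth, here's a rough illustration of the kind of transformation a minifier's compress pass can make (the output is hypothetical; the exact result depends on the tool and its settings), and it's the sort of thing the engine's optimiser would usually catch anyway:

    // Before: a dead branch and a constant local that can be folded in.
    function area(r) {
      var pi = 3.14159;
      if (false) {
        console.log('debugging area()');
      }
      return pi * r * r;
    }

    // After (hypothetical compressed output): dead code removed, the
    // constant propagated, whitespace and the local variable gone.
    function area(r){return 3.14159*r*r}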
Even for client-side applications, HTTP/2 + WebAssembly will eradicate concatenating and minifying JS files soon.
OP here. Agree there probably isn't much benefit to minifying server-side code. However, I wouldn't be surprised if things like Closure Compiler were useful server-side.
Not convinced that HTTP/2 will eradicate minifiers; it makes bundling files less useful, but minifying still gets rid of bytes. Then again, I'm not a web performance expert. :)
This may sound entirely wacky, but I have seen it make a major performance impact on Node.js.
The reason is that V8 uses heuristics to decide which functions get inlined, and the raw source code length is one of the heuristics. Making the source for a function shorter may cause V8 to inline it more aggressively.
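A quick way to see why minification interacts with that heuristic: the source length V8 considers is essentially what Function.prototype.toString returns, comments and whitespace included. (The functions below are just made-up illustrations.)

    // The verbose version carries a comment, long names and whitespace...
    function distance(x1, y1, x2, y2) {
      // Euclidean distance between two points
      var dx = x2 - x1;
      var dy = y2 - y1;
      return Math.sqrt(dx * dx + dy * dy);
    }

    // ...while a minified equivalent is a fraction of the length.
    var distanceMin = function(a,b,c,d){return Math.sqrt((c-a)*(c-a)+(d-b)*(d-b))};

    console.log(distance.toString().length);     // well over 100 characters
    console.log(distanceMin.toString().length);  // far shorter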
Even if you don't use a minifier on the server, a library built with any kind of transpiler (Babel, CoffeeScript, TypeScript, ...) could be susceptible to this kind of attack.
That's correct. I did not discover vulnerabilities in existing libraries or add backdoors to any of them. :)
The attack scenario described in the post is (1) attacker writes some plausible-looking patches to an existing library like jQuery, (2) attacker convinces library maintainer to merge the patches, (3) someone builds the library with a buggy minifier, which creates the actual backdoor.
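To make step (3) concrete, here's a hypothetical example (not the actual bug from the post) of how a buggy boolean-simplification pass could weaken a check even though the unminified source looks perfectly reasonable:

    // Hypothetical example data and helper, made up for illustration.
    var user = { blacklisted: true, expired: false };
    function grantAccess(u) { console.log('access granted to', u); }

    // Source the maintainer reviews and merges: only grant access when the
    // user is neither blacklisted nor expired.
    if (!user.blacklisted && !user.expired) {
      grantAccess(user);
    }

    // Correct minification (De Morgan): !a && !b is equivalent to !(a || b),
    // so this behaves identically and grants nothing for this user.
    if(!(user.blacklisted||user.expired))grantAccess(user);

    // Hypothetical buggy minification: !(a && b) grants access whenever at
    // least one flag is false, so this blacklisted-but-unexpired user gets in.
    if(!(user.blacklisted&&user.expired))grantAccess(user);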
This makes me think that there could be similar bugs in the browser, when it JIT-compiles or optimizes JavaScript code. That could be used to take control of the whole browser/OS if used in an add-on/extension (given that it has sufficient privileges).
There have been similar bugs in browser JITs that allow websites to escape the sandbox - usually incorrect optimisations that cause the JIT to elide bounds checks when it can't safely do so, probably since those are the easiest to exploit.
There are actually a fair number of security researchers looking at browser-level exploits like these in JITs (cf. Pwn2Own). My colleague Chris Rohlf co-authored a cool study on attacking JITs: https://www.nccgroup.trust/us/about-us/resources/jit/
Is there any benefit to it?