
This article says "Wouldn’t it be nice if we could just make the parser faster? Unfortunately, while JS parsers have improved considerably, we are long past the point of diminishing returns."

I'm gobsmacked that parsing is such a major part of JS startup time, compared to compiling and optimizing the code. Parsing isn't slow! Or at least it shouldn't be. How many MB of JavaScript is Facebook shipping?

Does anyone have a link to some measurements? Time spent parsing versus compilation?



I think the main issue with parsing is that you probably need to parse all the JavaScript before you can start executing any of it, which can add a long delay before any script starts running.

Compiling and optimizing code can be slow too, but JIT compilers don't optimize all the code on a page. At first the code is interpreted, and only hot code paths get JIT-compiled, probably on a background thread. That means compiling/optimizing doesn't really add to page load latency.

But I agree with you that this is a strange suggestion. If parsing is so slow, maybe browsers should cache the parsed representation of JavaScript sources to speed up page loading, or better yet, the bytecode/JIT-generated code.


> If parsing is so slow, maybe browsers should be caching the parsed representation of javascript sources to speed up page loading, or even better: the bytecode/JIT-generated code.

This is addressed in the article: https://yoric.github.io/post/binary-ast-newsletter-1/#improv...


Chrome already caches the compilation result for previously visited pages to bypass the initial parsing/compiling.


So does Firefox, btw.

This is addressed here: https://yoric.github.io/post/binary-ast-newsletter-1/#improv...



