How make your web-app become available faster: Let go of the DOM Ready events (thanpol.as)
81 points by thanasisp on Jan 10, 2013 | 19 comments



As long as your JavaScript doesn't touch the DOM, there is no need to wait for DOM ready. BUT if you mess up and try and modify the DOM before it's ready, things break pretty horrifically. If your code is loaded async, you open yourself up to ugly race conditions that are tough to debug.
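
For what it's worth, a cheap guard against that race is to check document.readyState before touching anything. A minimal sketch (whenDomReady and the init callback are made-up names):

    // If the parser has already built the DOM, run immediately;
    // otherwise wait for DOMContentLoaded. No library required.
    function whenDomReady(init) {
      if (document.readyState !== 'loading') {
        init();
      } else {
        document.addEventListener('DOMContentLoaded', init, false);
      }
    }

    whenDomReady(function () {
      // safe to touch the DOM here
    });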

Start with the best practices (http://developer.yahoo.com/performance/rules.html). Once you've nailed those, then you can start thinking about optimizations like this.


hi josh,

as one of the people responsible for publishing those rules, I feel obligated to mention that they're somewhat out of date. The more correct rule today, given browser enhancements, is "It depends".

Asynchronously loaded code is obviously better. Stick it into the head, and allow it to download in parallel with the rest of the page. It doesn't need to execute until much later, but it's the downloading and parsing that takes the most time, so get that out of the way as soon as possible.
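
In markup terms, something like this (the path is illustrative):

    <head>
      ...
      <!-- downloads in parallel with the rest of the page and does not
           block parsing; executes as soon as it has arrived -->
      <script async src="/js/app.js"></script>
    </head>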

Within your script, you can check for the existence of the elements you need to manipulate. YUI does this quite well, but if you're using something else, just poll infrequently for the nextSibling of the node you want.
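
Something along these lines (a sketch; the element id and the enhance function are made up):

    (function poll() {
      var el = document.getElementById('signup-form');
      // Once the node *after* the one we care about exists, the node
      // itself has been fully parsed and is safe to manipulate.
      if (el && el.nextSibling) {
        enhance(el);
      } else {
        setTimeout(poll, 50);   // poll infrequently, as suggested
      }
    })();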


Philip, thanks for jumping in and well said. Looking at the state of the web right now where a typical site might have a couple dozen CSS/JS files, I think the rules are still well worth evangelizing. We still have a long way to go in helping people understand the fundamental principles, even if the exact implementations change over time.


An important caveat to the "load scripts at the end of the document" technique: The visual elements of a page will render before the functional components (i.e., JavaScript) are available.

Imagine a site that sets a custom `submit` handler for a `<form>` tag, possibly for client-side form validation, an AJAX file uploader, date selection, etc. A user submits the form before JavaScript has loaded completely and doesn't get the benefit of the JavaScript-enhanced experience. In local development or on a high-speed connection this will probably never happen. But a user accessing the site with their phone over the slow U.S. cell data network has a much higher chance of hitting upon this undesired behavior.


Would it be acceptable to set the form's submit button to disabled in the HTML, attaching the custom event handler and enabling the submit button in a document-final script? Seems like you still get reasonable behavior in the low-bandwidth environment -- maybe even better, since the user can now access the other elements of the form while the script is being loaded and run.
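
Something like this, I suppose (sketch only; the ids and the validateSignup handler are made up):

    <form id="signup" action="/signup" method="post">
      ...
      <button type="submit" disabled>Sign up</button>
    </form>

    <!-- document-final script: attach the handler, then re-enable -->
    <script>
      var form = document.getElementById('signup');
      form.addEventListener('submit', validateSignup, false);
      form.querySelector('button[type=submit]').disabled = false;
    </script>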


That's an accessibility issue for users with JavaScript disabled (or where JavaScript fails to load).

On the other hand, it's also the recommended method to protect your users from a clickjacking attack across all browsers (read: those that don't support X-Frame-Options).


"However when a script is loading, the browser will not start any other downloads, even on different hostnames!"

Orly? I guess he missed the async attribute (first one listed) in the MDN documentation he cited.

"As per the HTTP/1.1 spec browsers can download no more than two components in parallel per hostname."

Fortunately, browsers ignore the spec here and make more than two connections at a time per hostname.

"Faster page rendering, faster time when page becomes usable, faster page loading, better user experience. It’s time to let go of the DOM Ready Event."

In my experience, getting things loading in parallel is one of the best ways to speed up page load times, and a DOM ready event can really work out in your favor when you set the async attribute on a script. Each situation is different: putting all the script elements at the bottom of the body and forgetting about DOM Ready events may work best in many cases, but making blanket statements isn't really helping anyone. I wonder how setting the defer attribute on both scripts and moving them to the head would work out in the author's benchmark.
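
For reference, that variant would look roughly like this (file names are placeholders); deferred scripts download in parallel but execute in document order after parsing finishes, just before DOMContentLoaded:

    <head>
      <script defer src="/js/jquery.js"></script>
      <script defer src="/js/app.js"></script>
    </head>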


...so why don't you try it? You can fork and perform tests...

You are right about what the spec says and what happens in reality as far as "loading" is concerned. Modern browsers can open multiple requests to a webserver. However, "painting" and "rendering" are another area whose significance most developers miss.

Script parsing blocks rendering.

Especially when scripts are included in the HEAD, the delay becomes pretty apparent.

Check those slides out: https://perf-metrics-velocity2012.appspot.com/#16 (move left/right with cursor keys).

The point of the article was to illustrate how DOM Ready is not needed when loading and executing scripts synchronously. I am not convinced that loading scripts asynchronously will result in a faster absolute time until the page is ready to be used by the user (events bound, complex UI that needs JS rendered).

And I am definitely not sold on the concept of polling every 5ms for the existence of elements in the DOM, which is required for async-loaded scripts if they want to be as fast as possible (hoping that they were loaded before the elements are rendered). If I had that much of a burden to manipulate an element right there and then, as soon as it is rendered, I'd plainly add a SCRIPT tag right beneath it.
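
I.e. something like this (the id and the renderChart function are made up):

    <div id="chart"></div>
    <script>
      // Runs as soon as the parser reaches it; #chart is already in
      // the DOM, so no ready event and no polling needed.
      renderChart(document.getElementById('chart'));
    </script>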


So, as a summary: JavaScript (except when loaded via document.write) can immediately access the DOM if it is loaded at the bottom of the body element, making a ready event like jQuery's obsolete. I have always wondered if that really is the case but haven't tried it out myself; seems that it's a green light then. :-)


With one caveat: all your code that deals with positions and dimensions now needs to be smarter and handle the pushing around that comes during the loading of CSS and images. This will probably break the jQuery plugins that you're using if they deal with sizes and dimensions.


This would require some discipline in a shop adopting it. For one thing, one would want a convention where write() is not used, or perhaps is disabled until the operation is safe. Also, this may be a form of optimization that is applied based on profiling: an app would benefit most on the pages that are viewed most frequently and earliest, and ROI would be higher for mature code that is changing less frequently.


I wouldn't ever think of `document.write` as 'safe.' If it happens to be called after the DOM is ready, you'll purge the DOM and blank the page. document.write can be useful for dynamically inlining new <script> tags in a way that older versions of IE are kosher with (which is why it's common in advertising), but in almost every other case, using the W3C DOM methods (or innerHTML) is a better decision.
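
The DOM-method alternative for script injection is straightforward (a sketch; the URL is a placeholder):

    // Safe whether or not the DOM is ready; no risk of blanking the page.
    var s = document.createElement('script');
    s.src = '//example.com/widget.js';
    s.async = true;
    (document.head || document.getElementsByTagName('head')[0]).appendChild(s);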


Like dynamic web apps aren't already flakey enough.


Worth reading in relation to this, including the impact of the CSSOM and using the DOMContentLoaded event: http://calendar.perfplanet.com/2012/deciphering-the-critical...


"Marking scripts with “defer” and “async” makes an implicit promise to the document parser that you will not use doc.write, which in turn allows it to unblock DOM construction."

Would it be reasonable to ask for an opt-in feature to make document.write usage explicitly prevented, perhaps through an exception?
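
There's no built-in switch for that as far as I know, but a crude userland approximation is to overwrite it early in the page (rough sketch):

    // Run this before any third-party code: any later call to
    // document.write will throw instead of silently blanking the page.
    document.write = document.writeln = function () {
      throw new Error('document.write is disabled on this page');
    };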



It should be noted that the event used in that, DOMNodeInserted, is deprecated: http://www.w3.org/TR/DOM-Level-3-Events/#event-type-DOMNodeI...


Ah, good point!


Let go of JavaScript.



