Native equivalents of jQuery functions (leebrimelow.com)
149 points by ingve on May 20, 2013 | hide | past | favorite | 93 comments



"But you should always choose to use native DOM methods if they are available to you."

...when you're writing code that has to run at 60fps. The rest of the time, you categorically should always be using jQuery.

The native equivalent is very likely to be what jQuery is using under the bonnet, so jQuery is essentially just a wrapper most of the time. The performance hit from that wrapper is pretty close to zero. It's certainly nothing to worry about unless you're writing a game. The difference comes when jQuery isn't just a wrapper - likely for things where there are browser differences or performance improvements by doing things a different way. The majority of developers don't need to concern themselves with that sort of thing. We can just use jQuery and leave the performance 'hacks' to the jQuery team.

In addition to that, mixing native and jQuery code makes things trickier to manage, especially if you manage a team of developers who don't all work the same way all the time. "Just use jQuery" is a good rule of thumb.

And lastly, jQuery has a vast pool of talent keeping up with browser tech better than you. Even the most avid follower of JS updates can't compete with a team of the calibre jQuery draws upon. When a new method gets rolled into the core of jQuery, something that speeds up selectors for example, your code improves without you doing any work. That's a massive benefit.

(How much of a jQuery fanboy am I?!)


jQuery has a considerable performance cost for the abstractions it provides. It doesn't just call the native methods under the hood; it has to provide additional abstractions like chaining by wrapping every returned object in a jQuery object. In a lot of cases these abstractions aren't helpful, because you need to understand exactly what jQuery is doing. If you compare the cost of native methods to what jQuery does, you easily add 500 lines of javascript code on every method call.
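To make that cost concrete, here is a toy sketch (not jQuery's actual code) of what a chaining wrapper has to do: every chained call allocates a fresh wrapper object around the result before the next link in the chain can run, on top of the underlying native method.

```javascript
// Toy chainable wrapper (NOT jQuery's real implementation) illustrating
// the overhead: each chained call pays for a new allocation plus method
// dispatch, in addition to the native Array.prototype.map underneath.
function Wrap(items) {
  this.items = items;
}
Wrap.prototype.map = function (fn) {
  return new Wrap(this.items.map(fn)); // fresh allocation per call
};
Wrap.prototype.get = function () {
  return this.items;
};

var result = new Wrap([1, 2, 3])
  .map(function (n) { return n * 2; })
  .map(function (n) { return n + 1; })
  .get();
// result is [3, 5, 7]
```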

Additionally, jQuery may not provide the abstractions you need or want. If you are using a high-level framework that manages templating, I would argue that you should use the abstractions designed by the framework and not the one-size-fits-all ones that jQuery provides. There is a lot of functionality that jQuery provides that may not be relevant at all for you. jQuery animations are one example.

Using jQuery doesn't stop you from running into browser bugs. jQuery could help you out with basic things like selection and event handling but modern browsers are already much improved in this area.

I very much dislike the spaghetti code that over-reliance on jQuery plugins results in. You have pieces of code that rely entirely on jQuery and can't be refactored out or reasoned about clearly. It is terribly unidiomatic in any case.

EDIT: That being said jQuery does have its place.


> jQuery has a considerable performance cost for the abstractions it provides ... add 500 lines of javascript code on every method call

But how much does this really matter, with modern high-performance JS engines? The simple (and somewhat dismissive, I'll grant you) answer is: of course it comes at a performance cost, and it may be considerable. Comparatively. After implementing using jQuery, if some component is too slow, optimize. Rewrite using the underlying core JS if you have to. Or better yet, reexamine what you are attempting and see if there's a better way.

> I very much dislike the spaghetti code that over-reliance on jQuery plugins results in. You have pieces of code that rely entirely on jQuery and can't be refactored out or reasoned about clearly. It is terribly unidiomatic in any case.

Can you provide some examples of what you mean by this? Especially about it being unidiomatic (programming language idioms are a topic dear to my heart :)?

Personally, I find jQuery allows me to structure my javascript in a more coherent way, but that could have been greatly influenced by my relative experience with javascript before starting to use jQuery.


Search any jQuery plugin: many insert DOM nodes and listeners that are barely configurable, and most of them don't follow simple rules: http://coding.smashingmagazine.com/2011/10/11/essential-jque...


I'm with you on that argument, and here is some code to prove it: a little script that orders all divs based on their size. The jQuery implementation freezes the page for a few seconds while the native one does not (Chrome 26, i7/2.67GHz, W7 64-bit):

    $("div").sort(function(a, b){
        return $(a).width() * $(a).height() - ($(b).width() * $(b).height())
    });

    [].slice.call(document.getElementsByTagName('div')).sort(function(a, b){
        return a.clientWidth * a.clientHeight - (b.clientWidth * b.clientHeight)
    });


Since those two things aren't really the same, I'd say this is a terrible example (a strawman even). If you want the clientWidth, use the clientWidth. jQuery works in tandem with the native DOM, and is not meant to be a wholesale replacement:

    $("div").sort(function(a, b) {
      return a.clientWidth * a.clientHeight - (b.clientWidth * b.clientHeight);
    });
This is slower than the pure native version, as expected, but comparable. http://jsperf.com/d97b341f-cfc1-4057-bdc9-60e80adb5cf6/4


In that context any kind of comparison would be a strawman unless it is _your_ way of mixing jQuery with native code; maybe this will become clearer with different examples:

#Ex. 1

    $("div").map(function(){ return $(this).css('background-image'); })
"If you just want the background-image just use the get-computed-style-background-image!"

    [].slice.call(document.getElementsByTagName('div')).map(function(a){
        return getComputedStyle(a)["background-image"];
    });
#Ex. 2

    $("div").sort(function(a, b){
    	return $(a).find('*').length - $(b).find('*').length;
    });
"If you just want an algorithm that sorts the ones with the biggest amount of children and grandchildren just use an algorithm that sorts the ones with the biggest amount of children using domElement.getElementsByTagName"

    [].slice.call(document.getElementsByTagName('div')).sort(function(a, b){
        return a.getElementsByTagName('*').length - b.getElementsByTagName('*').length;
    });

>And is not meant to be a wholesale replacement

I never said any such thing either; now that is 100% a strawman.


Are there speed gains to be had by wrapping the a,b elements in the jQuery example only once instead of twice?

  $("div").sort(function(a, b) {
       var $a = $(a);
       var $b = $(b);
       return $a.width() * $a.height() - ($b.width() * $b.height());
  });



There is however a significant performance improvement to be had by optimising the inner loop, which is what any good programmer would do first here:

http://jsperf.com/d97b341f-cfc1-4057-bdc9-60e80adb5cf6/2


This optimization is actually replacing jQuery in the inner loop with native calls. It was my impression that the OP suggested that this could be performed more efficiently using jQuery.


Written sanely, it's only marginally slower; it's so close that it's inconsequential for 99% of uses.


You still have to download 81 KB of code, whereas native is just that, native ;-) It's a trade-off. It depends on which browsers your target customers use.


This isn't entirely fair, as clientWidth isn't really the same thing as $().width().


Can't you just use jQuery.fn.sort if you don't want the chaining and wrapper overhead that the parent mentions? And if you do, won't you benefit from native methods or polyfills with the same API?

Here is what jQuery.fn.sort looks like in the console of Chrome. It appears to be referencing the native sort:

    >jQuery.fn.sort.toString()
    "function sort() { [native code] }"


It is referencing the native Array.prototype.sort; I just wanted to show a native-only way.

Yes, you can use other native methods and polyfills with the same API, for example here is a jQuery plugin for reversing elements:

    jQuery.fn.reverse = [].reverse;
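The reason this works is that the ES5 array methods are generic: they only require a length property and indexed elements, which is exactly what a jQuery object provides. A quick illustration with a plain array-like object standing in for a jQuery object:

```javascript
// ES5 array methods are generic: they only need `length` and indexed
// properties, which is why [].reverse can be grafted onto jQuery
// objects. Here a plain array-like object stands in for one.
var arrayLike = { 0: 'a', 1: 'b', 2: 'c', length: 3 };

var reversed = [].slice.call(arrayLike).reverse();
var joined = [].join.call(arrayLike, '-');
// reversed is ['c', 'b', 'a'], joined is 'a-b-c'
```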


This. It's very fashionable to be "NoJQuery" these days, but I just don't have the time to worry that there may be a particular quirk for getting, say, the selected value of a dropdown list for a particular version of Safari during a full moon.


Not only games. Animations, dragging, building DOM: all are critical situations where every line counts, all benefit from close-to-the-metal code, and every ms adds up.


> every ms adds up

Especially when any bloat is multiplied by however many visitors come to your site. Presenting an efficient front-end to your users is the polite thing to do, as it saves them both compute time and personal time. A lot of efficiency can be gained with tricks like removing dependencies, minification and gzip compression, and ahead-of-time compiling.


>Especially when any bloat is multiplied by however many visitors come to your site

Why would you multiply those?


Because number of milliseconds wasted on each visit * number of visits = total number of milliseconds wasted.


Yes, but no one person is experiencing that, so it doesn't really make a difference for user experience.


> The performance hit from that wrapper is pretty close to zero

Unfortunately, that's rarely the case in my experience. But I'd be interested in any hard numbers you have here.

https://news.ycombinator.com/item?id=2261211 and responses has some hard data from about two years ago, but it's possible that jQuery has improved a lot in the meantime, of course.

So _if_ you're at a point where you care about the performance of your DOM code, jQuery can end up as a significant bottleneck. Most people are not at that point, most of the time. But most people _are_ at that point some of the time. The trick is recognizing when they are and what to do then.


I disagree. Use JavaScript, since that's what you're supposed to be doing; using jQuery for everything is stupid.


Would you also suggest people not use frameworks, since using the core language is what they are supposed to be doing?

Additionally, I guess an ORM or query builder of any sort isn't worth it either.

I suspect for a large percentage of sites JS on the client may be faster than the language the site was implemented in (even at a pure language level, ignoring that you offload computing very efficiently).

Why is using abstractions that we all (well, many, if not most) believe to be of benefit on the server any different when run on the client, especially when it's more efficient (compared to server) and scales better?

Sure, if someone thinks they have a need for raw javascript performance but they haven't tested and confirmed this, maybe blind assumption isn't the best way to proceed.


Author makes some points that are fine, I guess. A dev needs to carefully consider his audience and his fellow developers. If I can use jQuery and know that I'm not going to have cross-browser DOM API issues, then let me have it. It's not entirely fair to say that `$(".my-class li:first-child")` is equal to `document.querySelectorAll(".my-class li:first-child")`, because that method simply doesn't work in older browsers (<IE8 http://caniuse.com/#search=queryselectorall), or that `$('.my-class')` is equivalent to `document.getElementsByClassName('my-class')` (doesn't work in <IE9 http://caniuse.com/#search=getelementsbyclassname).

If you're building games or working with a team who does, I'm going to guess you're working with fairly cutting-edge functionality in browsers where JavaScript support is great. If you're building sites for clients who are still (sigh) using IE7 internally, you just can't get away with something quite so simple.

As another commenter mentioned: "Even the most avid follower of JS updates can't compete with a team of the calibre jQuery draws upon. When a new method gets rolled into the core of jQuery, something that speeds up selectors for example, your code improves without you doing any work."

This is the same reason I use a library for preprocessing my CSS. I'd rather update a gem and trust a team whose focus is processing CSS than try to rely on myself for keeping up to date with the pace of the CSS WG and browser vendors.

Also, for anyone wanting to dig around the jQuery source, James Padolsey made a great tool for it. I know I was pretty amazed to find out just how much is going on under the hood. http://james.padolsey.com/jquery/#v=git&fn=


Your argument about compatibility with old browsers is great, but sadly the jQuery team doesn't actually believe that the problem they are solving should include compatibility with those versions of IE: the latest versions of jQuery have decided that IE9 is the oldest version they will support going forward, as they didn't want to deal with the overhead (both in code and in development effort) of being a library designed to solve the problem of cross-browser JavaScript development. In a world where jQuery is just aiming for "improved syntax for DOM manipulation, compatible only with relatively recent browsers" (something I will argue has always been the case, due to their rather early deprecation of Safari 1.x), those examples cannot be dismissed: jQuery is just being used to avoid the complexity and verbosity of native APIs.


That's not true. There are two latest versions, one of which supports older browsers.


It had sounded from the 2.0 announcement that they are only supporting 1.x for purposes of back-porting bug fixes. If the goal is to do some kind of long-term commitment to the mission of cross-browser compatibility, then forking off "2.0" seems pointless and even harmful. To verify, however: if this article then specifically was comparing jQuery 2.0 to native DOM, would you submit that iamjared's complaint would not apply? ;P


In fact there's even a 1.10 beta out now, so they are not dropping support for IE6/7/8 anytime soon. It's true jQuery 2.0 can't be used with old IE, but if that's what you need to support use the 1.x branch.


What I don't understand is: why did they have to keep <IE9 support in version 1.9 AND break compatibility by removing a bunch of functions, as well as create a new version strand, 2.0, that drops support for <IE9? Why not just break the API in version 2.0 only?


They've specifically mentioned a few times being able to detect browser version and load the appropriate version of jQuery. The 2.0 branch is supposed to get cleaner, faster code by virtue of dropping old workarounds (I imagine in some instances this may be quite a difference, as they may be able to refactor entire code paths if they can make new assumptions).

As such, they need to keep the API standard between versions. Unfortunately, they don't appear to have settled on the best API yet, so are constantly working to improve it.

I doubt the functions you see removed[1] in 1.9 are anything specific to 1.9 being dual-released with 2.0; more likely they were simply at the end of their deprecation cycle.

[1]: http://api.jquery.com/category/deprecated/


Question: if jQuery 2.0 only cares about modern browsers, can't they drastically improve performance by using these equivalent native functions behind the scenes?


They do all over the place. Still, there's an abstraction layer to work with in detecting these host methods and in providing a backwards-compatible implementation (polyfill).


What I'm curious about is: if these are equivalent, why doesn't jQuery just delegate to them in browsers where they are supported? Then you get both the speed and the compatibility.


In the case of querySelectorAll, it does[1] (after performing some checks for easy wins) and if this test[2] is anything to go by, it can actually perform better in some cases.

[1] https://github.com/jquery/sizzle/blob/master/sizzle.js#L237

[2] http://jsperf.com/yui3-vs-jquery-selector-test-3/54


Personally I use jQuery Deconstructed when examining what's under the hood: http://www.keyframesandcode.com/resources/javascript/deconst...


Nicely done... right up until the end: "In that world if your game doesn’t run at 60 FPS, you might as well go work at Target."

Target made $69 billion last year and employs 365,000 people; I'd say there's no shame in working there.

If the cushy software dev life leaves you unemployed at some point, and you need a working-class job to pay bills while you hunt for something more to your liking, how will you spin your snarky attitude to your interviewer?


Nothing like insulting a third of a million people in one sentence (and really millions more that work jobs even worse than a typical Target job).


Saying that a job is crappy does not constitute insulting the people who have that job.


It was still said in a condescending fashion, implying that if you can't make the 60fps club you should go work a lower-level job.


My job is to create web applications, not to keep track of every little idiosyncrasy in how various browsers have implemented core DOM methods. The jQuery team do a great job of keeping track of that stuff and ensuring that things are consistent between different browsers. If I need something to be absolutely-blazingly-uncompromisingly fast I can use native methods, but for most people's use cases, the performance hit you get with the abstraction is not significant enough to go back to using flint and steel to start a fire, after we invented the windproof lighter that is jQuery (or $FRAMEWORK).


Setting newer CSS properties, especially, is much better with jQuery. You'll miss vendor prefixes for some browsers if you do it on your own.

  $('#foo').css('transform', 'rotate(5deg)');
No need to look up all the needed prefixes.
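For comparison, here is a rough sketch of the chore jQuery's .css() spares you; prefixed is a hypothetical helper, not a jQuery or browser API:

```javascript
// Hypothetical helper (not a real library API): expand one CSS
// declaration into every vendor-prefixed variant by hand.
function prefixed(prop, value) {
  var prefixes = ['-webkit-', '-moz-', '-ms-', '-o-', ''];
  return prefixes.map(function (p) {
    return p + prop + ': ' + value + ';';
  }).join(' ');
}

var css = prefixed('transform', 'rotate(5deg)');
// css contains "-webkit-transform: rotate(5deg);", "-moz-transform: ..."
// and finally the unprefixed "transform: rotate(5deg);"
```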


For the people saying "just use jQuery always, it's not that much of a performance hit": that's just blatantly false; with any complex manipulation you start seeing the difference right away. Many times I have to use both: native selectors for performance, mixed with jQuery (for efficiency).

Also, there are many scenarios where you can safely use things like querySelectorAll: browser extensions, WebGL apps, 2D canvas apps, (native) WebCam apps, WebRTC apps and many more. Furthermore, at this point it's arguable how useful it is to support IE8 and below.

And this article falls very short of what can be achieved with the native implementation. For example, thanks to JS being prototypical, you can borrow almost any method from Array onto your NodeList.

    [].filter.call(document.getElementsByTagName('a'), function(e){
        return e.innerText === 'Brackets';
    });
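The same borrowing works on any array-like object, because the ES5 array methods are generic; in this sketch a plain object stands in for the NodeList so the idea can be seen (and tested) without a DOM:

```javascript
// A plain array-like object standing in for a NodeList: the generic
// Array.prototype.filter only needs `length` and indexed properties.
var fakeNodeList = {
  0: { innerText: 'Brackets' },
  1: { innerText: 'Home' },
  2: { innerText: 'Brackets' },
  length: 3
};

var matches = [].filter.call(fakeNodeList, function (e) {
  return e.innerText === 'Brackets';
});
// matches.length is 2
```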


> Furthermore, at this point is arguable how useful is to support IE8 and below

I'm seeing this a lot, but IE8 is the highest "blue e that goes on the internet" on Windows XP.

It's a brave (consumer-facing) business that can dump 10% of its customers. [1]

[1] http://gs.statcounter.com/#browser_version-ww-monthly-201305...


If you ask them to use another browser, some of them will, so I'd argue it's less than 10%. As a side note, document.querySelectorAll is supported by IE8, but unfortunately it doesn't have Array.prototype.filter or the other basic array methods.


[].filter.call() creates an unnecessary Array instance; you should use Array.prototype.filter.call() instead. Still, such code breaks encapsulation and is plain ugly. It would make much more sense if document.getElementsByTagName() returned an Array instance, or if NodeList inherited from Array.


Technically it's not because JS is prototypical, but because of how ES5 defines Array.prototype.filter: the method is intentionally generic.


> Prototype-based programming is a style of object-oriented programming in which classes are not present, and behavior reuse (from Wikipedia)


I've seen a bunch of these noJQuery posts lately, but they've always focused on selectors. I'm curious what these people use for Ajax. Do they use the (very verbose) native syntax? Or are there any nice lighter weight libraries that separate out Ajax functionality into a readable abstraction like jQuery does?


So far, this has worked perfectly for me:

  function ajax(url, post_parameters, callback, data)
  {
      var http = new XMLHttpRequest();
      if (http != undefined)
      {
          http.open(post_parameters == '' ? 'GET' : 'POST', url, true);
          http.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
          http.onreadystatechange = function()
          {
              if (callback != null)
              {
                  callback(this.readyState, this.status, this.responseText, data);
                  // readyStates: UNSENT = 0, OPENED = 1, HEADERS_RECEIVED = 2, LOADING = 3, DONE = 4
              }
          };
          http.send(post_parameters);
          return true;
      }
      else
      {
          console.log("no ajax..  sorry.");
          return false;
      }
  }
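One small companion sketch: the function above expects post_parameters as an already-encoded string, so something like the following usually sits next to it (encodeParams is a hypothetical helper, not part of any library):

```javascript
// Hypothetical helper: build an application/x-www-form-urlencoded
// string from a plain object, suitable for the post_parameters
// argument of the ajax() function above.
function encodeParams(obj) {
  var pairs = [];
  for (var key in obj) {
    if (Object.prototype.hasOwnProperty.call(obj, key)) {
      pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(obj[key]));
    }
  }
  return pairs.join('&');
}

var qs = encodeParams({ q: 'a b', page: 2 });
// qs is "q=a%20b&page=2"
```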


Doesn't work in IE and doesn't support cross-domain requests. Oh, and yes, I'm sure every user knows to check their javascript console when an app does nothing.


I wonder if that code block does anything at all.

    var http = new XMLHttpRequest();
    if (http != undefined) {
        ...
    }
In what runtime would calling this constructor return undefined (or null, or false...), and yet not throw an error?


You're right, that code will blow up if the browser doesn't support XMLHttpRequest. The version below is more common:

    var ajaxObject;
    if (window.XMLHttpRequest) {
      ...
    } else if (window.ActiveXObject) {
      ...
    }
    
    if (ajaxObject) {
      ...
    } else {
      //handle users without ajax
    }
and that's just to create the object! compare to:

    $.ajax();


and that's just to create the object! compare to: $.ajax();

So you're comparing the body of one function to the signature of another? I know I posted some really shitty code, but you managed to shoot yourself in the foot even in that scenario, grats.

https://github.com/jquery/jquery/blob/master/src/ajax.js

Here, now it's apples to apples.


You realize I'm talking about code I have to write, right? If anything, you've further proven my point, considering that the jQuery implementation also covers other common use cases like failure handling and cross-browser requests, i.e. hours upon hours of googling and/or just copy-pasting the jQuery source into my app, which adds loading time to my site instead of just using a cached version of jQuery that almost everyone already has in their cache in the first place.


When I look at that, I see man-years of work that I don't have to reinvent.


I already said it works perfectly for me. I don't care about IE (the feeling is mutual, I think) and I don't request from other domains. The error message is useless/dumb, sure, but then again I didn't say you should use this, I said I use this. I'll improve it when I have an actual need to do so; how's that for an idea?


Definitely a standard issue popularity backlash. See: PHP. Once something achieves overwhelming scale in tech, the majority turns on it because: dominance != cool.


I use PHP without any shame, I even think MySQL might be just fine for me; but I also like raw JavaScript way too much to ever have bothered to use jQuery even once.

That said, I have nothing against jQuery, but what used to annoy me was seeing Javascript questions being answered with jQuery snippets without anyone blinking, everywhere and all the time... you see, that's actually coming onto "my" turf and wantonly murdering there. That was too much, and if a "backlash" is needed to restore some sanity and balance, I say lash away.


I like the part where someone else is making assumptions about anyone who isn't a jQuery fan, I disagree based on the physical reality of my own existence, and some anonymous coward decides nah, I don't get to speak for myself, the one-sided blanket assumptions someone else pulled out of their ass are really all the "contribution" this "discussion" needs... oh well, thanks for the belly laugh ^^


Are you thanking yourself for giving yourself a belly laugh?


No, whoever grayed my comment out because disagreeing in words would be too much like actual work.



Native APIs are the future, and everyone should learn to use them, but there are still too many gotchas and too much verbosity. Take a look at the code in rye.js[1]; these are the most minimalistic wrappers over native APIs you'll ever get before ES6: https://github.com/ryejs/rye/blob/master/lib/manipulation.js

This is what I expected jQuery 2.0 would attempt, and the shape most browser libraries will take in the not-so-distant future.


Interesting that 'innerHTML' is mentioned as a crappy native equivalent. In theory, I agree. But the last time I compared innerHTML vs DOM fragments, innerHTML was much, much faster across the browsers I tried (this was just before JITs for JS became commonplace). The tradeoff, it seems, is the browser's native HTML parser and serialization vs holding a DOM tree in JS, and for large sets at least, innerHTML won handily.


That has been my experience as well, in fact, I believe the native DocumentFragment code should be slower than the jQuery innerHTML-based code.


It's not crappy because it's slower; it's crappy because it destroys all the eventListeners and custom properties of all the childNodes.


It's crappy because it's not just slow: it manages to freeze Firefox 22 on a 2.8GHz CPU.

http://jsperf.com/leebrimelow-native-methods-jquery-dom


At least we agree it's crappy. BTW sorry for ruining your perfs, I really wanted to test a few things.


Okay. Revision 9 adds code to remove the inserted elements, to put every test on equal footing. Is the practical equivalent yours? Because it hits a sweet spot.


The one called "MUCH MORE PRACTIVAL native equivalent" is a mistake; it doesn't do anything. I didn't know documentFragment lacks the innerHTML property, so it's not doing any parsing like the others.


Ah, that explains it. I can't submit Firefox results, though. The crappy native equivalent just locks up after throwing several "unresponsive script" warnings.


By using innerHTML you lose event bindings of all elements in the container you're appending data to, and the state of inputs.


innerHTML isn't W3C compliant. Why are you using it in the first place?



querySelectorAll is great, assuming you know your users aren't using IE 6 / 7. Cross browser support is really where jQuery shines.

jQuery is starting to become less relevant as newer browsers start to include the same functionality natively, and I say that as a massive fan of jQuery.


Why not use a polyfill to get the same functionality on older browsers? That also gives you the benefits of better performance on browsers that do support the native version.
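For anyone unfamiliar, the usual polyfill pattern is to feature-test first and only define a fallback when the native method is missing, so capable browsers keep their fast native implementation. String.prototype.trim is used below purely as an illustration:

```javascript
// Standard polyfill pattern: define the fallback only if the native
// method is absent; modern engines never touch the slow path.
if (!String.prototype.trim) {
  String.prototype.trim = function () {
    return this.replace(/^\s+|\s+$/g, '');
  };
}

var trimmed = '  hello  '.trim();
// trimmed is "hello"
```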


jQuery uses getElementById, getElementsByClassName etc. when available in the browser. There is only a small overhead in parsing the selector to see if it's e.g. a simple ID or class selector.
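A sketch of that kind of fast-path check (illustrative only, not jQuery's actual parser): classify a selector so that simple ids, classes and tags can be routed to the cheap native lookups, falling back to the full selector engine otherwise.

```javascript
// Illustrative classifier, NOT jQuery's real code: simple selectors
// get the cheap native lookup, everything else falls through.
function classifySelector(sel) {
  if (/^#[\w-]+$/.test(sel)) return 'id';     // document.getElementById
  if (/^\.[\w-]+$/.test(sel)) return 'class'; // getElementsByClassName
  if (/^[\w-]+$/.test(sel)) return 'tag';     // getElementsByTagName
  return 'complex';                           // querySelectorAll / Sizzle
}

classifySelector('#main');     // "id"
classifySelector('.my-class'); // "class"
classifySelector('div > p.x'); // "complex"
```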


Or should create a DOM element. I think the main pitfall of $() is that it tries to be too many things at once. Node creation and selection should be two separate slimmed-down functions. There's no need (other than convenience) for selection and creation to exist in the same function.
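The split could start from a cheap up-front check, similar in spirit to (but much cruder than) what jQuery does internally; looksLikeHTML is a made-up name for illustration:

```javascript
// Made-up helper: decide between "create from HTML snippet" and
// "select by selector", so each role can live in its own function.
function looksLikeHTML(input) {
  return input.length >= 3 &&
         input.charAt(0) === '<' &&
         input.charAt(input.length - 1) === '>';
}

looksLikeHTML('<div>');     // true  -> node creation path
looksLikeHTML('.my-class'); // false -> selection path
```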


And onDocumentReady handling :)

You can do this:

    $(function() { /* handle onDocLoad */ })


Exactly; there are too many "if this, then do that"s in $().


jQuery is a pleasure to use. It's just plain useful. If you need to manipulate the DOM in legacy IE, nothing compares. I'm reading a lot of complaints about the overhead of legacy IE support (presumably from people with better clients than mine...), but I'm surprised no one has mentioned Zepto. I haven't had a chance to use it yet, but isn't it just a stand-in for jQuery with the legacy support stripped out? I'd love to see the benchmarks.


No, that's jQuery 2.0. Also, Zepto is slower.


I'm curious - is there actually much overhead if your browser supports these native methods? Surely jQuery just goes straight to them?

For anyone who knows jQuery internals, does it check for browser compatibility on every call, or does it check once, and then hardwire up the correct methods for the next call?


Right from the source: "Sets document-related variables once based on the current document" [1]. It's a very readable document and the comments give good overview about quirks you have to be aware of.

My personal opinion: not using jQuery (or similar library) for normal webpages is premature optimization and it will hurt more than help.

[1] https://github.com/jquery/sizzle/blob/master/dist/sizzle.js#...


jQuery has lots of feature checks, a long dependency tree, and a half-decade of accumulated bug reports and corner cases. It also implements custom selectors and "fixes" behaviors in querySelectorAll, which often lead to a significant drop in performance. Not that you'll notice unless you're using the DOM for rendering complex graphics/UI :)


False: with any complex manipulation (graphic-less or not) of all the DOM elements in the current page, you notice a significant performance cost, orders of magnitude in some cases.


A lot of the checks are done on initialisation. Take a look at jQuery.support - http://api.jquery.com/jQuery.support/ - it's around line 1300 in http://code.jquery.com/jquery-1.9.1.js


jQuery is much much slower for many selectors, often orders of magnitude slower.


Great article. Newer developers are spoiled by jQuery and don't realize that all it does is make native javascript a bit cleaner. However, one nightmare javascript task that this article overlooks is ajax. Native javascript ajax is nowhere near as simple without jQuery.


Cool article. I'm a big fan of noQuery. One thing you could improve is regarding appending, using insertAdjacentHTML instead of constructing the DOM manually in a document fragment. It's significantly less code and should perform better.


Nice point. I wasn't aware that using the native selectors could have a major performance impact. I will try to measure it and see. Thanks!



