
I wonder how many websites' users would be compromised if code.jquery.com got hacked.



I've been saying this for years. It's not even just security: you're also potentially leaking all your visitor stats to a third party (IP, user agent, the pages they visit (via the Referer)), and effectively giving jquery.com a third party supercookie over thousands of domains.
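Concretely: every page view on a site that hotlinks the script makes your browser send the CDN something like this (page URL and cookie name hypothetical, version just an example):

    GET /jquery-2.1.4.min.js HTTP/1.1
    Host: code.jquery.com
    Referer: https://some-shop.example/account/billing
    User-Agent: Mozilla/5.0 (X11; Linux x86_64) ...
    Cookie: uid=...  (whatever code.jquery.com set earlier, i.e. the supercookie)

That's the visitor's IP, browser, and the exact page they're on, delivered to a third party on every load.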

I surf with third-party cookies and referers (via RefControl) disabled, and these should be the defaults.


I've been called an idiot (even here on HN) for being paranoid about loading scripts from all over the web. I think it's a losing battle and my side is going the way of the dinosaurs.


As tools like uBlock and uMatrix get widespread adoption more and more people are realizing how much extraneous junk is being loaded by webpages. I think your side is slowly but surely gaining followers.


I use uMatrix, and I have already whitelisted loading scripts from common CDNs as global rules. They're everywhere and I found myself just constantly whitelisting them anyway.
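For reference, global rules in uMatrix are just lines in the "My rules" pane; mine look something like this (hostnames being whichever CDNs you keep running into):

    * ajax.googleapis.com script allow
    * cdnjs.cloudflare.com script allow
    * code.jquery.com script allow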

The trend for a long time was to cite using a CDN as a best practice, but no one ever calls out the downsides when making such statements. In this case, you lose control of the code and allow third-party access to your users' browsers.


To be fair, it has some positives, like said CDN being able to patch somelib.js to fix a security issue and thereby protect thousands of sites at once.

At the moment, though, proposed solutions to trusting third parties with your Javascript, like the W3C proposal to put cryptographic hashes into your <script> tags, don't even consider these potential positives. So we're likely going to end up with the worst of both worlds.

If anything, this is just one more facet of the weakness of the web as an app platform. (Real solution: all sites serving client-side libs should use package management, and scripts should be digitally signed by their authors.) As it stands, it's far too common for people to just unzip WordPress or whatever into their docroot, so server-side code doesn't even get updated, let alone client-side code.


Which is an interesting problem: to stop loading all these scripts, I have to give another script access to everything I see.


If you download the source for uBlock/uMatrix and run it locally, you avoid the add-on's auto-updating while still being able to vet the source.
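Roughly (gorhill's repos; the exact directory to load and Chrome's menu wording may differ):

    git clone https://github.com/gorhill/uMatrix.git
    # read through the source to your satisfaction, then:
    # chrome://extensions -> tick "Developer mode" -> "Load unpacked
    # extension..." -> point it at the checkout's extension directory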


Thank you, that's an extremely good idea.


The same could be said about other third-party resources, like images and CSS files, too.


Maybe, but hotlinking images has historically been considered rude at best, and theft at worst. And most people still host their own CSS because it tends to come with whatever app, or theme, they're using.

In any case, the defaults I live with cover all the other web resources too.



> you're also potentially leaking all your visitor stats

Not just potentially but (f)actually!


Could this be illegal? I mean, the EU for example has pretty strict requirements on what you can do with user information.


Which is why they're adding Subresource Integrity to script tags, so you can detect when a popular CDN/host is hacked and refuse to load the script: http://w3c.github.io/webappsec/specs/subresourceintegrity/#h...


How would that help? The "integrity" information is provided by the now-hacked server, and thus can never provide extra security? Or am I missing something?


The integrity value is part of the script tag. You add a hash of the script's content as an attribute on the script tag, and the browser only executes the script if it matches.
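Something like this, going by the draft (the digest below is a placeholder, not the real hash of that file):

    <!-- Compute the digest yourself, e.g.:
         openssl dgst -sha384 -binary jquery-2.1.4.min.js | openssl base64 -A -->
    <script src="https://code.jquery.com/jquery-2.1.4.min.js"
            integrity="sha384-BASE64_DIGEST_GOES_HERE"
            crossorigin="anonymous"></script>

If the CDN (or anyone in the middle) serves different bytes, the hash won't match and the browser refuses to run the script.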

I hacked together something that implements this sort of behavior in a script loader a while ago, if you're interested: https://github.com/ryancdotorg/VerifyJS


Ah I see, thanks for explaining.

So, though this does allow one to safely circumvent the hosting costs associated with bigger third-party scripts, it means giving up some of the advantages, like dynamic updates (as the hash would no longer match), right? It would therefore not work when ad providers want to supply content they fetch dynamically from others, right?


Many of the CDNs will let you reference a specific version of the script. If you didn't do this and there were an update, the script wouldn't load and you'd have to update your site. My script allows callbacks to be specified for a bad hash, so you could be notified of this, and the Subresource Integrity draft also mentions this as a good idea.
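A common belt-and-braces pattern on top of version pinning is to fall back to a self-hosted copy when the CDN copy doesn't execute, whether because it was blocked, the CDN was down, or the hash didn't match (local path and digest hypothetical):

    <script src="https://code.jquery.com/jquery-2.1.4.min.js"
            integrity="sha384-BASE64_DIGEST_GOES_HERE"
            crossorigin="anonymous"></script>
    <script>
      // If the CDN copy didn't execute, window.jQuery is undefined,
      // so load the local copy instead.
      window.jQuery || document.write(
          '<script src="/js/jquery-2.1.4.min.js"><\/script>');
    </script>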

It seems not uncommon for ad networks to dynamically load further scripts/content, which would not be covered by the hash. You can just sandbox them off in an iframe, though.
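e.g. (ad URL hypothetical):

    <!-- Without allow-same-origin the framed content gets a unique
         origin, so its scripts can't touch the embedding page. -->
    <iframe src="https://adnetwork.example/ad.html"
            sandbox="allow-scripts"></iframe>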

An obvious extension would be signed scripts, which would re-enable trusted updates of a script in a CDN, but there is the question of how that would be implemented.


It doesn't have to be literally "hacked"; just a change to the DNS records is enough.


Well, if you hotlink jQuery on your website, at least use the HTTPS link (assuming your site is on HTTPS too).


No need; the browser will block http:// content included from https:// pages. Including it from an http:// page? Then a compromised jQuery CDN is the least of your worries.

In short: no, a compromised DNS record alone is not enough.


If it is an http:// page, then even a small change to the local hosts file is enough.
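One appended line is all it takes (attacker IP hypothetical, from the documentation range):

    # /etc/hosts
    203.0.113.7    code.jquery.com

Every plain-http page that machine loads which embeds jQuery from that host now runs whatever the attacker serves.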



