This Is Why Your Website Is Slow (technologyreview.com)
73 points by evo_9 on Nov 29, 2011 | 39 comments



We're running an experiment on our site: we've disabled ads entirely, and we've seen an improvement in page load speeds across the board.

Unfortunately, some of the better-paying CPM ad networks have the slowest javascript ads the world has ever seen.

We're basically trying to see if removing ads improves our page load speed, and therefore our SERP ranking, enough to somewhat offset the lost income.


Write some JS to delay loading the adverts until after all of the other content on your page(s) has been loaded and displayed. jQuery (especially) makes it easy to delay this kind of stuff and do the <div> injections well after your page is visible in the client browser.
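
A minimal sketch of what I mean (assuming jQuery is already on the page; the ad URL and the #ad-slot div are placeholders, not a real network's tag):

    $(window).load(function () {
      // Page is fully rendered; only now fetch the ad script, so it
      // can't block anything the reader actually came for.
      var ad = document.createElement('script');
      ad.async = true;
      ad.src = 'https://ads.example.com/show_ad.js'; // placeholder URL
      document.getElementById('ad-slot').appendChild(ad);
    });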

I coded one two (three?) years ago for a contract, and it worked like a charm. Used it for tracking pixels, Google tracking, and a bunch of other stuff.

Surprised this kind of snippet isn't readily available on the intertubes.



Excellent! Haven't browsed all the links yet, but I didn't see any examples for injecting anything except trackers. Templates for some of the more common scripts like AddThis, and a few advertising examples, would be nice. Not that it's any harder, but some diversity would certainly help your cause.

Let me know if you'd like a few of my more devious snippets to add to your project, like overloading document.write (surprising how many of those damned advertising scripts still use document.write instead of div injection! :-\).
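
The core of the overload is small. Rough sketch ('#ad-slot' is a placeholder; note that markup injected via innerHTML won't execute nested <script> tags, which is why the hairy cases need something like writeCapture.js):

    (function () {
      var nativeWrite = document.write;
      document.write = function (html) {
        if (document.readyState === 'complete') {
          // A deferred ad script called document.write after the page
          // loaded; divert it into the ad container instead of letting
          // it clobber the whole document.
          document.getElementById('ad-slot').innerHTML += html;
        } else {
          nativeWrite.call(document, html); // normal behavior during parse
        }
      };
    })();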


Sorry, didn't mean to imply it was my project. It's just one I'm using for a site with a lot of trackers (some clients do go hog-wild on those).


A plot of lag against income would be interesting. Is it the kind of thing you'd be interested in sharing?


I was reading recently that Amazon A/B tests showed that load time increases on the order of milliseconds significantly affected sales.


Here is a good entry point for this research:

http://perspectives.mvdirona.com/2009/10/31/TheCostOfLatency...


I dimly recall reading something along those lines that came from Amazon, but I think it was a side remark, not a published study.

My suggestion was about plotting, for content/advertiser networks, the lag of their JavaScript against the income the site derives from the network. This is likely to be somewhat sensitive information, so sharing it would be generous.


We iFramed all the ads. Problem solved. Ads load on separate pages. We're no longer affected by broken ads.
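
For anyone wondering what that looks like: roughly this, where /ads/banner.html is a placeholder page on your own domain containing nothing but the ad network's tag. A hung ad server can then only stall the frame, never the host page:

    function insertAdFrame(containerId, width, height) {
      var frame = document.createElement('iframe');
      frame.src = '/ads/banner.html'; // placeholder ad-only page
      frame.width = width;
      frame.height = height;
      frame.scrolling = 'no';
      frame.frameBorder = '0';
      document.getElementById(containerId).appendChild(frame);
    }

    insertAdFrame('ad-slot', 728, 90);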


adblock + noscript ftw.

I was really (really) hesitant to begin blocking adverts. I believe in paying for what I use. What finally pushed me over the edge was "good" sites unknowingly distributing malware via adverts.


I wish that Safari on iPad had a plugin functionality for things like adblock and noscript. Sometimes web browsing can be excruciatingly slow from all the scripts and ads on websites.

Luckily there is an option to turn off JavaScript, but there needs to be finer-grained control, like whitelisting.


Safari doesn't have any selective blocking capabilities, but there are always 3rd-party browsers for that. Check out Ghostery for iPad, which is focused on privacy and blocking 3rd-party trackers:

http://purplebox.ghostery.com/?p=1016022066


I have a small number of sites blocked in /etc/hosts but a little searching found this hosts.txt file that seems to be updated pretty regularly. Gonna give it a try.

http://someonewhocares.org/hosts/


Wish this were in a git repository.


Your wish is my command:

https://github.com/chalst/pollockhostsfork

I'll see how good I am about updating this: there seem to be 2-4 changes most weeks.

Postscript - I've emailed Dan Pollock to see if he thinks this git repo is worthwhile. There is already a version of the file on GitHub at https://gist.github.com/399642, but it is not updated and is a year and a half old.


That's awesome.

So could I get a little more detail here about why git is important? Are you making an init script to pull the file directly into /etc/hosts at boot (or at an interval)?


Well, it's a little bit of work.

The point is that several people can maintain their own branches of /etc/hosts and use git to keep them up to date and propagate their additions.

I'd just clobber /etc/hosts from the repo file with cp -f whenever the repo changes; no need for any cron/init automation.


Awesome. I saw the original article the other day and thought it'd be great if it were on GitHub (didn't think to do it myself, as it wasn't my work). I was curious about how often it was updated; seems I've got an answer.

Just out of curiosity, how does one go about figuring out which sites to block, and keeping the list up to date and relevant? That seems like a lot of effort.


I added a few things to my hosts file for ad-blocking (doubleclick, etc.), and Chrome has since started behaving oddly: the back button reloads the same page and adds another history entry, etc.

Maybe it's due to javascript trying to call things that don't exist? Not sure. Just be aware that hosts blocking can cause odd page results on occasion, because of content fetch failures (as opposed to noscript/adblock, which often stop the request from even happening).
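
One cheap mitigation is to guard any in-page calls into a third-party global, so a blocked script fails quietly instead of throwing. Sketch ('adNetwork' is a hypothetical global the blocked script would normally define):

    // If the host is nulled out in /etc/hosts, the script never runs
    // and the global never appears, so we simply skip the call.
    if (window.adNetwork && typeof window.adNetwork.render === 'function') {
      window.adNetwork.render('ad-slot');
    }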


Ironically, the article's own site pops up 13 items in my Ghostery; I think that's the most I've ever seen while browsing. Beats TechCrunch's 12 trackers.




It is ironic, but it is not "ironic", whatever "ironic" means when Zeldman puts quotes around the word.


AddThis, ShareThis, etc. are also popular culprits here.

I'm sure that this is choir-preaching at its finest, but using the non-async AddThis plugin should be avoided at all costs.
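
For anything that doesn't ship an async snippet, you can build one yourself: create the <script> element in JS and mark it async so it downloads in parallel instead of blocking the parser. Generic sketch (the widget URL is a placeholder, not AddThis's actual tag):

    (function () {
      var s = document.createElement('script');
      s.async = true; // download in parallel, execute when ready
      s.src = 'https://widgets.example.com/widget.js'; // placeholder URL
      var first = document.getElementsByTagName('script')[0];
      first.parentNode.insertBefore(s, first);
    })();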


shrug

I've seen one AddThis call replace half a dozen individual JavaScript/iframe embeds for all the different social networks. Seems like that would generally be an improvement.


You paste code from a company with the words "media", "brand", "share" or "ad" in its name, you get what you deserve.


Merely mapping googleads.g.doubleclick.net to 127.0.0.1 made a huge difference to my browsing experience.

Despite the resources of Google, that server is waay overloaded. I was staring at the browser status line waiting for that server to cough up, so the page could render correctly (or at all).

Supposedly, there are ways to embed doubleclick ads that do not slow down page rendering. However, most of the sites I visit haven't mastered the technique.


The article mentions that Google will rank your page lower if your site is "slow," which the article says can happen if you install these slow widgets and trackers. Don't forget that Google is not running any JavaScript or Flash, or loading images, when it requests your page, so its crawler isn't affected by these trackers. What Google means by "slow" is how fast your server can return an HTML response.


I don't think so. In Webmaster Tools you can see that Google has metrics for "Site performance" [1] and for "Time spent downloading a page (in milliseconds)".

Also, you can see that Google is running JS and loading Flash by looking at the Image Preview that appears when you search.

[1] "This page shows you performance statistics of your site. You can use this information to improve the speed of your site and create a faster experience for your users" http://i.imgur.com/Jfz8O.png


So we're basically both right, but I was wrong in my initial assessment. Their crawler doesn't give them the page speed metrics; they get those from their toolbar data:

Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature.

http://www.google.com/support/webmasters/bin/answer.py?answe...
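
For what it's worth, newer browsers expose roughly that same interval to the page itself via the Navigation Timing API. Sketch (support was still patchy at the time):

    window.addEventListener('load', function () {
      var t = window.performance && window.performance.timing;
      if (t) {
        // Time from the user committing the navigation to the load
        // event starting to fire, in milliseconds.
        var ms = t.loadEventStart - t.navigationStart;
        console.log('click-to-loaded: ' + ms + ' ms');
      }
    }, false);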


Give disabling JavaScript a try. Uninstall Flash. Most of the web will still work, and it will be fast and much less annoying.


NoScript and FlashBlock are a more functional approach, I think: load JS and Flash when, and from where, you choose to.


I read the title and immediately heard John Goodman say "This is what happens, Larry. This is why your website is slow, Larry."


I've never used any of those, but the Facebook forum plugin routinely takes > 2 seconds to load!


Ironic that Ghostery pops up to warn me about 16 trackers on the article's page.


Ghostery found 14 on this page itself.


So, how many of these services does HN use? :P

hides


From the speed of the site, I'm guessing none.



