Write some JS to delay loading the adverts until after all of the other content on your page(s) has been loaded and displayed. jQuery (especially) makes it easy to delay this kind of thing and do the <div> injections well after your page is visible in the client's browser.
I coded one two (three?) years ago for a contract, and it worked like a charm. Used it for tracking pixels, Google tracking, and a bunch of other stuff.
Surprised this kind of snippet isn't readily available on the intertubes.
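The basic shape is something like this (a rough sketch, assuming jQuery is already on the page; the container id and script URL are placeholders):

    // Wait until the page has finished loading and rendering, then
    // inject the ad container and pull in the ad network's script.
    $(window).on('load', function () {
      // Placeholder container for the ad, appended wherever it belongs.
      $('<div id="ad-slot"></div>').appendTo('#sidebar');

      // Fetch and run the ad script only after the rest of the page is
      // already visible to the user.
      $.getScript('https://ads.example.com/show_ads.js');
    });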
Excellent! Haven't browsed all the links yet, but I didn't see any examples for injecting anything except trackers. Templates for some of the more common scripts like AddThis, and a few advertising examples, would be nice. Not that it's any harder, but some diversity would certainly help your cause.
Let me know if you'd like a few of my more devious snippets to add to your project, like overloading document.write (surprising how many of those damned advertising scripts still use document.write instead of div injection! :-\).
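The document.write override is roughly this shape (a sketch only; the "late-ads" container id is made up):

    // Replace document.write so that ad scripts which run after the page
    // has finished loading don't trigger an implicit document.open() and
    // wipe the document. Late writes get routed into a placeholder div.
    (function () {
      var originalWrite = document.write;
      document.write = function (markup) {
        if (document.readyState === 'complete') {
          var target = document.getElementById('late-ads');
          if (target) {
            target.innerHTML += markup;
          }
        } else {
          // Page still loading: behave normally.
          originalWrite.call(document, markup);
        }
      };
    })();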
I dimly recall reading something along those lines that came from Amazon, but I think it was a side remark, not a published study.
My suggestion was to plot, for each content/advertiser network, the lag its JavaScript adds against the income the site derives from that network. That's likely to be somewhat sensitive information, so sharing it would be generous.
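If anyone wants to collect the lag half of that plot, here's a rough sketch (assumes a browser that exposes the Resource Timing API):

    // Gather load times for every external script on the page so they can
    // later be grouped by ad/content network and plotted against revenue.
    var scriptTimings = window.performance.getEntriesByType('resource')
      .filter(function (entry) { return entry.initiatorType === 'script'; })
      .map(function (entry) {
        return { url: entry.name, ms: Math.round(entry.duration) };
      });
    console.log(scriptTimings);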
I was really (really) hesitant to begin blocking adverts. I believe in paying for what I use. What finally pushed me over the edge was "good" sites unknowingly distributing malware via adverts.
I wish that Safari on iPad had plugin functionality for things like AdBlock and NoScript. Sometimes web browsing can be excruciatingly slow from all the scripts and ads on websites.
Luckily there is an option to turn off JavaScript, but there needs to be more fine-grained control, like whitelisting.
Safari doesn't have any selective blocking capabilities, but there are always 3rd-party browsers for that. Check out Ghostery for iPad, which is focused on privacy & blocking 3rd-party trackers:
I have a small number of sites blocked in /etc/hosts, but a little searching turned up this hosts.txt file, which seems to be updated pretty regularly. Gonna give it a try.
I'll see how good I am about updating this: there seem to be 2-4 changes most weeks.
Postscript - I've emailed Dan Pollock to see if he thinks this git repo is worthwhile. There is already a version of the file on GitHub at https://gist.github.com/399642, but it isn't being updated and is a year and a half old.
So could I get a little more detail here about why git is important? Are you making an init script to pull the file directly into /etc/hosts at boot (or at an interval)?
Awesome. I saw the original article the other day and thought it'd be great if it were on GitHub (I didn't think to do it myself, as it wasn't my work). I was curious about how often it was updated; seems I've got an answer.
Just out of curiosity, how does one go about figuring out which sites to block and keeping the list up to date and relevant? That seems like a lot of effort.
I added a few things to my hosts file for ad-blocking (doubleclick, etc.), and Chrome has since started behaving oddly: the back button reloads the same page and adds another history entry, etc.
Maybe it's due to JavaScript trying to call things that don't exist? Not sure. Just be aware that hosts blocking can occasionally cause odd page behaviour because of content fetch failures (as opposed to NoScript/AdBlock, which often stop the request from even happening).
Ironically, the site hosting the article itself pops up 13 items in my Ghostery; I think that's the most I've ever seen while browsing. Beats TechCrunch's 12 trackers.
I've seen one AddThis call replace half a dozen individual JavaScript includes/iframes for all the different social networks. Seems like that would generally be an improvement.
Merely mapping googleads.g.doubleclick.net to 127.0.0.1 made a huge difference to my browsing experience.
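For anyone who hasn't edited a hosts file before, that's just one line (IP first, hostname second):

    127.0.0.1  googleads.g.doubleclick.net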
Despite the resources of Google, that server is waay overloaded. I was staring at the browser status line waiting for that server to cough up, so the page could render correctly (or at all).
Supposedly, there are ways to embed DoubleClick ads that don't slow down page rendering. However, most of the sites I visit haven't mastered the technique.
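The usual non-blocking approach is to inject the ad script asynchronously instead of using a plain blocking <script> tag; roughly like this (the URL is a placeholder, not an actual DoubleClick tag):

    // Create the script element programmatically and mark it async so it
    // doesn't block parsing/rendering of the rest of the page.
    var adScript = document.createElement('script');
    adScript.async = true;
    adScript.src = 'https://partner.example.com/show_ads_async.js'; // placeholder
    document.getElementsByTagName('head')[0].appendChild(adScript);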
The article mentions that Google will rank your page lower if your site is "slow," which the article says can happen if you install these slow widgets and trackers. Don't forget that Google isn't running any JavaScript or Flash, or loading images, when it requests your page, so it isn't affected by these trackers. What Google means by "slow" is how fast your server can return an HTML response.
I don't think so. In Webmaster Tools you can see that Google has metrics for "Site performance" [1] and for "Time spent downloading a page (in milliseconds)".
Also, you can see that Google is running JS and loading Flash by looking at the Image Preview that appears when you search.
[1] "This page shows you performance statistics of your site. You can use this information to improve the speed of your site and create a faster experience for your users" http://i.imgur.com/Jfz8O.png
So we're basically both right, but I was wrong in my initial assessment. Their crawler doesn't give them the page speed metrics; they get them from their toolbar data:
Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature.
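You can approximate the same measurement yourself in the browser; here's a sketch using the Navigation Timing API (not how Google collects it - their number comes from Toolbar data):

    // Measure from navigation start to the end of the load event, which
    // roughly matches "clicked the link until the page is displayed".
    window.addEventListener('load', function () {
      // Defer one tick so loadEventEnd has been filled in.
      setTimeout(function () {
        var t = window.performance.timing;
        console.log('Page load time: ' + (t.loadEventEnd - t.navigationStart) + ' ms');
      }, 0);
    });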
Unfortunately, some of the better-paying CPM ad networks have the slowest JavaScript ads the world has ever seen.
We're basically trying to see if removing ads improves our page load speed, and therefore our SERP ranking, enough to somewhat offset the lost income.