The Verge's web sucks (lmorchard.com)
345 points by mbrubeck on July 24, 2015 | 133 comments



> We keep things like Adblock Plus at arm's length for plausible deniability - but everyone I know uses it.

I wish this had been included in the main body of this post, with some metrics; everyone you know uses it because it makes the web suck so much less.

Loading The Verge's article in a Chrome incognito window: 19.6 MB transferred, finished in 41.9 s, huge ad covering the entire page above the developer console I had open to watch things load.

Same article in a new incognito window with Adblock Plus enabled: 1.5 MB transferred, finished in 16.94s.

Screenshots: https://twitter.com/jbscript/status/624535428620791808


Default-deny is the best answer to bloat -- unfortunately not for everybody, though I did try to make it as straightforward as possible with uBlock Origin[1]. Using default-deny in uBlock Origin/Chromium[2], with the page displaying properly, I get from the Network pane in the dev console:

- 62 requests

- 508 KB transferred

Once you start using default-deny mode, it's difficult to go back to anything less restrictive, as one quickly gets used to how fast pages load -- and to such virtuous "side effects" as foiling most 3rd-party tracking/data mining by default. Once your usual sites are "unbroken" and the ruleset is saved, default-deny becomes less and less of an annoyance over time. For example, I already had the only two rules necessary to unbreak The Verge, so the page appeared correctly on the first visit.
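
For the curious, dynamic filtering rules are just plain-text lines of the form "source destination type action". A default-deny setup plus the kind of noop rules needed to unbreak a site looks roughly like this -- the Verge-related hostnames below are illustrative, not a copy of my saved ruleset:

    * * 3p-script block
    * * 3p-frame block
    theverge.com vox-cdn.com * noop
    theverge.com voxmedia.com * noop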

Another example of ridiculous amount of bloat: https://github.com/gorhill/uBlock/wiki/Tips-and-tricks-water...

* * *

[1] Sorry if this sounds like a plug; I do feel strongly about users reclaiming control over where their computer connects.

[2] using suggested default-deny ruleset at https://github.com/gorhill/uBlock/wiki/Dynamic-filtering:-de...


The difference in page load times before and after implementing uBlock is staggering. Having to use a computer you don't own (or mobile) feels like traveling back to 1999 using 56k dial-up.


Firefox mobile supports extensions like ublock just fine. Use a browser which doesn't suck :)

And if you're using an iPhone: you made the choice that Apple should make all the choices for you. You've made your own bed. Sorry about that.


They already made that choice for iOS9.


What's worse is that there are still plenty of places that haven't actually progressed beyond 1999 using 56k dial-up. A lot of these multi-megabyte JavaScript-laden sites are basically a big "fuck you" to those in developing countries and even rural/underdeveloped portions of developed countries that don't even have so much as DSL, let alone cable broadband or fiber. For them, it's either 56k dialup or paying absurd amounts of money (in a multi-year contract, no less) for a satellite connection (which tends to have significant performance problems due to the higher latency).

Thankfully, this is starting to become less of a problem as cellular networks continue to expand, but there are still quite a few places where cell coverage is spotty at best (including here, in the Sierra Nevada mountains, where the mountains themselves cast all sorts of coverage shadows no matter how many towers get put up).


I have been using NoScript for years, and I selectively whitelist just enough domains that the site loads properly -- typically the actual domain, perhaps a web server & CDN. The raft of ancillary domains I just leave blocked.

Working at a major e-commerce company, I get why they're there. Like many others, I've found the web much more... responsive... without the huge page loads, and I don't like the idea of how far my data travels and how it's shared, cross-referenced, etc.

Great post.


> Sorry if this sounds like a plug; I do feel strongly about users reclaiming control over where their computer connects.

Would you install an extension that, when you click a link to The Verge, pops up a message saying "this site is considered by the community to employ intrusive advertising techniques", to remind you not to connect to it at all?


Totally. The web isn't going to change until there are hard consequences for those that are screwing it up, and without information such as this, most users won't realize what is happening and bad websites will continue to get away with it (and keep all the ill-gotten revenue).


Web of Trust does something similar, but seems to be geared more towards spam and malware. Would be nice to have something to keep track of all the sites that I forget not to go to (e.g. Forbes).


Raymond, how did Chris get to take over the original branch, with yours appearing as a derivative work, when you are the originator? What the hell happened?

I want to switch to uBlock Origin, but I currently use Safari. It uses the least amount of resources on my Mac compared to Chrome and Firefox.


He transferred the repo, then made a fork.


I guess that can be reduced a lot if not only the adverts but also all the tracking scripts are blocked.

I just tested this with uMatrix where I use pretty much the defaults, but allow the two CDNs for theverge.com.

Ended up with 600kB downloaded and 1.37s load time on an empty cache.


I've replaced Adblock with uBlock and uMatrix and couldn't be happier. It seems to have a smaller footprint in the browser and better results at filtering away unwanted requests.

The only thing that would make this perfect is if they would also run on my Android phone.


uBlock Origin at least runs in Firefox on Android; I rarely use Chrome on it so can't speak to that.


I uninstalled Chrome and Facebook and replaced them with FF and Tinfoil. Much better battery life. The only Google apps I have are Play Music, Google, and Translate (I'm rooted).


I see a pattern. I may be revealing a bit too much about my longevity, but I've always hated two things about 'reading' a magazine: the little subscription tear-offs that fall out and litter every floor, and that you can't actually read an article without experiencing some form of ADD, flipping through pages of ads you don't care about to get to the next paragraph.

Sound familiar? I just described most web sites by recalling reading a 20th century vintage print magazine. That should sadden everyone. We have a triumph of human progress in not just the Internet, but the Web that rides above it. The promise is a more intelligent and enlightened interaction between content provider and content receiver, perhaps more like a conversation than the print-radio-TV one-way street. Instead, it's not really much different at all, except with more ads and popups.


Nilay Patel has the audacity to say that the mobile web sucks[0]. No. It's just your website that sucks, mate.

[0] - https://www.theverge.com/2015/7/20/9002721/the-mobile-web-su...


He blames mobile browsers for at least half the article (I had to stop reading). Sure, mobile browsers may not be great, but web sites these days seem to be written by people who just don't give a shit. Their hipster attitude is ruining people's experience.


I agree. I've wanted to write my own blog post about the things that suck about the mobile web. It's not the device or the software (typically); it's the way websites are designed (or not) for phones that sucks. You get advertisements that hijack all your screen real estate and open up the Android Market out of nowhere (why do they have this much privilege? That shouldn't happen on the desktop without asking for permission). There's also the hell of being redirected to a website's "mobile friendly" site, only they redirect you to the home page, not the mobile version of the content you were trying to get to (seriously?). I mean, there's so much. I usually have to ask for the desktop version of websites, and this is ignoring JavaScript that jacks up your resources and Flash-based content. Try triggering OnMouseOver on mobile as well. Lots of warts to browsing the internet on a mobile phone.


> open up the Android Market out of nowhere (why do they have this much privilege? That shouldn't happen on the desktop without asking for permission).

This is based on how Android browsers handle URLs to the Play Store. Some browsers (like Firefox) don't do this automatically (at least for me); instead, they'll just present you with the web page version of the Play Store (as if you were on a desktop), but with an icon in the address bar indicating that you can open the URL in the Play Store instead.

But yeah. The mobile web is a hellhole. My "favorite" is when a site will throw a browser dialog asking to install the app version of their site.

I feel like web developers nowadays are treating XKCDs #1174 [0] and #869 [1] as sincere, genuine advice instead of the scathing observations and condemnations of mobile web behavior that they are.

[0]: https://xkcd.com/1174/

[1]: https://xkcd.com/869/


> My "favorite" is when a site will throw a browser dialog asking to install the app version of their site.

Yes! If I wanted the app I wouldn't be on a browser!


I hate how helpless I feel when using a phone. This website wants to open the App Store? Guess I have to let it. I was hoping this was at least a little better on the Android side of things, but maybe not. Sigh.


Mobile browsers are fine. The problem is that the A9 processors from a generation ago (and current low-end phones) have similar processing power to an old Pentium 4 processor (with less bandwidth and RAM). Put the same website behind something running an A57 and watch those intolerable numbers become reasonably acceptable for a >3W processor. Don't blame the browser, blame ARM.

Mobile processors are slow because Moore's Law is grinding to a halt. I think we should blame the process engineers, but the process engineers are being halted by the size of atoms and weird quantum mechanics issues.

Nilay, don't blame browsers, blame the universe.

(or alternately, blame your inefficient website).


It's not a question of CPU power, it's shittily designed websites that push everything to the browser.


I wrote an angry rant on the Verge forums in response to his article — http://www.theverge.com/2015/7/20/9003677/re-the-mobile-web-.... Perhaps the language was too pointed (the thread got shut down), but I just couldn’t get over his arrogance — “But we can't fix the performance of Mobile Safari” — when the performance problem is PROVABLY the publisher’s fault.


No, it isn't. The mobile web does suck, and I say this as someone who works making mobile web sites every day.

While Nilay's article is a bit hit and miss, it still seems pretty undeniable to me that the mobile web experience is inferior to native apps.


As someone who builds mobile sites every day, I disagree: the mobile web does not suck. It's not fair to blame the mobile web when someone attempts to make it work in a way it wasn't intended, as a native app replacement. It's not a tool's fault if someone insists on using it incorrectly. A hammer isn't bad at hammering nails just because a wrench is the better choice for removing bolts.


Yes, the mobile web is inferior, but the point is that websites have done the legwork to make it so. Facebook's Instant Articles are basically a web page where Facebook forbids bloat. In fact, the publishers deliver them to Facebook as HTML, and I suspect Facebook renders them in a webview. So why not just show the exact same content on their mobile web page, instead of something with ten times the bloat? The bloat has become gratuitous. Just look at Facebook's page where they explain Instant Articles, itself a fine ironic demonstration of gratuitous bloat.

I don't think Mobile Safari can be blamed. When the iPhone was first released, people thought Safari on it was amazing. Today's Safari is a much better browser, running on vastly superior hardware, but the browsing experience is inferior. Who else is to blame but the content providers?


Inferior in what ways, and for what purposes?

For the most part, I haven't seen a technical superiority of native over web for most of what folks want to do. It seems more a practical superiority, because we haven't been allowed to / found a need to crap up the native experience quite as much.

App stores & in-app purchases go a long way toward obviating the need for so much cruft. Too much JS framework cruft is a self-inflicted injury. These are addressable issues, not inherent to the web.


Wow. From the article, an example post on the Verge "downloaded 12MB - a little over 7MB in that is JavaScript" and a refresh of the page with things cached still downloaded 8MB again.

And key here -- the article HTML content itself was 75k, the rest is ad network Javascript. (apparently over 20 different companies)

I don't like tracking scripts either but why can't the ad networks get together and create a shared script instead of so many that seem to be redundant?


This is complicated because of how ad networks work and all the layers involved.

We have somewhat of a standard called RTB (real-time bidding), where an exchange sees each impression, runs an auction in 100 ms for each ad slot, then takes the winning bid and renders it on the page. That was easy and efficient when it was just an image. Now ad formats have become more "engaging" and require JavaScript tags to run (like those expanding ads). So now the exchange tag won't just load an image but will load more JS tags to render the ad.

Then we have DMPs (data management platforms) that are solely about measuring your audience. Each one is an isolated silo of data, and they don't share because they all have their own algorithms and backend connections with offline data sources to provide value. Most publishers and networks, however, have to work with all of them, since ad buyers want access to everything or they won't spend.

Then we have the new 3rd layer of vendors who do ad-fraud detection and viewability measurement (checking whether an ad is actually visible in the browser window), which is yet more tags to make sure the original ads are actually rendered and worth something. Then there are some ad networks that don't work through RTB exchanges and need their own tags on the page.

An additional tag isn't the problem, it's the fact that these networks are designed to just offload as much as possible to yet more providers which means a single 5kb tag can balloon into dozens of requests which all load their own crappy frameworks to do the same thing. 10 small tags on a page are fine. 10 small tags that load 25 tags each is what's killing these sites.
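
To make that fan-out concrete, here's a toy sketch of roughly what happens once a single exchange tag lands on a page -- every hostname and file name below is invented:

    // Toy sketch only: how one small exchange tag fans out into many requests.
    // All hostnames are made up for illustration.
    function loadScript(src: string): void {
      const s = document.createElement("script");
      s.async = true;
      s.src = src;
      document.head.appendChild(s);
    }

    // 1. The page embeds a single exchange tag...
    loadScript("https://exchange.example/tag.js");

    // 2. ...which, after the ~100 ms auction, injects the winning bidder's
    //    render tag (flattened here for readability)...
    loadScript("https://winning-bidder.example/render.js");

    // 3. ...which pulls in its own framework plus measurement/verification vendors.
    [
      "https://cdn.bidder.example/jquery.js",
      "https://dmp-one.example/audience.js",
      "https://viewability.example/verify.js",
      "https://fraud-check.example/beacon.js",
    ].forEach(loadScript);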

Disclaimer: I've built ad networks and run my own now.

EDIT: I wanted to add that part of what I'm pushing for in the industry is more regulation not only with business practices and data but also technical certification. Something to prove that an ad network is actually designed well and won't be a burden on sites/users. Then buyers can just trust that certification without asking for all these 3rd party vendor verification tags that slow down sites and increase cost. Any ideas for making this happen would be appreciated.


You might attack it from a non-technical perspective, where the "Decent Ads" certification says something about how their business practices are set up with SLAs or policies for who is at fault if an advertisement contains malware.

Alternately, a technical benchmarking-only system, where you try to identify how much their contribution is to page-load without respect to security/etc.


Yes the IAB (Internet Advertising Bureau) and MRC (media ratings council) and 4A's (American Association of Advertising Agencies) all have various "standards" and certifications. There are similar groups in Europe and smaller orgs that focus on niche areas.

Unfortunately for the most part these groups all charge lots of money for a nice seal you can put on your site, without any real checking of a business. Some of the most offending and insecure networks are members of the IAB. [1] Not to mention this has no bearing on actual technical implementations. The MRC does audits of companies to certify them for measuring viewability but there can be 50% or more discrepancy[2] between vendors which ultimately makes the entire thing pointless. And buyers also don't care, they have their own "favorite" vendors they want to work with so they're not even interchangeable.

The Adblock Plus company in Germany (while I don't agree with their entire stance) has an "acceptable ads" standard that at least lays out something in terms of user experience. However, they accept money for whitelisting, which is why all those crummy clickbait widgets like Outbrain/Taboola still show up through the default ABP filters.

What I want to see is actual legal regulation that covers how a business deals with data/privacy and then some kind of technical certification led by a real tech-savvy company to vet ad networks and their implementations. However this industry is full of political motives and other shady dealings so it takes a herculean effort to do anything... Maybe with some consumer backing there might be changes given the wide usage of adblock today but (not) surprisingly most digital ad execs see it as some kind of plague rather than the direct consequences of industry actions. That's not a very cooperative attitude for enacting change.

[1] http://www.iab.net/member_center/1521/1534 [2] http://adexchanger.com/data-driven-thinking/viewability-lets...


> So now the exchange tag won't just load an image but will load more JS tags to render the ad.

Not to mention that more and more of this JavaScript is spent calculating whether an ad is being viewed or not and phoning that data back home. Ad providers do this in different ways, and it's not uncommon to see them send down a full jQuery build with their ad.
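
Roughly what those viewability tags boil down to -- the endpoint here is invented, and real vendors also track duration, occlusion, and so on:

    // Estimate what fraction of the ad element is inside the viewport,
    // then phone the measurement home.
    function visibleFraction(el: HTMLElement): number {
      const r = el.getBoundingClientRect();
      const h = Math.min(r.bottom, window.innerHeight) - Math.max(r.top, 0);
      const w = Math.min(r.right, window.innerWidth) - Math.max(r.left, 0);
      return h > 0 && w > 0 ? (h * w) / (r.height * r.width) : 0;
    }

    setInterval(() => {
      const ad = document.getElementById("ad-slot-1");
      if (!ad) return;
      navigator.sendBeacon(
        "https://viewability.example/ping", // invented vendor endpoint
        JSON.stringify({ slot: "ad-slot-1", fraction: visibleFraction(ad) })
      );
    }, 1000);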


Yes, I mentioned all this in the other paragraphs.

Ad networks have a lot of work put into the backend systems, but the frontend stuff that runs in the browser has been ignored for a long time. Even Google still uses things like document.write for putting more tags on the page, something that became obsolete about 6 years ago.


Thanks for all these details! My rant was largely based on guessing what's going on with all the stuff.


Of course. The more consumers know the better we can solve all this.

Honestly, I don't agree with blocking all 3rd-party cookies or with adblocking. It's a blunt-force approach that is harming more than helping right now, but I do agree with the reasons why it's happening, and I'm hoping it will finally change the situation.


> I don't like tracking scripts either but why can't the ad networks get together and create a shared script instead of so many that seem to be redundant?

I've asked that myself many times, and I mostly decided that the reason is at least partly the same as why there were so many browser engines back in the day (Mozilla, IE, WebKit, Opera, etc.). Each company thinks they can track something differently/better than another.

Even if ultimately it just lowers the bar for the industry.


Also, a lot of these products are part snake oil, so getting together to come up with shared tools would expose all that waste and redundancy and put half of them out of business..

Wait, that sounds great.


Why can't the site just collect some data from the client and forward it to the ad networks on the backend? All of this stuff needs to happen out of band.


Everybody is cooking their own soup and no-one wants to share their toys and sandbox with the other children.


Sharpers and snake oil salesmen peddling nonsenseware to terrified publishers who watched their entire business model crumble to sand in the face of craigslist.


Good question: we are getting there with the major DMPs (data management platforms) which are Krux, Exelate/Nielsen, BlueKai/Oracle, Adobe, Facebook and Google. There are dozens of smaller players but they now all hook into these major 6 so those are the tags you're seeing.

At this level, these are billion dollar companies so there won't be much more consolidation.


Because the trend is for the client to bear the burden.


It sounds like the Mozilla solution of building some kind of intelligence around advertising might be the way to go then. Injecting all this stuff in the page load is crazy.


Because the ad networks don't completely trust the publishers.


After reading this article, I decided to try reading The Verge in a more "limited" browser which doesn't support endless JavaScript and too many advanced features:

Emacs' built-in web browser, or "eww" (that's its name). It's sort of like Lynx or w3m, except you can click things and it shows actual images!

You can see it here: http://imgur.com/FqJVB0U

You know what? The site and all content loads instantly. It may not be beautiful, but it works a hell of a lot better and is just so feather light it feels surreal.

Maybe I'll seriously start using eww for more stuff. This was surprisingly nice.


Links actually has a graphical mode too: http://paste.click/DdkwUx

There's even a driver to draw straight to a framebuffer. It can be useful if you're on a box without X, or if you break X and need a browser (which is how I found out about it).

I don't actually use it unless I have to, but interesting to know.


A note under Ubuntu: you'll have to install links2 (instead of links), and you may have to change permissions on the fb device to allow your user to write to it, if you don't want to run a browser as root.


Wow. Nice.

Almost completely related to the discussion at hand: Interesting how much presentation and familiarity affects the cognitive load our mind has to deal with when seeing things.

It took me quite some time to realize that the screenshot you posted was of my own eww image-paste on imgur loaded inside a graphical links-session.

Have an up-vote good sir :)


There are some pretty esoteric options available for just about every operating system:

https://en.wikipedia.org/wiki/Comparison_of_lightweight_web_...


Integrating all those ad networks is unreasonably hard to do well, so it's done badly to save time. All ad-financed web sites have that problem to varying degrees.


That’s pretty cool. I’d love to try using a browser like this for a while, but to be honest, I’d probably feel too naked to be browsing the web with no tracking-blocking extensions. No JavaScript makes a huge difference of course, but I doubt that that is good enough to completely prevent all tracking methods advertisers use (I’m thinking of things like third-party elements on a page, although maybe most of those are also loaded through JavaScript).


> I’d love to try using a browser like this for a while, but to be honest, I’d probably feel too naked to be browsing the web with no tracking-blocking extensions.

eww is Emacs, so it's all elisp; it'd be easy enough to write your own tracking blocker. As a first hack, just build a hash table of domains and add a little function which blocks fetching from any domain found in the hash table.


Hehe ... I had a similar experience, but my solution was to just read it (as well as many other tech blogs) using feed readers. I almost never go to the actual website (The Verge, Mashable, Re/Code, RWW, Anandtech, Android Police, ...).


As a consultant, I only see web site producers making an effort to curb poor web practices when Google forces them to: Google's move to boost mobile-friendly design in the rankings has driven a TON of responsive design work our way.

I wonder if that is going to be what it takes to fix today's bloat problems: Google taking the hammer to sites that have too much advertising cruft. Otherwise, I don't see business makers seeing much of a reason to fix it on their own.


> I wonder if that is going to be what it takes to fix today's bloat problems: Google taking the hammer to sites that have too much advertising cruft.

That could get interesting: Google, using the argument that all those ad-related scripts make things slow, ranks sites with a lot of them lower than sites without, and at the same time hits its competitors hard. After all, Google makes money through ads; if its competitors' ads end up being used less (because of Google's own actions), more people might use Google's ad networks.


Nah, that won't work.

That would surely end up in a legal battle, with the ad agencies saying that Google used its monopoly power in search just to boost its own ad and analytics products.


IANAL, but I would think that as long as, algorithmically, Google ensures it's weighing all ads and trackers equally (read: whatever your concept of "fair" is), they could prove that it truly is in everyone's benefit. In a similar vein to the aforementioned "Death to Small Businesses Day."


How about just counting the number of ad trackers?


It would be amusing if this made Google downrank their own crappy sites that use stupid quantities of JS to render mostly plain text - blogger, groups, plus, etc. are all a shitshow of unnecessarily bad performance.


I don't think any of this is appropriate for a search engine. I want a search engine to find me the best results for my query based on content, even if it's ad-riddled.


I agree 100%. I'm just observing that it is one of the few external sources of influence I see change organizational behavior. Business people FEAR google's impact on their success.


Especially with changing rankings based on whether it's mobile friendly or not. Marking it as mobile friendly is one thing, reducing its rank is another.


The Verge has some interesting long-form from time to time but it's subsidised by endless clickbait and empty 'conversation-starter' content.

Not to mention their recent move to close comments, which looks like a cynical strategy to get people using their forums.

I realise that's OT but wanted to get it off my chest.


What's this about comments? Do their articles not have comments and a form at the bottom anymore?


Temporarily disabled by default but they turn them on for certain articles.

They claimed this was because comments had become too negative. It is possible they are genuinely just trying to protect the mental health of their staff who are usually the targets of more aggressive comments but somehow I doubt it.


It had to happen, given that they publish 'review' videos without knowing what they're reviewing.

https://www.reddit.com/r/AndroidMasterRace/comments/37nw4p/i...


It seemed to happen after they posted the SpaceX article (www.theverge.com/2015/6/29/8863121/spacex-falcon-9-rocket-explosioauses) that was poorly thought through. They got some critical (but not abusive) feedback in the comments.

They started deleting comments in that thread and it all went downhill from there once people realized posts were being removed.

Kinda a shame, I liked Topolsky & Co but I don't visit it nearly as much after that.


It had been brewing a while, in my opinion. What I was seeing happening was, reasonable people would point out flaws in the article, then a moderator would jump in and argue with the person like they were a child and their points were stupid, escalating the situation and probably now pissing off more people than just the original complainer. I stopped reading their comments because of the moderators. Moderators should be seen and not heard.


Deleting comments, locking threads created by reasonable people - in general, over-zealous mods suppressing the voices of users who don't toe the party line - is an unforgivable sin if you're trying to manage an online community, and The Verge is really terrible at this.


Topolsky has been gone from there for a year, so it's just Co.


Verge: We're turning comments off for a bit http://www.theverge.com/2015/7/6/8901115/were-turning-commen...


These guys (i.e. The Verge) are going to have a really, really tough time when iOS 9 lands

https://developer.apple.com/library/prerelease/ios/releaseno...


I have a hard time seeing the majority of iOS users installing one. As with most things, if it's not the default, 95% of users won't make use of it.

Are even a majority of desktop web browser users actually using ad blockers yet?

Adblock Plus on Firefox shows a bit short of 20 million users, while Adblock on Chrome says 'over 200 million downloads', whatever that means in number of users. These numbers don't suggest a majority. (I don't think assuming over 500 million desktop users is crazy.)

Given that iOS is about 20% of total mobile users, this feature will probably make little to no difference, it might be a couple of % of those 20% of users, at the very best.


You're probably right as far as the parent goes, but in the long term I would expect use of ad blockers to rise, simply because ads are getting observably more bloated and intrusive.

Elementary example: the likes of us will get requests from our tech-illiterate friends and relatives - "my internet is broken. It's really slow." - and we will install ABP/uBlock/whatever.

This is, of course, assuming that there are no dramatic 'interruptions' - eg, browsers ceasing to support ABP-type extensions, the ads-versus-adblockers arms race moving decisively in the former's favour; etc.

I can't find any numbers on adblocker usage over time during the recent period - would be a good thing to look at - but here's the google trend, FWIW. https://www.google.com/trends/explore#q=ad%20blocker&cmpt=q&...


Thanks for the graph, that's quite interesting. Do you have any insight as to why it's risen so sharply during the past three years? I'm thinking maybe it's just connected to the increase in overall (especially smartphone) users of the internet? Even people who have been using desktops for years, use the internet more thanks to smartphones and thus ads become more pervasive in their life, thus perhaps prompting an increase as well?

Maybe you can share your thoughts on why you think it would be any indication that the overall percentage of internet users using ad blockers has gone up?


That certainly could be it - it's literally the only graph I could pull up at short notice. There are stats around but like browser usage etc they tend to come from this website or that. Based on some other numbers[0], trends seem to be more or less in line with what you'd expect: more blockers installed for visitors to tech-related sites; more installed on Firefox than IE, and Linux than Windows; etc. In that link, you're looking at nearly 10% of impressions blocked as of 3 years ago (which is obviously not the same thing as unique visitors, but I would be surprised if that was not higher than a few years before that).

I'm merely stating a hypothesis here, really - I have in mind how it was that people installed things like the google toolbar to block popups, and then browsers blocked popups by default, and then there were no more popups (OK, not quite, but still). The more irritating something is, the more likely a user will take steps to do something about it.

Time will tell.

[0] http://www.quora.com/What-is-the-percentage-of-Internet-user... - sorry for the hated Quora link


They can just do what everybody else does, replace the site with "USE THE VERGE APP!!1!", and show the ads there instead. Which is a good reason for Apple to push adblocking in the browser.


This is Apple's goal. They want people to write apps and show ads through Apple's own ad network.


Hah, but they recently removed the Verge app from the android store to get people to use the mobile site instead.


This gives app makers the ability to throttle the ads shown when using WebViews? So FB, Twitter, etc. can render website content without ads? If so, this is a huge deal and needs to be a separate HN post.


This might not work in the webViews but will definitely work in the new Safari View Controller which will most probably become the default way for applications to display links.



I've seen folks give Apple a lot of credit for this, but they're not building an ad blocker.

They're offering the APIs that can be used by developers to build ad blocking extensions. Not much different than what you can do in Chrome & Firefox already. Same held-at-arms-length thing.
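
For reference, such an extension is mostly a static JSON rule list that Safari compiles; something along these lines, going by the WebKit content-blocker format (the rules themselves are purely illustrative):

    [
      { "trigger": { "url-filter": ".*",
                     "resource-type": ["script"],
                     "load-type": ["third-party"] },
        "action": { "type": "block" } },
      { "trigger": { "url-filter": ".*" },
        "action": { "type": "css-display-none", "selector": ".ad-banner" } }
    ]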


You talk like Apple is doing something new and owns the market.


I haven't seen other mobile browsers support blocking 3rd-party content yet. I think that's mainly because Google is the other major mobile platform vendor, and their main source of income is that exact 3rd-party content. Apple isn't burdened by that -- they want to sell phones and keep them the best. Blocking 7 MB of JS that isn't the content the user wants is a good optimization.


You're right about Google, but FWIW Firefox on Android fully supports extensions - including eg uBlock origin.


Android supports it in many different forms. If you're rooted you can block pretty much every ad in all apps, but even without root Firefox on Android supports adblocking extensions and there are alternative browsers focused on ad blocking and privacy.

Also to be fair you can already block ads on iOS if you jailbreak your phone.


No jailbreak needed.

https://www.weblockapp.com/


Ghostery has its own mobile browser on Android, blocking ads and trackers.


iOS also has a Ghostery app, but it has a less than optimal user experience. The UI is confusing. They bury essential options in the share icon, options that have nothing to do with sharing.

I'm not sure if the Android version is as bad as the iOS version.


Not necessarily. But ultimately, right now, the level of mobile ad blocking is very low. This will make a real difference to that - regardless of whether it's new or innovative.


After I started working on my own RSS reader, I started converting more and more websites to RSS feeds (e.g. Twitter, YouTube, all news sites) where I only extract the interesting bits. It's faster and cleaner, not to mention no ads.


More and more sites (twitter is a major offender here) are switching off their RSS feeds because they can't control the readers.

See for instance:

https://api.twitter.com/1/statuses/user_timeline.rss?screen_...


So I'm back to what I used to do 13 years ago - build my own little scrapers and adaptors to generate RSS feeds for myself :) WHEEEEEE!


Would you mind sharing the program?


There are plenty of readers; you just need to create the HTML-to-RSS converter for specific sites. My code is quite ugly, so no. Also, they often break whenever the site's markup changes.


The converter is actually what I would really like to have. I hate parsing. And fixing it when it breaks. It doesn't matter if the code is ugly. You are trying to parse crappy websites after all. Consider sharing.


Well, I only have parsers for Slovak news sites, 3 parsers for lang-8 (homepage, lang match, footprints), and a couple for local real estate agents -- nothing reusable for English-speaking folks. The only thing you could use is probably the Twitter parser: http://pastebin.com/wpjZ6azZ I did it just recently and I use it to parse the DAWN and New Horizons feeds.
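
Stripped of site specifics, the general shape of such a converter is roughly this (a sketch only, in TypeScript/Node rather than my actual code; the URL and the extraction pattern are placeholders):

    // Minimal scrape-to-RSS sketch (Node 18+ for built-in fetch).
    // Every site needs its own parsing, and it breaks whenever the markup changes.
    async function buildFeed(url: string): Promise<string> {
      const html = await (await fetch(url)).text();
      const items: string[] = [];
      // Hypothetical pattern: <a class="headline" href="...">Title</a>
      const re = /<a class="headline" href="([^"]+)"[^>]*>([^<]+)<\/a>/g;
      for (const m of html.matchAll(re)) {
        items.push(`<item><title>${m[2]}</title><link>${m[1]}</link></item>`);
      }
      return `<?xml version="1.0"?><rss version="2.0"><channel>` +
        `<title>Scraped feed</title><link>${url}</link>${items.join("")}` +
        `</channel></rss>`;
    }

    buildFeed("https://news.example.com").then(xml => console.log(xml));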


The problem is shared between publishers and ad networks.

Sites are loading up on anything and everything to offset costs and it's only getting worse with adblock. And ad networks are just built with poor engineering and no attention paid to the user's experience. It's too easy to whip up a basic ad server and just load a dozen more tags on a page with the focus being volume and clicks. Unfortunately, there isn't an easy way to fix things because of the way money flows.

Disclaimer: I'm the founder of a digital ad network.


And it has for a long time (in terms of page performance):

http://www.theverge.com/2012/5/21/3034825/the-verge-page-per...


...it downloaded 12MB - a little over 7MB in that is JavaScript!

7MB was about the install size of Microsoft Windows 3.0, a complete (if crummy) OS.


> Believe it or not, the Content Services team at Mozilla is thinking about way more than just "plunking ads into Firefox". Like, what if we actually accepted the fact that ads are a way of funding the web at large, and browsers themselves offered built-in mechanisms to support advertising that respect privacy & performance? Yeah, that's a bit of a change from browsers' traditional neutrality. But, it could be a better deal for publishers and users together.

I'm curious as to how that could ever be done. I feel that it's almost impossible without somehow getting user information. I feel the trend is that ads are going to continue to be tuned to people and aspects about them. Maybe fully homomorphic encryption can do that without violating privacy but that's a long way off.

> Here's another idea: Almost a year ago, I heard the notion of "Subscribe 2 Web" at Mozilla. The gist is that you're worth about $6.20 per month across publishers via advertising revenues. What if you paid that much into an account yourself every month and used a mechanism built into your browser to distribute that money? Yeah, it's micropayments, but I find it interesting that these folks came up with a specific dollar amount that doesn't sound terrible.

It exists. It's called Contributor by Google: https://www.google.com/contributor/welcome/. If anybody needs an invite please let me know.


Ah, it's that site that's mostly known for posting inaccurate information about Android and closing off comments because people called them out on it. Why would anyone even go there anymore? They haven't published anything worth reading in ages.


I did the same thing several days ago. But after 30 seconds, the web page still had not finished loading. It had downloaded 7+ MB. I closed it.


Some of these pages never "finish" loading. They use a tracker script that pings a server every so often with your current position in the story, so they can get stats on who actually reads the articles vs. who follows a link and then bails out.
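
Roughly along these lines (the stats endpoint is invented):

    // Sketch of a read-depth tracker: periodically report how far down
    // the article the reader has scrolled.
    let maxScroll = 0;
    window.addEventListener("scroll", () => {
      const total = document.body.scrollHeight - window.innerHeight;
      const depth = total > 0 ? window.scrollY / total : 1;
      maxScroll = Math.max(maxScroll, depth);
    });

    setInterval(() => {
      navigator.sendBeacon(
        "https://stats.example/read-depth", // invented endpoint
        JSON.stringify({ article: location.pathname, depth: maxScroll })
      );
    }, 5000);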


Exactly! The worst (or best) part of having a slow mobile site is that mobile platforms are, by and large, single-task platforms: you click on a link and then look at the screen until the page loads.

If your website makes me stare at a blank screen for over 5 seconds that's it, the attention span for your low-content article that I was about to read out of boredom is gone. Close tab, click next link.

I guess it's a self-limiting problem in the end :)


That article gets a score of 15% on Google PageSpeed -

https://developers.google.com/speed/pagespeed/insights/?url=...


PageSpeed is... a mixed bag.

It likes to yell at me that my font size is too small and tell me to "set a legible font size", when I'm not actually setting a base font size and am instead adjusting things relative to the browser's default.

Their recommendation to "fix" this is to specify absolute font sizes, which is the opposite of basically every accessibility requirement I've ever read.

Similarly, they complain that I'm not "leveraging browser caching" -- but the un-cached resource is the Google Analytics JavaScript.

It also doesn't catch the one actual user-experience-problem bug that I know about and am working to fix.

So while it's enjoyable to poke fun at someone through PageSpeed results, I'd take it with an extremely large grain of salt.


Still loads faster than The Verge ;)


It's The Verge's article that gets 15%, OP gets 63% (faster than The Verge, obviously).


Huh. Interesting that mine is so bad. The PageSpeed report is fair, though. I didn't optimize images, set expires headers, etc.

Weird that Amazon isn't serving things up gzip'd - oh, interesting, I have to gzip the stuff in my build script and deploy it compressed.


"263 HTTP requests"

Seems reasonable. bwahahaha!

Shall we discuss the number of DNS requests?

And how many of those offsite servers are using something convoluted like Amazon for DNS? (which I find is more and more prevalent thanks to AWS)

The blog author, e.g., is using Amazon for DNS.

Alas, for each and every name, this adds more than a few lookups to what could be a 1-2 request process. Amazon uses multiple levels of indirection.

This dance is not of much consequence in the case of a single name.

But in the aggregate, e.g., many names requested from one overloaded site (such as the one the author singles out) after another, it does add up.

This also creates a larger margin for errors (failed lookups getting retried and timing out, again and again... while the user sits and waits).
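
If you want to see the indirection for yourself, a quick sketch (Node; the 10-hop cap is arbitrary):

    import { promises as dns } from "node:dns";

    // Follow a hostname's CNAME chain; each hop can mean extra lookups
    // for a resolver with a cold cache.
    async function cnameChain(name: string): Promise<string[]> {
      const chain = [name];
      for (let i = 0; i < 10; i++) {
        try {
          const [next] = await dns.resolveCname(chain[chain.length - 1]);
          if (!next) break;
          chain.push(next);
        } catch {
          break; // no CNAME record here: we've reached the terminal name
        }
      }
      return chain;
    }

    cnameChain("blog.lmorchard.com").then(c => console.log(c.join(" -> ")));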


Actually, FWIW, I'm using pairNIC.com for DNS. I do host on Amazon S3, though.


That means the full lookup for your personal domain name ends at an Amazon S3 domain name. PairNIC just handles the first lookup, giving the CNAME for blog.lmorchard.com, which points to Amazon.

Then there are multiple other CNAMEs that Amazon uses. It's about the same as looking up an Akamai customer domain. Lots of extra DNS queries.


A lot of replies to that Verge article seem to be addressing the clickbait title and not the content. The content is concerned with why an old MacBook (with comparable specs to a new iPhone) will handle the web much better than said iPhone.


I need to do more specific digging on this to have a good answer. From the bits I know, mobile browsers offer more conservative support for caching & hardware acceleration. Phones are way more sensitive to battery use & heat than a plugged-in laptop. You can't really directly compare a phone to a laptop, even if they seem to have similar hardware numbers. The form factor & use cases matter.


Remember that beautiful, amazing time when pop-up blockers had forever defeated those obnoxious and intrusive ads? My timeline could be off, but I'm thinking somewhere around 2004-2006, when Firefox was really picking up steam. I feel like the current web is in some kind of alternative universe where we've been shifted back in time over a decade. I have to use uBlock and Ghostery to make the web even remotely usable.

Perhaps websites just need to die altogether, and instead we'll just use APIs. You choose how to render the content in your browser according to your desires. Oh wait....RSS.


> Perhaps websites just need to die altogether, and instead we'll just use APIs. You choose how to render the content in your browser according to your desires.

Yeah. Maybe you could download some kind of menu of what the site offers. You could go ahead and download the text because it's tiny, but you could decide, eg, "I will request the image resources but not the javascript ones", or "I will display the text in a font of my own choosing".

The site content could use a language to "mark up" what it offers, and the site's specifications could be overridden by user preferences, sort of a "cascade" of priority.


I found it quite funny that this article ends like this https://imgur.com/SpEp9UF :)


Well, at least Disqus has the merit of actually providing value to people visiting your website.


Yeah, I'm a lazy bastard. My blog's all static files on Amazon S3 with no server side smarts, and I wanted comments. Disqus doesn't suck completely. There's a Google Analytics tag in there, as well as some Twitter code to embed a tweet.


The great thing about advertising is that it funds you whether you are famous or not. The bad thing is you have to live under their corporate censorship and all the baggage that goes with using third-party sellers (privacy and inefficiency).

To use some of the examples in the article, not everyone can rely on national license fees (BBC), corporate sponsorship (NPR), consistently making a loss (The Guardian), search engines (Mozilla), having a legacy business (CNN) etc.


FWIW, my post is not a rant against advertising in general. It's about doing it with such a large volume of code & third parties involved, leaning on my bandwidth & CPU to cover for a lack of coordination.

Toward the end of the post I say something like "what if we actually accepted the fact that ads are a way of funding the web at large" and mention some Mozilla efforts in that direction


It's kind of funny to open up The Verge with both uBlock and uMatrix enabled, allowing only 1st-party scripts. At first I thought I'd have to enable some JS to view the article, but then I noticed the small scrollbar and scrolled down.

https://imgur.com/a/YVbDE


I concur ... The verge has been out of my feed for a long time .... their site feels 'heavy' and filled with distractions ...


And people laugh at me for disabling JavaScript!


> browsers themselves offered built-in mechanisms to support advertising that respect privacy & performance?

I wonder which kinds of proposals you would suggest. It should also cover analytics.


Well, some of the proposals flip the relationship on ads. For example, rather than getting access to fine-grained personal data about readers, a marketer gets some coarse data like general location (eg. midwest US) and language. Using that, they propose a set of ads to the browser, and the browser decides which might be relevant based on what it knows about you. The browser keeps the personal data to itself, though. Marketers get some analytics on the response to their ads, but not full-on tracking.
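
To sketch the shape of that idea (none of these APIs actually exist; the names and endpoint below are made up):

    // Shape of the proposal only, not a real browser API.
    interface AdCandidate { id: string; creativeUrl: string; topics: string[]; }

    // The marketer only ever receives coarse data (region, language)...
    async function fetchCandidates(coarse: { region: string; lang: string }): Promise<AdCandidate[]> {
      const res = await fetch(
        `https://marketer.example/candidates?region=${coarse.region}&lang=${coarse.lang}`
      );
      return res.json();
    }

    // ...while the detailed interest profile never leaves the browser.
    async function pickAd(localInterests: Set<string>): Promise<AdCandidate | undefined> {
      const candidates = await fetchCandidates({ region: "midwest-us", lang: "en-US" });
      return candidates
        .map(ad => ({ ad, score: ad.topics.filter(t => localInterests.has(t)).length }))
        .sort((a, b) => b.score - a.score)[0]?.ad;
    }
    // Only coarse response data ("candidate X was shown/clicked") goes back out.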


Sounds like a good idea...I fear though it may be technically impossible.

Say you write/fork/create a browser that can do this, in order to really work, you have to disable (by default) all ads.

To do this, technically, that means you have to break all third party site interaction. No third party scripts, no jquery includes, etc. etc.

I could see it maybe as a value-added service which competes with traditional ad-based marketing, i.e. aimed at folks who have very strict anti-ad systems in place. The only issue is that those same folks may also not wish to use said browser...

So I don't think this is possible, even if its a nice idea. You'd basically have to rebuild the web to do this. The only way it could flourish would be as an independent network of its own.


Good idea, actually.



