How to Replace Google's AMP Without Slowing It Down (redfin.engineering)
159 points by dfabulich on Feb 20, 2018 | 117 comments



The AMP brand is only toxic on HN, among iOS users who don't know any better. The article itself perpetuates a common mistaken belief among iOS users: that AMP implements its own scrolling. That belief stems from the root cause of all of the author's listed problems: AMP uses iframes, and Mobile Safari had a bug where scrolling inside an iframe behaves differently from scrolling in the rest of the system. On every Android browser, it works just fine.

Similarly, Mobile Safari's reader mode doesn't work with AMP simply because it doesn't understand iframes and doesn't follow canonical URLs, unlike every other reader mode I have used (both websites like outline.com and browsers like Firefox).

The real issue is that Safari is such a buggy, rarely updated browser that publishers have to choose between speed and UX.


(Author here.) I think the AMP brand is toxic roughly everywhere.

Many users don't know what AMP is, and don't really care—in fact, there's some evidence that users like AMP in search results, or at least they like the performance—but as far as I can see in tech communities like HN and Reddit, the vast majority of people who know what AMP is also know that they hate AMP. (Scroll up and down on this thread a bit!)

As for why it's toxic, I don't think we have to agree about that, but it's not just the UI problem. The URL problem (AMP URLs point to google.com instead of the original site) is a real, significant problem in its own right.

The URL problem is the root of the claim that "AMP is Google's attempt to take over the open web."

IMO, if it were just an iOS scrolling issue, AMP would just be controversial: love the speed, hate the scrolling, eh, it's a wash. But the URL problem is what really gets the flamethrowers running!


>in tech communities like HN and Reddit, the vast majority of people who know what AMP is also know that they hate AMP.

which is an extremely tiny portion of people clicking links on the internet.


I'm not saying that a significant portion of the world population hates AMP. I'm saying: take the extremely tiny portion of people who have heard of AMP, and then estimate how many of them like AMP or dislike AMP.

AMP's detractors are few in number, but they overwhelm AMP's supporters, who are even fewer in number.


AMP has a mostly positive reputation in the world of digital publishers--and they definitely know what it is. I've heard from people at two different publishers that the decline in traffic due to the recent Facebook algorithm update has almost been made up by a dramatic increase in Google traffic to AMP pages. (see https://www.axios.com/google-traffic-explodes-doubling-down-...)

The main complaint I hear is that AMP pages are a pain in the ass to produce, not anything about walled gardens or canonical URLs. Consider that most of the people making decisions are weighing AMP strategically against Facebook Instant Articles.


No. As someone who works in media/publishing I’m gonna disagree there.

I can speak for colleagues across many publishers too. Within the industry it’s hated.

There are implications for ad revenue too that I’m not going to go into here.

Don’t get me wrong, some love it but more hate it - for a variety of reasons.


Well if you believe the data from Chartbeat, a whole lot of publishers are going through the non-trivial effort of publishing in AMP even if they don't like it.


It's funny: If you ask what people think about being held at gunpoint they're not usually very keen on it.

But when I hold them at gunpoint and tell them to dance like a chicken, they all do it. Doesn't seem like they hate it after all. Sometimes they even cry tears of joy...


Because your google search rank depends on it


We work with hundreds of publishers and they all hate it. HTML is already fast. The junk added to the page is what makes it slow. A new fork of HTML is just another development resource drain for no real benefit.


Plain HTML sites are faster than having your article loaded instantly by the browser?


Is HN (the site you're on right now) somehow slow? What is this magical instant loading you're talking about? Content still has to be requested, downloaded and parsed.

AMP is a fork of HTML with a required JS framework and a strict set of rules about how other media can be added to the page. AMP sites are also cached on a free Google CDN. None of this is necessary or worth the dev effort for sites, they only do it because of search rankings.

The reason sites are slow is because they add so much stuff to the page that comes from vendors using bloated heavy frameworks and bad coding. Remove or optimize that and sites are perfectly quick.


Google downloads the content in the background before you click on it; that's why it's fast.

I agree Hacker News is super fast (and the back button works perfectly!), as long as you have a reliable internet connection. I use it as my anti-example of why we don't need single-page apps to read the news.

So I agree: if Google can determine your internet seems reliable and the page loads as fast as Hacker News from the user's location, the site should get an AMP-style lightning bolt.
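
To illustrate the "downloads the content in the background" mechanism, here's a sketch only, with a made-up URL; AMP viewers actually prerender inside an iframe rather than via a resource hint, but the idea is the same:

    // Sketch: ask the browser to fetch and render a likely next page
    // ahead of the click. The article URL here is hypothetical.
    const hint = document.createElement("link");
    hint.rel = "prerender"; // browsers may silently downgrade this to a prefetch
    hint.href = "https://example.com/article.html";
    document.head.appendChild(hint);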


> they know what it is.

Well, to the extent they know it is a way of getting their site higher up on search results, they love it.


I've seen a lot of people in less tech savvy communities ask why a non-Google URL points to Google.

They don't know what AMP is but they've encountered the URL problem.


> which is an extremely tiny portion of people clicking links on the internet.

Most people don't know enough about how the Internet works to have an opinion. It's a specialized field, and only people who understand the related issues will tend to have an opinion.


Yeah, but a very high portion of the people who actually make web sites.


> which is an extremely tiny portion of people clicking links on the internet.

... But a large portion of people developing pages on the Internet.


... and a very, very, large portion of the people who are supposed to be developing pages on the Internet but haven't gotten their caffeine levels high enough yet ;)


They're actually going to fix the URL problem once and for all using [Cross-Origin Server Push][1] in the upcoming Signed HTTP Exchanges standard. Here's their blog post on that: https://amphtml.wordpress.com/2018/01/09/improving-urls-for-...

[1]: https://tools.ietf.org/html/draft-yasskin-http-origin-signed...
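
Conceptually, the trick is that the cached bytes carry the publisher's own signature, so the browser can show the publisher's URL no matter who served them. A rough sketch (not the real API; the field names here are invented):

    // Rough conceptual sketch of a signed exchange; names are invented.
    interface SignedExchange {
      requestUrl: string;       // e.g. "https://publisher.example/article"
      responseBody: Uint8Array; // the publisher's original bytes, served by any cache
      signature: Uint8Array;    // made with a certificate for the publisher's domain
      expires: Date;            // signatures are deliberately short-lived
    }

    // The cache can be google.com, bing.com, anyone: the address bar shows
    // requestUrl only if the signature verifies against that domain.
    function browserAccepts(sxg: SignedExchange,
                            verify: (s: SignedExchange) => boolean): boolean {
      return sxg.expires > new Date() && verify(sxg);
    }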


(Author here.) The "Signed HTTP Exchanges standard" is part of the Web Packaging standard, which is, indeed, the whole point of the article. https://github.com/WICG/webpackage


The standard authored by J. Yasskin from Google? Self-serving their own interests much?


And every other site that links off.


Saying that the URL problem is Google's attempt at taking over the open web is nonsensical. Google's main competitors in large markets, like Microsoft, Yahoo Japan, and Baidu, also implement AMP caches. How does that help any of them fend off any of the others? Each one of them gets to prerender the same provably prerender-friendly pages.

I am not an AMP publisher or an AMP developer, but I vastly prefer AMP results, and I suspect my preference lines up with the majority outside of HN, or why else would all these search engines do this?


> Saying that the URL problem is Google's attempt at taking over the open web is nonsensical.

The point isn't that only Google can use AMP. (I agree that many people incorrectly think that AMP is Google-only.) The point is that with AMP, you've lost control of your web site.

The fact that you've lost control not only to Google, but also to Microsoft, Yahoo Japan, and Baidu doesn't help at all!


The main bit of this is that Bing and Baidu HAVE to support AMP, because as Google's competitors, they can't allow Google to have any advantage, even if the solution poisons the Internet. And because Google has 87% of all search traffic, if a site is going to support a solution, it's going to be AMP.

Google's playing with nukes here. It's not just people under their domain who are affected, but anyone who wants to stay in the same game as them. Why do other countries look to build nukes? Because the US has them, and if you want to not be taken out by the US, you need them too.


> The point is that with AMP, you've lost control of your web site.

How does Google, etc., caching content and delivering it to users mean that you've lost control of your website?


They control the reddest part of your page's heatmap. They inject JavaScript that does whatever they want. If you show up on a carousel result, for example, they inject JavaScript onto YOUR page that makes a left/right swipe navigate to a competitor. There's more, but yes...you've lost control.


"Caching your content," on a google.com URL, makes it their web site, not yours.


I disagree. Using Cloudflare doesn't make your site belong to Cloudflare.


Cloudflare doesn't shove a header at the top of your page and JavaScript with unwanted behavior.

I mentioned the carousel problem in another comment. But the default behavior is suboptimal too. That injected header has an [x] button. Users expect it to dismiss the header. Instead, it navigates away from YOUR page, back to Google.


Does cloudflare change all your urls to theirs?


> The point is that with AMP, you've lost control of your web site.

Did I miss the news article "Armed googlers rampaging wildly and threatening all website owners with violence for noncompliance with AMP guidelines"? Who forces you to use amp?


We are forced by the prominent AMP carousel in search results, which drives meaningful revenue to publishers (but cannibalizes traffic to our regular pages, where ads are worth more on a CPM level).

If they removed that then AMP would die quickly.


[flagged]


> Our abusive relationship with you is over. We the users hereby walk away and will never take you back. Stomping your feet won't help.

This is exactly backwards. I don't want an ad company (Google) serving as a middleman for all my web browsing. The most abusive relationship is the one you have little choice in - and Google is such a huge portion of the web that it's nearly impossible to avoid. Now Google is leveraging the inroads it made with AMP on the web to force it into email as well.

To borrow your analogy, this is like letting Exxon Mobil set the environmental regulations for everyone.


AMP is like Amazon's listing of 3rd party merchants' products.

It's great for consumers in the short-term but commodifies businesses into an informational API for Google's UX.

> why else would all these search engines do this?

because lower latency increases user activity and search volume

The question you aren't asking is: "What are the long term implications to the ecosystem when Google controls the UX for all 3rd party data?"


> because lower latency increases user activity and search volume

more importantly, it reduces bounce and helps conversion. neither of which matter much to users...


They each get to put a giant back button on the top of your site that takes you back to the search results instead of deeper into your site. If Google is going to steamroll over independent websites, then the others want in on the action too. If they don't jump in, links to "your website" will all point to google.com rather than the other companies' domains.


> why else would all these search engines do this?

if it expands the business in a positive way who cares what users think, right?


Tech person here, I think AMP is the bee's knees and it should be shoved down media creators' throats. Media property owners have shoved garbage down our throats, complained when we installed ad-blockers to keep from being damaged, and this is sweet revenge.

I'm more than happy to admit that some icon in search results (not the AMP icon, as that's already in use) should highlight performant websites, but it should be held to as high a standard as AMP expects (if not higher) and revoked swiftly when the offending site strays from those standards.


What's not to like about giving up control of the most important part of your page to a third party?

Digg tried it, worked well for them.

/s


The AMP brand is toxic because it’s a blatant example of Google trying to fork the web and take control of the new standard. It wasn’t done in collaboration with the community and outreach has been sloppy.

I attended the AMP roadshow and the devs come across as condescending as hell and clearly have no interest in hearing about what the community wants. It was a poorly disguised sales pitch filled with unscientific case studies.

AMP has some good ideas, but Google can’t be trusted with the future of the web.


No, the real issue is that the iPhone is so widely used that Google should do a better job at making AMP work without breaking a bunch of things.

Or at least have the option to disable it.


This iOS thing is news to me. I’ve always had a great experience with AMP on iPhone Safari. No scroll problems at all.


The scrolling issue is fixed as of iOS 11, which got rid of the inconsistency in scrolling physics between iframes and full pages. See also:

https://news.ycombinator.com/item?id=14386292


Oh, nice. I don’t remember having problems before iOS 11, but I don’t really encounter AMP links all that often anyway.


It’s toxic to anyone who tries to copy and paste links from the address bar.


> among iOS users

The main arguments against AMP have nothing to do with iOS or Safari. I don't have any Apple devices.

Also, how is that the top comment just a minute after being posted on a site where most people don't like AMP?


> Also, how is that the top comment just a minute after being posted on a site where most people don't like AMP?

I think it's quite possible that there's a false impression of the proportion of the HN community that dislikes AMP, just because the people who do dislike it have an intense need to wave that around, while the people who don't dislike it aren't rushing out to post their "eh, it doesn't really bother me." Or even to question the attacks, as that often fails to produce any kind of substantive response.


No, AMP is garbage in design and in execution.


Oh gee I wonder why AMP works fine on Android and not iOS.


> Yes, AMP pages load fast, but you don’t need AMP for fast-loading web pages. If you are a publisher and your web pages don’t load fast, the sane solution is to fix your [fricking] website so that pages load fast, not to throw your hands up in the air and implement AMP.

I'm a developer and took some time to analyze AMP from a developer's perspective. The above conclusion is the only sane one. AMP, the Web Packaging standard, the damn proxy: they are all workarounds that complicate things. Don't be lazy - fix your shit and make your website fast. https://medium.com/@martindrapeau/amp-for-developers-the-tec...


(Author here.) You and I agree that AMP has serious problems, but your article doesn't evaluate the Web Packaging standard at all, which is the whole point of the article.

Prerendering is faster than the fastest web site you can build, faster than http://motherfuckingwebsite.com/, as fast as switching tabs.

On noisy cellular networks, where even one network request can take seconds, even the fastest sites can be slow; prerendering can fix this.

I don't think it makes sense to say, "Prerendering makes even the fastest sites faster?! Bah, humbug! Web sites are fast enough if you do them right."

Web sites are pretty fast if you do them right. But making them even faster is hardly lazy.


I fear the inherent problem is that a "lazy" dev team ships/publishes earlier...


The WWW is supposed to be decentralized. There shouldn't be one (or a few) companies that host the Web. I don't want Google to visit sites on my behalf.


(Author here.) That's what's so great about the Web Packaging standard proposed in the article to replace AMP: everybody could use it. https://github.com/WICG/webpackage

Even HN could serve up prerendered Web Packages for all of the sites on the front page.

The result would be a more decentralized web.
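
For instance, a sketch of what that could look like (the cache path convention here is hypothetical; "storylink" is the class HN uses for story links):

    // Sketch: prefetch a (hypothetical) cached web package for each story link.
    for (const a of Array.from(
        document.querySelectorAll<HTMLAnchorElement>("a.storylink"))) {
      const prefetch = document.createElement("link");
      prefetch.rel = "prefetch";
      // Hypothetical cache path; the real URL scheme would come from the spec.
      prefetch.href = `/sxg-cache/${encodeURIComponent(a.href)}`;
      document.head.appendChild(prefetch);
    }

Because the packages are signed by the original sites, any page could serve them this way, not just Google's cache.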


Why would other sites be motivated to take on the extra bandwidth load?

Downloading websites before even clicking on the links seems like a huge waste of bandwidth for a problem that could be mostly solved by:

- write better HTML

- stop writing bloated UIs with too much JS and CSS


They'd take on the bandwidth load in order to provide better performance for their users!

Maybe HN wouldn't bother, but I think, say, Reddit would do it if experiments showed that prerendering links resulted in higher engagement on Reddit. (And I bet it would; don't you agree?)


Wouldn't that just give an advantage to large sites with tons of cash, to the detriment of smaller, independent publishers? If hosting other websites increases the engagement on Site X, users might say, "getting my news on Site X is faster than getting my news on <small_independent_website>."

You might be right, but something about it sounds off to me. We already have a working system. It's being abused by bad HTML/JS/CSS, but there are ways to fix it without unbalancing the open nature of the WWW.


yet another Thing That We All Must Do in this endless technological death march called progress.

I get sick of companies claiming that whatever stupid remix of existing techniques (wow, page prerendering! I had an app like that for my dial-up modem...) is worthy of the kind of "innovation" that we the people love so, and all the extra work, secondary, and tertiary effects that it entails.

An entire generation grows up in the bazaar, and ends up running for the cathedrals sigh


I wonder if it has to do with the age churn in the business.

Meaning that unless you are self-employed, you are basically out of a job by the time you hit your 30s.

And then they bring in some bright eyed grad, or maybe even some self-taught kid from the street, to take over. And he invariably ends up tossing your work out because it is not in a fashionable language, he has only a superficial understanding of all the edge cases embedded over time, and it is an "old" project.


And how's that working out?

I don't disagree with you, but in practice, the huge waste of bandwidth is happening today. So that hypothetical scenario isn't really useful.


Google controls the incentives, so if they told people to "stop writing bloated UIs" in exchange for the lightning bolt then it would happen. But they are altering the deal and telling people to do AMP. The technical aspects don't matter.


It's bad, but what are you going to do? If Google instead offered a paid CDN product in exchange for the lightning bolt and carousel placement the line would still be out the door.

The technical aspects only matter to Google: lightweight sites keep their hosting costs down and the quality decent. AMP is Google's play at finding a place in the social space, and just like Facebook, publishers are happy to play.


One easy way to "write better HTML" and "stop writing bloated UIs with too much JS and CSS" is to stick with the limited HTML/JS/CSS that's in AMP.


Sure, but that doesn't get blog.fefe.de any of the AMP ranking/carousel benefits in search, does it?

That site, from a major German blogger, would actually slow down significantly if he added AMP (he kinda tried once), and yet he doesn't get any of the search benefits.


Er, isn't getting AMP ranking/carousel benefits a different thing from encouraging everyone to not be bloated?

Great that "that site" isn't bloated already. Sucks that he'd have to slow down to get the search benefit. Now, about convincing the entire rest of the Internet to be less bloated, ...


Google and others would presumably cache it.


How does the web packaging spec guarantee that a user isn't served an outdated version?

A site may make a mistake, correct it, and web package caches could still serve the old version otherwise.

In fact, this basically leads to the update denial issue that also caused APT repositories to move to HTTPS.


The package is cryptographically signed by the original web site; the signed package includes a signed expiration date.
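
A sketch of what that buys you (dates are illustrative; the draft spec caps signature lifetimes at about a week):

    // Sketch: a signed expiry bounds staleness. Dates are illustrative.
    const signedAt = new Date("2018-02-20T10:00:00Z");
    const maxAgeDays = 7; // the draft caps signature lifetime at roughly a week
    const expires = new Date(signedAt.getTime() + maxAgeDays * 86_400_000);

    function cacheMayStillServe(now: Date): boolean {
      // Past this point a conforming browser rejects the package outright,
      // so a correction propagates within maxAgeDays at worst.
      return now < expires;
    }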


In other words, the issue isn't solved at all, and either we end up with expiration dates of a few minutes, and Google crawling every page on the web every 5 minutes, or we're ending up with horribly outdated content, or any mix of the negatives of both.


So don't package sites with volatile content? Or package the client and pull the content dynamically with JS.


>package the client and pull the content dynamically with JS

Please don't do this. Most pages are documents, not interactive web apps.

This bugs the crap out of me when the online documentation for a language/framework/program refuses to render without JS. It's completely unnecessary.


Upvoting this is just not enough. This is the worst aspect of the web currently. Even NASA.gov is just a blank black page without JS enabled.

It's completely unnecessary and sacrifices accessibility for ease of development.


Sometimes it's not even about ease of development, but just following trends without thinking. SPAs are a bad choice for content-based sites for several reasons.

People often ask the wrong question: "which JS framework should I use?" (wrong first question) instead of "should I build a site that requires users to execute JS?"


A better question: how does the spec guarantee that the content isn't fraudulent?

It doesn't seem like there's a way to guarantee that the content was signed by the site on the package. If a hosting site can serve up the content, can't they also serve up the signing, unbeknownst to the end user?


> A better question: how does the spec guarantee that the content isn't fraudulent?

It has to be signed by a certificate for the original source domain, so it's the same kind of security that TLS provides.

> It doesn't seem like there's a way to guarantee that the content was signed by the site on the package.

It seems to me it guarantees that the same way your browser guarantees that an HTTPS page was signed by the site on the URL.


Doesn't Google already visit sites on your behalf, to index them so they show up in the search results?


Google Search isn't serving the sites.


AMP can be hosted by YOU. See theguardian.com's own AMP cache. It is not Google taking over the web.

https://amp.theguardian.com https://www.theguardian.com/membership/2016/feb/24/todays-re...


Does Google respect self-hosted AMP caches, and will it link to them instead of its own in search results?


It would defeat the most important purpose: the pages wouldn't be as fast. So I'm guessing they won't.


And it wouldn’t be secure.


We used to have applications.

Applications were a thing you downloaded once and ran locally. They did things offline. If you needed something extra, it could communicate with a remote host to give you the thing you needed. Eventually, packagers were created to distribute applications more easily.

Then we had web pages.

Web pages were supposed to enable us to browse remote documents. They worked, for a time. But then people either forgot or were annoyed by making applications.

Then we had web applications.

Entire systems of software development combined a web page with an application. You couldn't use it offline, and it didn't browse documents so much as allow you to use an application remotely. The browser would show you an application with which you would browse documents. And it worked, for a time. But then people apparently got tired of distributing apps over a web page.

So now we have web packaging?

If you wanted a universal application platform, make a less shitty form of Java or something. But stop pretending that hypertext is an application, trying to make me jump through hoops to read some text. I don't need my content proxied, prettified, modified, imaged, scripted, stylized, or customized. I don't need it faster, or better, or more private, or more anything. I really just want to read the text of the article I wanted to read. Honestly.

I'm sick of new wheels. I'm sick of progress. I'm sick of advertisements, scripts, pretty fonts, pop ups, pagination, and columns of useless distraction. I'm sick of commentary by ignorant intolerant insensitive paranoid outraged strangers. I'm sick of trying to read a news article and being assaulted by the equivalent of five kinds of media warring for my attention. I'm sick of likes. I'm sick of sharing. I'm sick of people who have seemingly made it their purpose in life to annoy me. I'm sick of technology.


AMP isn't being used for applications, it's being used to deliver mostly static text documents with images. It's pretty bare bones, but people didn't like Google's implementation that works on unmodified browsers, and so now there's a new proposed implementation that requires browsers to implement it natively.

Most of the anti-AMP commentators keep repeating over and over again that people can make documents which render as fast, but the Redfin article essentially points out that this goes beyond stripping out all JS, and having quick HTML parsing. You could reduce parse/layout to 0 and still be slower than AMP.

Do you want to install a native application just to read each article? It doesn't scale. Even simple things like DNS lookups on mobile networks can add hundreds of milliseconds of delay (from 80ms at the median to 500ms at the 90th percentile), and the perceived delay makes the experience feel bad.


It takes several seconds to load most web pages on modern browsers because modern browsers have more code in them than my operating system, my browser wants to load everything encrypted, and most web pages are designed to deliver so much unnecessary crap in such an obnoxious way as to make it take significantly longer on purpose just to get to the content.

Google's response is "No worries, we'll just download it for you, screw it up, and then present it to you." Thanks, Google. I didn't ask for that, but that sure never stops you from encroaching ever more into my life.

Incidentally, all ISPs could provide simple proxies that cache and return pages faster, but not with HTTPS. Google being our grand overlord and de facto tech nanny state, they would rather provide this solution themselves than allow ISPs to not do it perfectly (and at the same time siphon up any potential metadata that a carrier could use to monetize traffic going over its infrastructure).

Google didn't invent advertising-driven free services, but they sure as hell perfected it. Most of the annoyances on the web are due to free services - not latency. And mobile providers could prioritize latency over bandwidth, but lower latency does not sell new mobile plans.


We're not talking about AMP vs bloated desktop web pages, we're talking about AMP vs hand-optimized and stripped down simple documents.

Yes, ISPs could provide proxies, if you want your ISPs to be able to track everything you do and then resell your behavior (https://techcrunch.com/2017/03/28/house-vote-sj-34-isp-regul...) , or allow the NSA to snoop on your behavior. You act as if HTTPS was introduced for nefarious reasons and not as a reaction to very real attacks on confidentiality and integrity. The web isn't going to return to the world of 1994, or the idyllic pre-Morris Worm era when we just didn't care about security.

You're sick that the world has gotten a lot more complicated and noisy. I am too, it's part of growing old, get used to it.


I'm talking about AMP vs the news articles and cooking recipes I try to read that get intercepted by AMP.

Google already tracks everything I do and resells my behavior. That is literally their entire business model.

The NSA puts backdoors in hardware modules you can't remove, finds flaws in firmware, cracks crypto, and exploits mobile devices that can't or won't be updated. And HTTPS simply isn't necessary for 90% of the use cases its champions claim.

I've switched my default search engine, changed browsers, unloaded most of my free hosted services to paid ones, and am paying for a newspaper subscription. I am saving money in the process by cutting unnecessary expenditures to pay for these things. I didn't have to get used to it and life is better.


Putting a backdoor in hardware modules is a way more expensive threat model than the low-hanging fruit of observing or modifying your network traffic. So your argument is: since the most elaborate state-level actor in the world can bypass pretty much any protection you come up with if they focus on you, you should therefore ignore all of the other threats out there?

And by getting used to it, I was referring to the complexity of the whole technology stack, and the sheer amount of defense in depth that has been added to it. You obviously didn't get used to it, because you're complaining about it and ranting about "technology", social media, and many other things in your top-level post.

But if you disconnected from all of this stuff and achieved a zen-like state of nirvana, then "Likes" or proliferation of tech stacks that other people like to create and use shouldn't bother you.

When someone lists a whole bunch of stuff they hate, but then states they've gone cold turkey and freed themselves from it, but still goes on an epic unprovoked rant complaining, it sounds to me like someone saying "Get off my lawn and stop playing rap music".

I despise selfies and hate people taking food pictures when I'm eating with them, but it's pretty much the way the world is now. That's what I meant by "get used to it"; there's no point ranting about it anymore.


> But if you disconnected from all of this stuff and achieved a zen-like state of nirvana, then "Likes" or proliferation of tech stacks that other people like to create and use shouldn't bother you.

I didn't achieve zen. My life is better, but all the other shit still bothers the hell out of me.

Complaining has a long and successful history. Our entire democratic process is based on complaining. Complaining and throwing tea in a harbor. Just because I threw my tea in the harbor doesn't mean the tea isn't still getting taxed, or that my responsibility to speak out goes away. Granted, my form of whining and moaning was more "kids on my lawn" than "taxation without representation", but hopefully people can see the point: this is all unnecessarily burdensome.


> Do you want to install a native application just to read each article?

We had Google Reader. It was fast and painless, at least on Android.


Websites are at least supposedly sandboxed, so they are not as much of a risk as running native binaries. But this is getting worse and worse as browsers expose more and more of their host operating system's functionality. The benefits of using a website instead of a native app are quickly disappearing, while the drawbacks have only been somewhat mitigated. We're getting to the point where browsers are worthy of the decades-old criticism Emacs has received: they have eventually become an OS with many fine features, simply lacking a good web browser. For the privacy-conscious user, modern web technologies will undermine you every step of the way, or simply break if you choose to stand your ground.


How many years will it take until Google introduces “google keywords” instead of DNS?

Mark my words, this non-open, non-standards stuff is a slippery slope. AOL had good intentions as well, and only toward the end went full rent-seeking.

Google can solve a lot of its problems by allowing companies to pay for and register keywords so they can make a safe, vetted Internet that advertisers will like without worrying about their brands showing up next to ISIS.


This comment has nothing to do with the article. The author acknowledges that AMP sucks and talks about getting similar functionality without people getting mad at Google for being Skynet. Google isn't trying to replace DNS or other core web tech, just trying to smooth out delivery, which gets them so much hate. If a CDN introduced AMP, it wouldn't be plagued by this phobia of over-centralization.


Good point. I should have made the link more clearly: AMP is Google unilaterally exerting more control over the web and making it less open.

If a CDN introduced this, they wouldn’t have the massive market position that Google does. So they’d have to either propose a standard and get adherents or just add it to the pile of stuff that isn’t used. Google is different because of their search dominance.

But I do think Google is turning into more of a controller that isn't concerned with open standards. The keywords thing was just an example of how much AOL sucked when Google started up.


It's the first time I've read about Web Packages[1], but when you combine them with service workers it could really make sense. Just imagine downloading a .wpk and installing it in your local environment like a .deb or .apk package. I think that could be a nice way to solve the omnipresent dependency on servers for PWAs.

[1]: https://github.com/WICG/webpackage
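
A minimal sketch of the service-worker half of that idea, treating the "package" as just a list of URLs cached up front (a real .wpk installer is hypothetical):

    // sw.ts: cache the "package" contents at install time, then serve
    // cache-first so the app keeps working with no server at all.
    const PKG_CACHE = "wpk-demo-v1";
    const PACKAGED_URLS = ["/", "/app.js", "/style.css"]; // stand-in for package contents

    self.addEventListener("install", (event: any) => {
      event.waitUntil(caches.open(PKG_CACHE).then((c) => c.addAll(PACKAGED_URLS)));
    });

    self.addEventListener("fetch", (event: any) => {
      event.respondWith(
        caches.match(event.request).then((hit) => hit || fetch(event.request))
      );
    });

A signed package would essentially let a third party hand you that cache pre-filled, with the origin's signature vouching for the contents.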


Funny, a similar post in the opposite direction from Airbnb is also on the front page right now: https://news.ycombinator.com/item?id=16422763


Or we can simply deliver what the user needs and not include every analytics tracker under the sun, scrolljacking, or shove dozens of images and links in their face to improve "engagement".


(Author here.) Clearly AMP should be replaced, but I argue in the article that prerendering offers better performance even than bare HTML pages with no JavaScript.

"Delivering what the user needs" means performance, too; prerendering simply can't be beat.

The best solution will provide a way to prerender pages without AMP, without iframes, and without violating users' privacy. I claim that Web Packaging is a step in that direction, and even a further step toward re-decentralizing the web.


I would certainly prefer some solution that's not centralized or controlled by a giant advertising firm, but prerendering just isn't needed for most webpages. Especially the kind of webpages that are delivering through AMP.

Shouldn't we be asking ourselves why the hell it takes so long to download and render news articles? Those are generally the kind of pages I see available through AMP. AMP does indeed try to ameliorate that issue, but in totally the wrong way (and without really solving it). Seriously now, the web was designed with the specific purpose of displaying pages of content, and now we should prerender these pages for performance reasons? Something's really wrong here if the answer is to prerender pages as a blanket solution rather than applying other performance solutions on a per-need basis.

What are all these sites doing that is so taxing that we need prerendering? How about just delivering text, minimal styling, and some scripts to help lazy-load and download optimized images? Virtually all the sites I see are delivering through AMP could easily do just that. Instead, their sites are littered with trash including multiple versions of jQuery, code that doesn't even execute on the given page but on others, ads galore, custom GA scripts, Mixpanel, Bootstrap, videos, etc. All trash, and poorly implemented trash at that.

Web Packaging IS better, I'll grant you that. I propose that, before we go down that road, our industry take a good long look at itself. If all this other optimization that we're doing when we bundle and ship our code isn't working, then we're failing right out of the gate, and we don't need to add another optimization as fuel to the dumpster fire.


Prerendering isn't necessary here either.

Usually you can get the above-the-fold content rendered in 2x latency + 10ms. That's around 3 to 5 frames.

Prerendering is only useful for sites where either large amounts of data need to be transmitted, or parsing and rendering is slow.

If parsing and rendering is too slow for realtime usage, we need to fix that - not try to circumvent it.

You can get a full JVM up, an entire major codebase parsed, JITed, profiled, optimized, JITed again, running, and finished in 40ms.

Yet we're seeing several seconds to parse some JS + HTML + CSS?

That's the true bottleneck.


"2x latency" can be pretty slow on noisy cellular networks. Especially with TCP exponential backoff, latency can be a second or more. That's where prerendering really shines.

(There's a reason why AMP, "Accelerated Mobile Pages," rolled out on mobile devices and not desktop.)


Ironically, mobile users are also more likely to be on a plan with a low data cap, so prerendering might not be in the user's best interest.


There is the Opera Mini way of pre-rendering though.

But that requires that you trust Opera to not do something naughty.


That's why you want to transmit all above the fold content before the exponential backoff really kicks in.

And prerendering is horrible on mobile networks: data caps are already insanely low, and this reduces the usable amount even further.

Personally, I test my websites and apps on a Moto G (2014) on a 2G network throttled to 64kbps (the worst network available in Germany), and on an emulator giving the phone the entire power of an i7-6700, 16GB of RAM, and a 100Mbps line, and prerendering provides no meaningful performance benefit in either case.

(I'm currently still trying to improve performance on https://quasseldroid.info/, but it's as much as I've got for now)


> Prerendering is only useful for sites where either large amounts of data need to be transmitted, or parsing and rendering is slow.

Agreed. Many of my leads for https://www.prerender.cloud/ end up not using the service after we discover the root cause of slowness was an initial, slow, render-blocking XHR request, or in some cases infinite loops that eventually get cut off.

AMP seems to allow sidestepping root cause performance issues by creating a pure version just for Google - it's a growth hack - "don't bother refactoring your app (which would require overcoming organizational inertia), just make a streamlined version for us."


You aren’t prerendering in the same way Google is. They prerender in the browser, so there's zero latency when you click on an AMP link.


History has shown that no, we can't do that.

It'd be awesome if we could, but without the motivation of preferential placement in search results for pages that follow a specific set of rules, publishers will include as many advertising and tracking platforms as they possibly can.


It would be awesome but we don't live in this kind of utopia. In the meantime AMP is a good way to clean up pages.


But doesn't it then just eventually become an excuse not to bother cleaning up pages?

Personally I'm just wondering where the legitimate demand for this (slightly faster page loading) is coming from: Consumers or ad/data networks?


>But doesn't it then just eventually become an excuse not to bother cleaning up pages?

Well, the issue is that websites don't do that.

>Consumers or ad/data networks

Both?


Exactly, so instead of doing something to promote cleaning up pages properly, a band-aid is applied and nobody tries...

> both ?

They are competing objectives, so which has the bias is indicative of business priorities.


Page speed is already taken into account for ranking.

Google is never going to weight it drastically, though: better the result people are looking for, even if the page is slow, than something fast but barely related.

I am not sure what else they could do in that regard, and the ecosystem has been moving towards more and more bloat since pretty much the beginning.

I am all ears for other solutions to page speed, especially since AMP is tailored to a specific use case; I just don't know exactly what would be enough of an incentive.


I think the alleged performance improvements are only an illusion. The device still has to use data and electricity, in fact more because it is fetching a bunch of things you probably don't want.

Link prefetching is already a thing, so I don't see why we need another standard to resolve this.

Also the privacy argument is moot, I believe. In the questionable example of the AIDS patient, you are instead trusting Google with this info. That is not any better to me.

I'm a front-end developer and I see so many sites bogged down with many analytics and tracking providers, some multiple times. I can't help but feel like AMP is an attempt to maintain the tracking and analytics status quo, while giving the illusion of performance, instead of just chilling out on the multiple redundant client-side analytics.


I have an awful ISP - having a local mirror is a massive improvement.

It almost seems like the web is unusable aside from companies large enough to have a mirror here.


> There’s no way to work around this privacy problem while allowing your browser to visit 10 random sites.

Wrong. Browsers could intelligently delete cookies after a short time, which is what Apple is doing with Intelligent Tracking Prevention[0].

More sophisticated users can install browser extensions that allow them to do this manually[1].

[0]: https://webkit.org/blog/7675/intelligent-tracking-prevention...

[1]: https://addons.mozilla.org/en-US/firefox/addon/cookie-autode...
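
For the curious, the extension approach boils down to something like this (a simplified sketch using the WebExtension cookies API; a real extension tracks which domains still have open tabs, and example.com is a stand-in):

    // Sketch: purge a site's cookies shortly after its last tab closes.
    const GRACE_MS = 60_000; // one minute; configurable in real extensions

    function purgeCookiesFor(domain: string): void {
      chrome.cookies.getAll({ domain }, (cookies) => {
        for (const c of cookies) {
          const url = `http${c.secure ? "s" : ""}://${c.domain.replace(/^\./, "")}${c.path}`;
          chrome.cookies.remove({ url, name: c.name });
        }
      });
    }

    chrome.tabs.onRemoved.addListener(() => {
      // Simplification: a real extension first checks that no tab
      // for the domain remains open before scheduling the purge.
      setTimeout(() => purgeCookiesFor("example.com"), GRACE_MS);
    });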


Lost me at "AMP haters"...



