
Why would other sites be motivated to take on the extra bandwidth load?

Downloading websites before even clicking on the links seems like a huge waste of bandwidth for a problem that could be mostly solved by:

- writing better HTML

- not writing bloated UIs with too much JS and CSS




They'd take on the bandwidth load in order to provide better performance for their users!

Maybe HN wouldn't bother, but I think, say, Reddit would do it if experiments showed that prerendering links resulted in higher engagement on Reddit. (And I bet it would; don't you agree?)
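
For what it's worth, prerendering doesn't require anything exotic on the site's end; it's a resource hint the page emits and the browser acts on. A minimal sketch, with made-up story URLs:

    <!-- Ask the browser to fetch and render this page in the background,
         so a click on the link feels instant. URLs are hypothetical. -->
    <link rel="prerender" href="https://example.com/top-story">
    <!-- Lighter-weight alternative: fetch the document without rendering it. -->
    <link rel="prefetch" href="https://example.com/second-story">

The bandwidth trade-off being debated here is exactly this: the hints only pay off when the user actually clicks.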


Wouldn't that just give an advantage to large sites with tons of cash at the detriment of smaller, independent publishers? If hosting other websites increases the engagement on Site X, users might say, "getting my news on Site X is faster than getting my news on <small_independent_website>."

You might be right, but something about it sounds off to me. We already have a working system. It's being abused by bad HTML/JS/CSS, but there are ways to fix it without unbalancing the open nature of the WWW.


Yet another Thing That We All Must Do in this endless technological death march called progress.

I get sick of companies claiming that whatever stupid remix of existing techniques (wow, page prerendering! I had an app like that for my dial-up modem...) is worthy of the kind of "innovation" that we the people love so much, along with all the extra work and the secondary and tertiary effects it entails.

An entire generation grows up in the bazaar, and ends up running for the cathedrals. Sigh.


I wonder if it has to do with the age churn in the business.

Meaning that unless you are self-employed, you are basically out of a job by the time you hit your 30s.

And then they bring in some bright-eyed grad, or maybe even some self-taught kid off the street, to take over. And he invariably ends up tossing your work out because it is not in a fashionable language, he has only a superficial understanding of all the edge cases embedded over time, and it is an "old" project.


And how's that working out?

I don't disagree with you, but in practice, the huge waste of bandwidth is happening today. So that hypothetical scenario isn't really useful.


Google controls the incentives, so if they told people to "stop writing bloated UIs" in exchange for the lightning bolt then it would happen. But they are altering the deal and telling people to do AMP. The technical aspects don't matter.


It's bad, but what are you going to do? If Google instead offered a paid CDN product in exchange for the lightning bolt and carousel placement, the line would still be out the door.

The technical aspects only matter to Google: lightweight sites keep their hosting costs down and the quality decent. AMP is Google's play at finding a place in the social space, and just like Facebook, publishers are happy to play.


One easy way to "write better HTML" and "stop writing bloated UIs with too much JS and CSS" is to stick with the limited HTML/JS/CSS that AMP allows.
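
For anyone who hasn't seen it, a valid AMP page is mostly ordinary HTML with a few mandatory pieces and some replaced tags. A minimal sketch (the canonical URL is a placeholder, and the required amp-boilerplate CSS is elided for brevity):

    <!doctype html>
    <html ⚡>  <!-- <html amp> also works -->
    <head>
      <meta charset="utf-8">
      <meta name="viewport" content="width=device-width,minimum-scale=1">
      <link rel="canonical" href="https://example.com/article.html">
      <script async src="https://cdn.ampproject.org/v0.js"></script>
      <style amp-boilerplate>/* required hide-until-loaded CSS, elided */</style>
      <noscript><style amp-boilerplate>/* fallback, elided */</style></noscript>
    </head>
    <body>
      <h1>Hello, AMP</h1>
      <!-- Custom JS is banned; images use the amp-img component instead of <img>. -->
      <amp-img src="photo.jpg" width="600" height="400" layout="responsive"></amp-img>
    </body>
    </html>

Author styles go in a single <style amp-custom> block with a hard size cap, which is where most of the "stop writing bloated CSS" pressure comes from.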


Sure, but that doesn't get blog.fefe.de any of the AMP ranking/carousel benefits in search, does it?

That site, from a major German blogger, would actually slow down significantly if he added AMP (he kinda tried once), and yet he doesn't get any of the search benefits.


Er, isn't getting AMP ranking/carousel benefits a different thing from encouraging everyone to not be bloated?

Great that "that site" isn't bloated already. Sucks that he'd have to slow down to get the search benefit. Now, about convincing the entire rest of the Internet to be less bloated, ...


Google and others would presumably cache it.
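
Concretely, "cache it" in the AMP world means serving the page from Google's AMP Cache origin; the rewritten URL looks roughly like this (example.com stands in for the publisher):

    Original:  https://example.com/article.amp.html
    AMP Cache: https://example-com.cdn.ampproject.org/c/s/example.com/article.amp.html

The /s/ marks an HTTPS origin, and the publisher's domain is folded into the cache subdomain so each site gets its own origin.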



