
No, it isn't. It's designed to make the user experience better on sites that frequently host Google ads (and that also often contain a ton of bloat: third-party JS, poorly constructed DOMs, awful CSS, etc.).

The only way Google could proactively "solve" this problem was by creating a "standard", and then also offering to absorb end user traffic for sites that adopted the standard. FWIW, AMP is an open standard not solely owned or contributed to by Google.

https://amp.dev/



> FWIW, AMP is an open standard not solely owned or contributed to by Google.

> https://amp.dev/

amp.dev is owned and controlled by Google. ampproject.org is owned and controlled by Google. The core AMP team are Google employees.

How can you possibly say it's not owned by Google?


I don't believe either of these statements is true: "ampproject.org is ... controlled by Google. The core AMP team are Google employees."

https://blog.amp.dev/2018/09/18/governance/

The TSC is independent and, at this point, community code commits are almost 4x the volume of Googler commits.


1. Do a whois on those domains and you'll see they're owned by Google (see the command after this list).

2. The privacy policy on amp.dev is Google's.

3. Both sites are hosted by Google.

4. The license in the amphtml repo says copyright Google.

5. The OWNERS.yaml file in the amphtml repo lists 3 people, all of whom work for Google.

6. Per the "contributing code" readme, submitting code requires signing this Google CLA: https://cla.developers.google.com/about/google-individual

7. Looking at the last few merged PRs, nearly everyone involved is a Google employee. I realize this could be a coincidence, but I'm not going to analyze the whole repo.

8. The TSC is 3/7 Google employees.
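
Point 1 takes a single command to check (assuming the stock whois client is installed; the exact field label varies by registry, but the registrant organization for both domains should come back as Google LLC):

    whois amp.dev | grep -i 'registrant organization'
    whois ampproject.org | grep -i 'registrant organization'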

Regardless, until Google issues a legally binding release of the project to an independent organization it is owned by Google. The TSC and AC could be removed at Google's whim.


So the solution to the shitty state of Google ads is to submit to Google and implement their new shitty product?


Why is that the only way? Seems like they could easily have achieved the same result by significantly penalizing sites based on load time and number of external requests.


This question cuts right to the heart of the matter.

I've yet to see a Google engineer, executive, or "fanboy" address this question adequately.

This thread will be no exception. Cue the crickets.


A fast load time when a page is indexed does not guarantee a fast load time when it is served up to the actual viewer. Serving the page from cache is the only way to guarantee that the page will still be fast when the user wants to view it.


Because users want relevant search results much more than fast websites. Google already factors a website's performance into its rankings, but weighting it too heavily against content relevance would make search results worse.


If they actually cared that much about making the results "relevant", they wouldn't mix a bunch of irrelevant suggestions into the results page, each marked with "missing: <query_term>" pointing out exactly how they ignored part of the user's request.


By that logic, then, what's the point of AMP if Google is saying page load speed isn't really that big a factor? Why go through the trouble of deriving a whole new subset of HTML?


I discuss this in another reply chain: https://news.ycombinator.com/item?id=19681408


> Because users want relevant search results much more than fast websites. Google already factors a website's performance into its rankings, but weighting it too heavily against content relevance would make search results worse.


What's the difference between influencing positions and visibility based on AMP support vs overall page performance?

If visibility is influenced by AMP, then Google benefits and users of Google services likely benefit, but web developers suffer, users who reach the content without Google services continue to suffer (companies will keep maintaining two versions of the site: a bloated one with 100 external tracking requests that gets shared on twitter/reddit/facebook/hn/etc, and an AMP one that only surfaces on Google's services), and the internet as a whole suffers. Whereas if visibility were influenced by page speed plus number of external requests, everyone would benefit.


Some differences:

- AMP is a transparent and unambiguous standard that leaves no uncertainty as to whether you are somehow "performant enough" to qualify for the simple but limited visibility boost (referring to the news carousel)

- AMP prevents important usability problems beyond performance, like page content jumping

- AMP can enable advanced/extreme performance optimizations by default that are somewhat rare in practice (e.g. only loading images above the fold), that aren't really possible to do safely/properly without a spec like AMP (e.g. preloading content before the user clicks the link without unpredictably hammering the website's servers), or that are sometimes avoided due to cost (e.g. fast global caching on Google's impressive CDN). This matters especially for users in the developing world.
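
To make the content-jumping and above-the-fold points concrete: AMP doesn't render a plain <img>; images go through the <amp-img> component with explicit dimensions, which is what lets the runtime reserve layout space before anything loads and defer fetching images that sit below the fold. A rough sketch (the filename and dimensions here are made up):

    <!-- width/height give the runtime the aspect ratio up front, so it
         can reserve the box (no content jump) and delay the fetch until
         the element is near the viewport. -->
    <amp-img src="hero.jpg"
             width="640" height="360"
             layout="responsive"
             alt="hero image"></amp-img>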

Addressing your other points:

- Users who don't use Google services don't suffer. AMP is not Google-exclusive: all the major search engines (Bing, Yahoo, Yandex, etc.) are stakeholders in the AMP standard and are free to support it. AFAIK there is nothing in the AMP standard that favors Google over other search engines or any other platform that might support AMP.

- Not sure how web developers suffer more from AMP. I'd think web developers would suffer more from trying to wrangle their bloated website's performance independently than from adopting a standard toolkit that enforces best practices and enables difficult/expensive optimizations out of the box (see the sketch of the required markup after this list).

- It's not clear to me how the internet as a whole will suffer, but I suspect this is just general hyperbole and not a specific point.
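
For reference, the "standard toolkit" is mostly a constrained document format. A minimal AMP page looks roughly like this (a from-memory sketch; see amp.dev for the canonical boilerplate. The required inline amp-boilerplate CSS is elided, and example.com stands in for the page's canonical URL):

    <!doctype html>
    <html amp>
    <head>
      <meta charset="utf-8">
      <meta name="viewport" content="width=device-width,minimum-scale=1">
      <link rel="canonical" href="https://example.com/article.html">
      <style amp-boilerplate>/* required hide-until-ready CSS elided */</style>
      <script async src="https://cdn.ampproject.org/v0.js"></script>
    </head>
    <body>
      <h1>Hello AMP</h1>
    </body>
    </html>

Everything else (no arbitrary author JavaScript, size-declared media, capped inline CSS) is enforced by the validator rather than left to convention.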


How is that relevant?



