No, it isn't. It's designed to make the user experience better on sites that frequently host Google ads (and that also often contain a ton of bloat, third-party JS, poorly constructed DOMs, awful CSS, etc.).
The only way Google could proactively "solve" this problem was by creating a "standard", and then also offering to absorb end-user traffic for sites that adopted the standard. FWIW, AMP is an open standard that is neither owned nor contributed to solely by Google.
7. Looking at the last few merged PRs, nearly everyone involved is a Google employee. I realize this could be a coincidence, but I'm not going to analyze the whole repo.
8. Three of the seven TSC members are Google employees.
Regardless, until Google issues a legally binding release of the project to an independent organization, it is owned by Google. The TSC and AC could be removed at Google's whim.
Why is that the only way? Seems like they could easily have achieved the same result by significantly penalizing sites based on load time and number of external requests.
A fast load time when a page is indexed does not guarantee a fast load time when it is served up to the actual viewer. Serving the page from cache is the only way to guarantee that the page will still be fast when the user wants to view it.
Because users want relevant search results much more than fast websites. Google already factors in a website's performance in their rankings, but weighing it too much over content relevance will make search results worse.
If they actually cared that much about making the results "relevant", they wouldn't mix a bunch of irrelevant suggestions into the results page, each marked with "missing: <query_term>" pointing out exactly how they ignored part of the user's request.
By that logic then, what's the point of AMP if Google is saying page load speeds aren't really that big of a factor? Why go through the trouble of deriving a whole new subset of HTML?
Because users want relevant search results much more than fast websites. Google already factors in a website's performance in their rankings, but weighing it too much over content relevance will make search results worse.
What's the difference between influencing positions and visibility based on AMP support vs overall page performance?
If visibility is influenced by AMP, then Google benefits, users of Google services likely benefit, web developers suffer, users who reach the content through anything other than Google's services continue to suffer (because companies will keep maintaining two versions of the website: a bloated version with 100 external tracking requests that gets shared on twitter/reddit/facebook/hn/etc, and an AMP version that only appears on Google's services), and the internet as a whole suffers. Whereas if visibility were influenced by page speed plus the number of external requests, everyone would benefit.
- AMP is a transparent and unambiguous standard that leaves no uncertainty as to whether you are somehow "performant enough" to qualify for the simple but limited visibility boost (referring to the news carousel)
- AMP prevents important usability problems beyond performance, like page content jumping
- AMP can enable advanced/extreme performance optimizations by default that are somewhat rare in practice (e.g. only loading images above the fold), aren't really possible to do safely/properly without a spec like AMP (e.g. preloading content before the user clicks the link without unpredictably disrupting the website's servers), or are sometimes avoided due to cost (e.g. fast global caching with Google's impressive CDN). Important for users in the developing world. (A rough markup sketch follows below.)
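To make the last two points concrete, here is a minimal sketch of roughly what an AMP page looks like (the required amp-boilerplate CSS and noscript fallback are abbreviated, and the URLs are placeholders; see https://amp.dev/ for the real spec). Images go through <amp-img>, which must declare dimensions up front, so the runtime can reserve layout space (no content jumping) and decide when to actually fetch the image:

    <!doctype html>
    <html amp>
      <head>
        <meta charset="utf-8">
        <!-- the AMP runtime, served from Google's CDN -->
        <script async src="https://cdn.ampproject.org/v0.js"></script>
        <link rel="canonical" href="https://example.com/article.html">
        <meta name="viewport" content="width=device-width">
        <style amp-boilerplate>/* required boilerplate CSS, abbreviated here */</style>
      </head>
      <body>
        <h1>Hello AMP</h1>
        <!-- width/height are mandatory, so the layout never shifts;
             the runtime decides when the image is actually loaded -->
        <amp-img src="https://example.com/photo.jpg"
                 width="600" height="400" layout="responsive"></amp-img>
      </body>
    </html>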
Addressing your other points:
- Users who don't use Google services don't suffer. AMP is not Google-exclusive; all the major search engines (like Bing, Yahoo, Yandex) are stakeholders in the AMP standard and are free to support AMP. AFAIK there is nothing in the AMP standard that favors Google over other search engines or any other platform that might support AMP.
- Not sure how web developers suffer more from AMP. I'd think web developers would suffer more from trying to wrangle their bloated website's performance independently than from using a standard toolkit that enforces best practices and enables difficult/expensive optimizations out of the box.
- It's not clear to me how the internet as a whole will suffer, but I suspect this is just general hyperbole and not a specific point.
The only way Google could proactively "solve" this problem was by creating a "standard", and then also offering to absorb end-user traffic for sites that adopted the standard. FWIW, AMP is an open standard that is neither owned nor contributed to solely by Google.
https://amp.dev/