Google adding “Fast page” label to Chrome browser (coywolf.news)
8 points by ziodave on Aug 28, 2020 | 4 comments



This is such a load of bollocks!

Just for the 'lulz', I once ran one of my blog sites through Google's website speed analyser thingy. I'm quite proud of the leanness of the site in question: it's a pretty lightweight statically generated site which uses no external resources and has some very minimal hand-written JS and CSS [so no bloated frameworks].

Needless to say Google informed me that the site was not optimised and recommended a load of improvements, most of which seemed to consist of loading a heap of junk from Google which would have increased the size of the site many times over.

AMP needs to die and this proposed "Fast Page" banner in Chrome is just another example of Google trying to edge out websites which don't contribute to their near monopoly on search indexing and advertising.


I tried Google's PageSpeed Insights (is that what you were using?) for the following sites. In at least one case it recommended removing Google Analytics to improve speed.

(Score) URL

(100) http://esp32.rosskoepke.com/

(100) https://motherfuckingwebsite.com/

(100) http://bettermotherfuckingwebsite.com/

(100) http://news.ycombinator.com/

(98) https://discourse.joinmastodon.org/

(91) https://docs.python.org/

(85) https://joinmastodon.org/

(69) https://doc.rust-lang.org/stable/book/

(46) http://news.google.com/

(17) http://www.medium.com/

(11) http://www.reddit.com/

It's certainly something where reasonable people could disagree, but to me these scores match up well with my own personal perceptions - the Rust docs website is about the most sluggish I'd like a website I use regularly to be, and it sits in the middle of the "orange zone". Notably, the tool also issues failing scores for Google's own property (Google News), as well as for Medium, which uses Google Analytics.

I looked through the suggestions it provided for each of these websites, and the only recommendation I saw that pointed towards a Google/Alphabet property or technology was this:

"Serve images in next-gen formats (save 3.3 seconds): Image formats like JPEG 2000, JPEG XR, and WebP often provide better compression than PNG or JPEG, which means faster downloads and less data consumption. Learn more." It then listed all the images and their sizes under PNG (current format served by Reddit) vs. their size using one of these compression formats. WebP is a Google technology, but JPEG 2000 and JPEG XR are not.

For the poorest performer, Reddit, it recommended that as well, along with:

Avoid an excessive DOM size (currently 1,663 elements): Consider using a "windowing" library like `react-window` to minimize the number of DOM nodes created if you are rendering many repeated elements on the page. Also, minimize unnecessary re-renders using shouldComponentUpdate, PureComponent, or React.memo, and skip effects only until certain dependencies have changed if you are using the Effect hook to improve runtime performance.
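To unpack the windowing suggestion: only the rows currently scrolled into view get DOM nodes; everything else is created and destroyed as you scroll. A rough sketch with react-window (the `posts` data and the sizes are made up):

    import { FixedSizeList } from 'react-window';

    // Each row only exists in the DOM while it is visible
    const Row = ({ index, style }) => (
      <div style={style}>{posts[index].title}</div>
    );

    <FixedSizeList height={600} width={400}
                   itemCount={posts.length} itemSize={48}>
      {Row}
    </FixedSizeList>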

For the discussion forum, https://discourse.joinmastodon.org/, the tool had only two recommendations:

Remove unused CSS (1.35 s): Remove dead rules from stylesheets and defer the loading of CSS not used for above-the-fold content to reduce unnecessary bytes consumed by network activity.

Eliminate render-blocking resources (1.08 s): Resources are blocking the first paint of your page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles.
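For what it's worth, both of those boil down to one pattern: inline the small amount of CSS needed for first paint and load everything else without blocking it. A rough sketch (file names made up, using the common preload/onload trick):

    <head>
      <style>/* critical above-the-fold rules inlined here */</style>
      <!-- Non-critical CSS loads without blocking first paint -->
      <link rel="preload" href="/css/rest.css" as="style"
            onload="this.onload=null;this.rel='stylesheet'">
      <noscript><link rel="stylesheet" href="/css/rest.css"></noscript>
      <!-- Non-critical JS deferred until after parsing -->
      <script src="/js/app.js" defer></script>
    </head>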

For https://motherfuckingwebsite.com/ its top time-saving recommendation was... to remove Google Analytics:

Avoid long main-thread tasks (2 long tasks found): Lists the longest tasks on the main thread, useful for identifying the worst contributors to input delay.

    URL                                        Duration
    /analytics.js (www.google-analytics.com)   105 ms
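Worth noting: Google's own recommended embed already loads analytics.js with the async attribute, so the 105 ms here is main-thread execution time once the script arrives, not blocked parsing. The only real "fix" the tool can offer is dropping or delaying the script:

    <!-- async stops the download from blocking parsing, but the
         script still costs main-thread time when it executes -->
    <script async src="https://www.google-analytics.com/analytics.js"></script>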


It might have been PageSpeed that I used. I can't remember for sure. It was a long time ago, back when Google were first pushing their AMP stuff.

If it was that tool I used, maybe they've improved it since then, or maybe it works better on more complex sites. As I said, my site was very lean and dependency-free, and the suggestions offered [at the time] would have made it more bloated, slower, and reliant on external dependencies.

EDIT: Dredging my memory, I think one of the 'problems' identified was the recommendation you've mentioned above to use more modern image formats.

Ironic, because I'd only recently shaved quite a bit off the size of the imagery on the site by redesigning several of the images [including the main banner] in SVG format. I think Google also recommended I use image srcset [0] to serve images sized for the end-user's screen. That would not only have made the site bigger and arguably slightly slower, but would also have required producing lots of extra image assets, whereas the SVG approach I'd taken actually made the site smaller, and a vector image can be rendered at any size required.
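To illustrate the trade-off, the srcset version it wanted would have looked something like this (breakpoints and file names invented), one pre-rendered raster per width, versus the single vector asset:

    <!-- What the tool wanted: several raster assets -->
    <img src="/img/banner-800.png"
         srcset="/img/banner-400.png 400w,
                 /img/banner-800.png 800w,
                 /img/banner-1600.png 1600w"
         sizes="100vw" alt="Banner">

    <!-- What the site actually does: one vector, any size -->
    <img src="/img/banner.svg" alt="Banner">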

As I say, they may have improved the tool since then. But I got the impression at the time that it was pretty crude: it seemed to just check whether the site employed X and, if not, recommended that it should, regardless of the fact that the site might actually implement Y instead, which was even more efficient.

[0] https://developer.mozilla.org/en-US/docs/Web/API/HTMLImageEl...


Is the measure of speed the one you get from running a Lighthouse report in Chrome, or is it referring to some other speed-ranking metric they're planning to use?

In other words, how do you measure your page to see if it counts as a “fast page”?
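For reference, if it is the Lighthouse score, you can reproduce the audit locally with the Lighthouse CLI (assuming Node is installed; the target URL is a placeholder):

    npm install -g lighthouse
    # Run just the performance audit and open the report when done
    lighthouse https://example.com/ --only-categories=performance --view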



