Google Confirms Chrome Usage Data Used to Measure Webpage Speed

by Tom Anthony, APR 9th, 2018

During a conversation with Google's John Mueller at SMX Munich in March, he shared an interesting bit of detail about how Google evaluates site speed nowadays. It got a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don't need to treat optimizing site speed for Google as a separate task from optimizing for users.

Previously, it has not been clear how Google evaluates site speed, and it was generally believed to be measured by Googlebot during its visits - a belief reinforced by the existence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google is doing - they obviously want the most realistic data possible, but it's a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

In this post, I want to quickly recap the pertinent information around this news and try to understand what it might mean for users.

Google Search Console

Firstly, we should clarify our understanding of what the "time spent downloading a page" metric in Google Search Console is telling us. Many of us will recognize graphs like this one:

Until recently, I was unclear about exactly what this graph was showing me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):

John clarified what this graph is showing:

It's technically not "downloading the page" but rather "receiving data in response to requesting a URL" - it's not based on rendering the site, it includes all requests made.

And that it is:

it is the average over all requests for that day

Because Google may be fetching a very different set of resources each time it crawls your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the true performance of your site.

Because of this, John points out that:

Focusing blindly on that number doesn't make sense.

With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I'm a big fan).

Okay, so now we understand that graph and what it represents, let's look at the next option: the Google WRS.

Googlebot & the Web Rendering Service

Google's WRS is their headless browser mechanism based on Chrome 41, which is used for things like "Fetch as Googlebot" in Search Console, and is increasingly what Googlebot is using when it crawls pages.

However, we know that this isn't how Google evaluates page speed, thanks to a Twitter conversation between Aymen Loukil and Google's Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that the WRS is not responsible for evaluating site speed:

At the time, Gary was unable to clarify what was being used to evaluate site speed (perhaps because the Chrome User Experience Report hadn't been announced yet). It seems that things have progressed since then, however. Google is now able to tell us a bit more, which takes us on to the Chrome User Experience Report.

Chrome User Experience Report

Introduced in October last year, the Chrome User Experience Report "is a public dataset of key user experience metrics for top origins on the web," whereby "performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled."

Essentially, certain Chrome users allow their browser to report back load-time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public dataset.

In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he clarifies they're using this data along with some other data sources (he doesn't say which, though notes that this is in part because the dataset does not cover all domains).

At SMX, John also mentioned that Google's PageSpeed Insights tool now includes data from the Chrome User Experience Report:
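If you want to pull that same field data programmatically, a minimal sketch along these lines works against the PageSpeed Insights API. It assumes the v4 endpoint and the `loadingExperience` response field that were current at the time of writing; both the API version and the field names may have changed since, so treat it as illustrative rather than definitive:

```python
# Minimal sketch: query the PageSpeed Insights API for a URL and print the
# Chrome UX Report ("loadingExperience") field data it returns.
# Assumes the v4 endpoint and response shape current in early 2018.
import requests

API = "https://www.googleapis.com/pagespeedonline/v4/runPagespeed"

def fetch_field_data(url, strategy="mobile", api_key=None):
    params = {"url": url, "strategy": strategy}
    if api_key:
        params["key"] = api_key  # optional, raises the request quota
    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    # "loadingExperience" holds the real-user metrics aggregated from CrUX.
    return data.get("loadingExperience", {})

if __name__ == "__main__":
    experience = fetch_field_data("https://www.example.com/")
    print("Overall category:", experience.get("overall_category"))
    for name, metric in experience.get("metrics", {}).items():
        print(name, "median:", metric.get("median"), "category:", metric.get("category"))
```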

The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you're into that sort of thing!
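For example, here is a minimal sketch of querying that public dataset with the BigQuery Python client. It assumes the monthly table naming (`chrome-ux-report.all.YYYYMM`) and the nested histogram schema the dataset used at launch; check the current documentation before relying on the exact field names:

```python
# Minimal sketch: pull the first-contentful-paint histogram for one origin
# from the public Chrome UX Report dataset in BigQuery.
from google.cloud import bigquery

def fcp_histogram(origin, table="chrome-ux-report.all.201803"):
    client = bigquery.Client()  # needs a GCP project with BigQuery enabled
    query = f"""
        SELECT
            bin.start AS bin_start,
            SUM(bin.density) AS density
        FROM `{table}`,
            UNNEST(first_contentful_paint.histogram.bin) AS bin
        WHERE origin = @origin
        GROUP BY bin_start
        ORDER BY bin_start
    """
    job = client.query(
        query,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("origin", "STRING", origin)]
        ),
    )
    # Summing density across connection types and form factors gives the
    # overall share of page loads falling into each FCP bucket.
    return [(row.bin_start, row.density) for row in job]

if __name__ == "__main__":
    for bin_start, density in fcp_histogram("https://www.example.com"):
        print(f"{bin_start}ms+: {density:.4f}")
```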

We can't be sure what the other factors Google is using are, but we now know they are definitely using this data. As I mentioned above, I also imagine they are using data on more sites than are included in the public dataset, but this is not confirmed.

Focus on users

Importantly, this means that there are changes you can make to your website that Googlebot is not capable of detecting, but which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not currently support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.
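As a rough illustration of that point, the sketch below checks which HTTP version a server negotiates with a client that supports HTTP/2 - something a real browser experiences but Googlebot currently does not. It uses the third-party httpx library purely for convenience; this is just one way to check, and it has nothing to do with how Google itself measures anything:

```python
# Minimal sketch: see which HTTP version a server negotiates with an
# HTTP/2-capable client. Requires: pip install "httpx[http2]"
import httpx

def negotiated_http_version(url):
    with httpx.Client(http2=True) as client:
        response = client.get(url)
        return response.http_version  # e.g. "HTTP/2" or "HTTP/1.1"

if __name__ == "__main__":
    for url in ["https://www.google.com/", "https://www.example.com/"]:
        print(url, "->", negotiated_http_version(url))
```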

The same is true if you were to use service workers for advanced caching behaviors - Googlebot wouldn't be aware, but users would. There are undoubtedly other such examples.

Essentially, this means there is no longer a reason to worry about page speed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

If you are unsure where to look for site speed guidance, then you should look at:

  • How fast is fast enough? Next-gen performance optimization - the 2018 edition by Bastian Grimm
  • Site Speed for Digital Marketers by Mat Clayton

That's all for now! If you have questions, please leave them in the comments and I'll do my best to answer! Thanks!

Protecting Users From Extension Cryptojacking

by James Wagner, APR 2nd, 2018

As the extensions ecosystem continues to evolve, we remain focused on empowering developers to build innovative experiences while keeping our users as safe as possible. Over the past few months, there has been a rise in malicious extensions that appear to provide useful functionality on the surface, while embedding hidden cryptocurrency mining scripts that run in the background without the user's consent. These mining scripts often consume significant CPU resources, and can severely impact system performance and power consumption.

Until now, Chrome Web Store policy has permitted cryptocurrency mining in extensions as long as it is the extension's single purpose, and the user is adequately informed about the mining behavior. Unfortunately, approximately 90% of all extensions with mining scripts that developers have attempted to upload to Chrome Web Store have failed to comply with these policies, and have been either rejected or removed from the store.

Starting today, Chrome Web Store will no longer accept extensions that mine cryptocurrency. Existing extensions that mine cryptocurrency will be delisted from the Chrome Web Store in late June. Extensions with blockchain-related purposes other than mining will continue to be permitted in the Web Store.

The extensions platform provides powerful capabilities that have enabled our developer community to build a vibrant catalog of extensions that help users get the most out of Chrome. Unfortunately, these same capabilities have attracted malicious software developers who attempt to abuse the platform at the expense of users. This policy is another step forward in ensuring that Chrome users can enjoy the benefits of extensions without exposing themselves to hidden risks.
