Google has a long history of focus on web page loading speed. Indeed, PageSpeed Insights, a tool most SEOs will be familiar with, is over a decade old. Speed is important to Google because it's important to users: it frequently tops surveys of the aspects of the web experience that matter most to people (and slow sites are often users' biggest bugbear).

For Google, web page speed is a useful proxy for overall user experience, not least because it's relatively straightforward to measure in bulk. It could also be measured at crawl time, and Google's initial focus was on metrics like Time To First Byte (TTFB). However, these technical measures aren't ideal, as they may not reflect what a site's users actually experience. Googlebot has a lot of bandwidth, after all.

The logical step was to find some way to monitor actual user experience, which would also allow for additional metrics not necessarily directly related to speed. Luckily for Google, they make the Chrome web browser which has over 60% market share globally.

The Chrome User Experience Report

If you install Google Chrome and the following apply to you, webpage performance stats are sent from your browser to the Chrome User Experience Report (CRUX for short):

  • Have opted in to sync your browsing history
  • Have not set up a sync passphrase
  • Have usage statistic reporting enabled

This is the option you'll have seen if you've installed Chrome:

Google Chrome usage tracking opt in

The report is aggregate information from many millions of users, and covers millions of sites (called "origins" in CRUX - basically, protocol + subdomain + domain, e.g. https, www, seothing.co.uk). How many sites? Below is the number of sites appearing in CRUX over time:

CRUX was first published with a 10,000 URL sample back in 2017, and has since grown to cover over 8 million sites.

Core Web Vitals

Initially, Google provided data from CRUX via Search Console on a few loading time metrics like First Contentful Paint (the time before meaningful content is displayed to the user). Over time, these have evolved into metrics that go beyond just timings and measure other aspects of a user's experience. The current web vitals metrics are as below:

  • First contentful paint (FCP) - the time taken until any content is displayed to the user
  • Largest contentful paint (LCP) - the time until the largest item of content (e.g. a heading or image) is displayed, likely to be seen by users as when the page is "ready"
  • First input delay (FID) - the delay between a user's click or tap on your page and the page responding
  • Time to Interactive (TTI) - the time until the page can be reliably interacted with
  • Total blocking time (TBT) - the total time the page was unresponsive after the First Contentful Paint has occurred
  • Cumulative layout shift (CLS) - measures the amount a page layout "jumps" or shifts without user input
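
CLS is worth unpacking, since it isn't a timing at all. Broadly, Chrome scores each individual layout shift as the fraction of the viewport affected multiplied by the distance the content moved (also as a fraction of the viewport), and CLS accumulates those scores. A rough Python sketch, with hypothetical shift values:

```python
# Sketch of how Chrome scores layout shifts: each shift scores roughly
# impact fraction * distance fraction, and CLS accumulates the scores.
# The shift values below are hypothetical, for illustration only.

def layout_shift_score(impact_fraction: float, distance_fraction: float) -> float:
    """Score for a single layout shift (both fractions are between 0 and 1)."""
    return impact_fraction * distance_fraction

def cumulative_layout_shift(shifts: list[tuple[float, float]]) -> float:
    """CLS: the accumulated shift scores over the page's lifetime."""
    return sum(layout_shift_score(i, d) for i, d in shifts)

# A late-loading banner that affects half the viewport and pushes content
# down by 14% of the viewport scores 0.07; two such shifts score 0.14.
print(cumulative_layout_shift([(0.5, 0.14), (0.5, 0.14)]))  # → 0.14
```

This is why one small shift rarely matters, but repeated shifts (ads, late-loading fonts, images without dimensions) add up quickly.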

All of these metrics can affect each other. Google says that three of them are the most important - LCP, FID and CLS. These are the "core" Web Vitals sites should aim to get right. Getting it right means meeting the requirements below:

  • LCP - 2.5 seconds or less is good (between 2.5 and 4 seconds "needs improvement")
  • FID - 100ms or less is good (between 100ms and 300ms "needs improvement")
  • CLS - 0.1 or less is good (between 0.1 and 0.25 "needs improvement")
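
These thresholds are simple enough to check mechanically. A minimal sketch in Python - the threshold values are taken from the list above, and anything beyond the "needs improvement" bound is treated as "poor", following Google's reporting convention:

```python
# Classify CRUX metric values against the Core Web Vitals thresholds
# listed above. Anything beyond the "needs improvement" bound is "poor".

THRESHOLDS = {
    # metric: (good_max, needs_improvement_max)
    "LCP": (2.5, 4.0),   # seconds
    "FID": (100, 300),   # milliseconds
    "CLS": (0.1, 0.25),  # unitless score
}

def rate(metric: str, value: float) -> str:
    good_max, ni_max = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    if value <= ni_max:
        return "needs improvement"
    return "poor"

def passes_core_web_vitals(lcp_s: float, fid_ms: float, cls: float) -> bool:
    """A page passes only when all three core metrics are rated good."""
    return all(rate(m, v) == "good"
               for m, v in (("LCP", lcp_s), ("FID", fid_ms), ("CLS", cls)))

print(rate("LCP", 3.1))                       # → needs improvement
print(passes_core_web_vitals(2.1, 80, 0.05))  # → True
```

Note that a page only "passes" when all three core metrics land in the good range; one poor metric is enough to fail.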

Are Core Web Vitals a ranking factor?

Both loading speed and user experience are already part of Google's algorithm, although not as hugely significant factors. In May 2021, Core Web Vitals will officially be part of the algorithm, within a page experience update. It is likely that at that point both speed and user experience will become more significant factors. How much of a factor? That's open to debate. Likely page experience will not be a major direct factor, although it's likely to be a significant indirect one. There are cascading effects of offering users a good experience and, in that sense, Core Web Vitals will just be a way to measure how likely it is that your users are happy with your site from a technical perspective.

One of the reasons that page experience is unlikely to be a major factor is that many major sites have poor loading times and user experience, and Google is unlikely to want to demote them in results for that alone. It is rumoured, but not yet confirmed, that Google may display a label in search results next to sites passing Core Web Vitals, with the intention of increasing the clickthrough rate for those sites. Something similar happened when Google introduced AMP, although there is little evidence that the little lightning bolt icon actually worked:

How AMP results display in search results

How many sites meet Core Web Vitals requirements?

The simple answer is no, the overwhelming majority of sites do not meet Google's Core Web Vitals requirements. Only about a quarter of sites in the CRUX database meet the requirements and that number has not been improving over time:

This makes it unlikely that Google can make these specific measures a major ranking factor. More likely, Google's intention is to "nudge" web developers towards creating more performant sites and, as that happens, these metrics can become more important as direct ranking factors. Probably less than 25% of the web as a whole meets these thresholds, since the CRUX database covers the most popular sites, which are more likely to have infrastructure like CDNs that helps with loading speeds.

How do I check if my site meets Core Web Vitals requirements?

There are numerous ways to do this, but the simplest is to run any page from your site through PageSpeed Insights. Once the report is generated, hit the "Show Origin Summary" link:

Show origin summary in PageSpeed Insights

You'll then see a summary of CRUX data for your site, assuming you have enough traffic to be in the database:

CRUX data in PageSpeed Insights
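
If you'd rather pull this data programmatically, PageSpeed Insights also exposes a JSON API (v5) that includes the same origin-level CRUX summary. A rough Python sketch - the endpoint and response field names below match the API documentation at the time of writing, but treat the exact structure as an assumption and verify it against a live response:

```python
# Sketch: fetch the origin-level CRUX summary via the PageSpeed Insights
# v5 JSON API. Field names ("originLoadingExperience", "metrics",
# "percentile", "category") are assumptions based on the API docs -
# check them against a live response before relying on them.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_psi(url: str, api_key: str) -> dict:
    """Call the live API (requires an API key; not executed in this sketch)."""
    query = urlencode({"url": url, "key": api_key})
    with urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        return json.load(resp)

def origin_summary(psi_response: dict) -> dict:
    """Extract the per-metric percentile and category for the whole origin."""
    metrics = psi_response.get("originLoadingExperience", {}).get("metrics", {})
    return {name: (m.get("percentile"), m.get("category"))
            for name, m in metrics.items()}

# Hand-made response fragment, for illustration only:
sample = {"originLoadingExperience": {"metrics": {
    "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2300, "category": "FAST"},
    "CUMULATIVE_LAYOUT_SHIFT_SCORE": {"percentile": 8, "category": "FAST"},
}}}
print(origin_summary(sample))
```

As in the web interface, the origin-level data only appears if your site has enough traffic to be in the CRUX database.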

The second-best way is to check in Search Console. Once you're logged in, there is a Core Web Vitals link in the "Enhancements" category:

Core Web Vitals reporting in Search Console