Checking your site’s Core Web Vitals in PageSpeed Insights or Search Console and seeing a blank section instead of colorful charts is like hitting a brick wall.
That’s because Google’s Core Web Vitals have become essential for site owners who don’t want to guess how users experience a website but rather have the numbers to back it up.
And more specifically, leveraging data about crucial moments in the user journey, like:
— How fast the main content loads (Largest Contentful Paint);
— How quickly the page responds to user input (Interaction to Next Paint);
— How visually stable the page is while loading (Cumulative Layout Shift).
Moreover, page experience is officially a ranking signal in Google Search, so a passed Core Web Vitals assessment not only puts you in front of more people but also helps you engage and convert them better and faster.
So, to what extent does the missing Core Web Vitals data affect your online business? To solve this riddle, you first need to understand the methodology behind Core Web Vitals data sourcing.
Google primarily relies on two sources for collecting this valuable data: the Chrome User Experience (CrUX) Report and Lighthouse audits. These sources offer insights into what website owners can do to enhance the user experience further.
The CrUX report is a rich source of real-world user experience data. It collects field data from millions of Chrome users as they browse the web, covering over 16 million origins — a valuable resource for understanding the broader web performance landscape.
In contrast, Lighthouse is an open-source tool developed by Google used to conduct lab tests of web performance. It simulates user interactions in a controlled environment and provides detailed performance metrics.
Both field and lab data are presented in your Google PageSpeed Insights report.
Field data is derived from real users' experiences as they visit your website during their daily online activities. It reflects how your pages actually perform for visitors, offering a genuine assessment of a website's user experience.
Field data is represented by the Core Web Vitals assessment in your report, and its absence is the reason you're reading this article.
Lab data is generated in controlled test environments. While it allows website owners to identify and address specific performance bottlenecks, it doesn't capture the variations and nuances of real-world usage.
Pros and Cons of Field Data
Field data, sourced from the CrUX Report, offers several advantages and some limitations.
A significant advantage is its authenticity. Since it represents actual user experiences, it provides a realistic view of a website's performance from a user's perspective. This can be invaluable for identifying critical issues that impact user satisfaction.
Failing your Core Web Vitals assessment is a tell-tale sign you need to focus your attention on your site’s performance if you want to leverage benefits like:
— Higher visibility in search results;
— Better user engagement and lower bounce rates;
— Faster, smoother experiences that convert more visitors.
The downsides of field data on Core Web Vitals include:
— It lags behind your latest changes, since CrUX aggregates data over a rolling 28-day window;
— It requires sufficient traffic before any data appears at all;
— It offers limited diagnostic detail, telling you what users experienced but not always why.
Nonetheless, the benefits of analyzing field data and optimizing Core Web Vitals for your business outweigh the drawbacks significantly.
If you see no data for your Core Web Vitals in Google Search Console, it might be that your property is new, and the console is still checking the CrUX database.
Not your case? Well, let’s probe deeper.
Clicking on the tooltip next to the “No data available” message in your Google Search Console or Google PSI report reveals the following:
“The Chrome User Experience Report does not have sufficient real-world speed data for this page.”
Simply put, you see no field data because your website hasn’t generated enough traffic on desktop and/or mobile. It’s always worth checking both form factors, as they are sourced separately.
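You can also check this programmatically by querying the CrUX API for each form factor — a 404 response means CrUX has no data for that origin and device type. Below is a minimal sketch (TypeScript, runnable in Node 18+ where fetch is global); the API key value is a placeholder you’d create in the Google Cloud console:

```typescript
// Minimal sketch: query the CrUX API for an origin, per form factor.
// CRUX_API_KEY is a placeholder — create your own key in Google Cloud.
const CRUX_API_KEY = "YOUR_API_KEY";
const endpoint = `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}`;

async function checkCruxData(origin: string, formFactor: "PHONE" | "DESKTOP") {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ origin, formFactor }),
  });

  if (response.status === 404) {
    // CrUX has no field data for this origin + device combination.
    console.log(`${origin} (${formFactor}): no CrUX data available`);
    return;
  }

  const { record } = await response.json();
  console.log(`${origin} (${formFactor}):`, Object.keys(record.metrics));
}

// Desktop and mobile are sourced separately, so check both.
await checkCruxData("https://example.com", "PHONE");
await checkCruxData("https://example.com", "DESKTOP");
```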
So, you might be thinking that growing your website traffic should fix the issue, right?
It’s not quite so simple.
The CrUX report aggregates real-world speed data for origins following several essential requirements:
Back in 2021, Google’s Martin Splitt further clarified that Google doesn’t publish the specific traffic thresholds a site needs to hit.
So much for hoping for specific numbers.
You should also consider that a website may never become part of the CrUX dataset. When you think about it, CrUX tracks 16 million origins. Seems a lot, right?
However, compared to the 1.13 billion websites on the Internet today, the CrUX dataset covers barely more than 1% of them.
To summarize:
— Your site needs enough traffic from opted-in Chrome users to appear in CrUX;
— Google doesn’t disclose the exact threshold;
— Even then, inclusion in the dataset isn’t guaranteed.
While Google can’t guarantee your website will enter the CrUX dataset so you can analyze your Core Web Vitals based on field data, it doesn’t mean your hands are tied.
Until the CrUX report returns readable data, you can focus on alternative methods like monitoring other performance, server, and network metrics, performance auditing with GTmetrix, and analyzing user feedback and behavior.
At the end, you’ll also find a bonus tip most site owners don’t leverage ;)
When field data is missing, your next best move is to scroll down in your Google PSI report and start with the lab-based equivalents of Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS). Since Interaction to Next Paint (INP) doesn’t have a lab-based equivalent, Total Blocking Time (TBT) is the closest proxy for responsiveness, and First Contentful Paint (FCP) and Speed Index (SI) round out the picture.
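Even without CrUX, you can observe LCP and CLS in your own browser session using the standard PerformanceObserver API — a quick sanity check, not a substitute for real field data:

```typescript
// Observe LCP and CLS for the current page via PerformanceObserver.
// Run this early in the page's lifecycle (e.g., an inline module script).
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const lastEntry = entries[entries.length - 1];
  console.log("LCP candidate (ms):", lastEntry.startTime);
}).observe({ type: "largest-contentful-paint", buffered: true });

let clsScore = 0;
new PerformanceObserver((list) => {
  // LayoutShift entries aren't in the default TS DOM typings, hence the cast.
  for (const entry of list.getEntries() as any[]) {
    // Ignore shifts triggered by recent user input, as CLS does.
    if (!entry.hadRecentInput) clsScore += entry.value;
  }
  console.log("CLS so far:", clsScore);
}).observe({ type: "layout-shift", buffered: true });
```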
To reduce TBT, you can:
— Minimize or defer non-essential JavaScript;
— Optimize and limit the use of third-party scripts;
— Utilize web workers to offload heavy tasks;
— Implement asynchronous loading for scripts.
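For example, here’s a minimal sketch of the web-worker approach from the list above, moving a hypothetical heavy computation off the main thread so it no longer contributes to TBT:

```typescript
// Build a worker from an inline script so the example is self-contained.
// In a real project, you'd keep the worker code in its own file.
const workerSource = `
  self.onmessage = (event) => {
    let total = 0;
    for (let i = 0; i < event.data.items; i++) total += Math.sqrt(i);
    self.postMessage(total);
  };
`;
const worker = new Worker(
  URL.createObjectURL(new Blob([workerSource], { type: "text/javascript" }))
);

worker.onmessage = (event: MessageEvent<number>) => {
  console.log("Result computed off the main thread:", event.data);
};

// The main thread stays free to respond to input while this runs,
// so the work no longer blocks rendering or interaction.
worker.postMessage({ items: 1_000_000 });
```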
To improve First Contentful Paint, you should:
— Reduce server response times;
— Minimize render-blocking resources;
— Use lazy loading for non-essential resources;
— Reduce JavaScript execution time.
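To illustrate the last two points, here’s a sketch that marks below-the-fold images as lazy and postpones non-essential JavaScript until the browser is idle. The selector and module names are hypothetical placeholders:

```typescript
// Mark below-the-fold images as lazy so they don't compete with first paint.
// "img.below-fold" is an assumed selector — match it to your own markup.
document.querySelectorAll<HTMLImageElement>("img.below-fold").forEach((img) => {
  img.loading = "lazy";    // fetched only when near the viewport
  img.decoding = "async";  // decoding won't block rendering
});

// Postpone non-essential work until the browser is idle.
// "./chat-widget" and loadChatWidget are hypothetical placeholders.
const whenIdle: (cb: () => void) => void =
  "requestIdleCallback" in window
    ? (cb) => window.requestIdleCallback(cb)
    : (cb) => setTimeout(cb, 1);

whenIdle(() => {
  import("./chat-widget").then((m) => m.loadChatWidget());
});
```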
To improve your site’s Speed Index:
— Optimize and compress images and other media files;
— Minimize the use of large, above-the-fold images;
— Implement code-splitting to load only necessary JavaScript on the initial page load.
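The code-splitting point is easy to act on with dynamic import(), which most bundlers turn into a separate chunk loaded on demand. A minimal sketch, assuming a hypothetical ./charts module:

```typescript
// Only load the charting code when the user actually needs it.
// "./charts" and renderDashboard are hypothetical — substitute your own module.
const button = document.querySelector<HTMLButtonElement>("#show-dashboard");

button?.addEventListener("click", async () => {
  // The bundler splits this module into its own chunk,
  // so it isn't part of the initial page load.
  const { renderDashboard } = await import("./charts");
  renderDashboard(document.querySelector("#dashboard")!);
});
```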
GTmetrix provides a more extensive set of performance metrics and customization options that will help you build a better optimization strategy.
To reduce TTFB:
— Optimize server and database performance;
— Use content delivery networks (CDNs);
— Minimize the number of HTTP requests;
— Implement browser caching for frequently requested resources.
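To make the caching point concrete, here’s a minimal Node.js sketch that serves static assets with a long-lived Cache-Control header, so repeat visitors skip the round trip entirely. Paths and max-age values are illustrative, and the file lookup is deliberately simplified:

```typescript
import { createServer } from "node:http";
import { readFile } from "node:fs/promises";

// Illustrative only: don't serve unsanitized URL paths in production.
createServer(async (req, res) => {
  if (req.url?.startsWith("/static/")) {
    try {
      const file = await readFile(`.${req.url}`);
      res.writeHead(200, {
        // One year, immutable: ideal for fingerprinted asset filenames.
        "Cache-Control": "public, max-age=31536000, immutable",
      });
      res.end(file);
      return;
    } catch {
      res.writeHead(404).end();
      return;
    }
  }
  res.writeHead(200, { "Content-Type": "text/html" }).end("<h1>Hello</h1>");
}).listen(3000);
```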
Generally, when implementing techniques to improve TBT, you should see significant improvements in TTI as well.
Resource Loading Metrics (Waterfall): These metrics encompass the load times of specific resources such as images, stylesheets, fonts, and scripts. Monitoring these in a waterfall chart helps identify bottlenecks in the loading sequence.
While there are no specific thresholds, aim to minimize the load times of critical resources that appear above the fold to achieve an overall faster page load.
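You can pull the same waterfall data programmatically with the Resource Timing API. This sketch logs the slowest resources on the current page; the 500 ms cutoff is an arbitrary example:

```typescript
// List the slowest resources loaded by the current page.
// The 500 ms threshold is arbitrary — tune it to your own budget.
const SLOW_THRESHOLD_MS = 500;

const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

resources
  .filter((entry) => entry.duration > SLOW_THRESHOLD_MS)
  .sort((a, b) => b.duration - a.duration)
  .forEach((entry) => {
    console.log(`${entry.initiatorType}: ${entry.name} took ${entry.duration.toFixed(0)} ms`);
  });
```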
To improve resource loading times:
— Compress images and use modern image formats like WebP;
— Optimize and consolidate CSS and JavaScript files;
— Speed up resource loading with priority hints (the fetchpriority attribute) and <link rel="preload">.
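For instance, here’s a sketch that adds a preload hint and raises the priority of a hero image from a script. Ideally the <link> lives directly in your HTML so the preload scanner sees it early; "/img/hero.webp" and "img.hero" are placeholders:

```typescript
// Hint the browser to fetch the LCP image early.
// "/img/hero.webp" is a placeholder — point it at your real hero image.
const preload = document.createElement("link");
preload.rel = "preload";
preload.as = "image";
preload.href = "/img/hero.webp";
document.head.appendChild(preload);

// Raise the fetch priority of the rendered hero image itself.
const hero = document.querySelector<HTMLImageElement>("img.hero");
hero?.setAttribute("fetchpriority", "high");
```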
A web performance budget is a predetermined limit on various performance metrics that your website should adhere to. These metrics can include load times, page size, the number of HTTP requests, and more. The budget serves as a benchmark, setting clear boundaries for how your website should perform to ensure an optimal user experience.
Here are a few simple steps to help you get started with your first web performance budget (a minimal automated check is sketched below):
— Audit your site to establish a performance baseline;
— Pick the metrics that matter most to your business, e.g., LCP, TBT, total page weight, and request count;
— Set realistic targets based on Core Web Vitals thresholds and competitor benchmarks;
— Enforce the budget in your development and deployment workflow;
— Revisit the budget regularly and tighten it as performance improves.
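Once your budget is set, you can enforce it automatically. The sketch below checks total transferred bytes and request count in the browser against hypothetical limits; dedicated tools like Lighthouse CI apply the same idea in your build pipeline:

```typescript
// A tiny in-browser budget check. The limits are hypothetical examples —
// replace them with the budget you've actually set.
const BUDGET = {
  totalTransferKB: 1500, // total bytes over the wire
  requestCount: 75,      // number of network requests
};

const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

// Note: transferSize is 0 for cross-origin resources that don't
// send a Timing-Allow-Origin header.
const totalKB = entries.reduce((sum, e) => sum + e.transferSize, 0) / 1024;

if (totalKB > BUDGET.totalTransferKB) {
  console.warn(`Over budget: ${totalKB.toFixed(0)} KB transferred (limit ${BUDGET.totalTransferKB} KB)`);
}
if (entries.length > BUDGET.requestCount) {
  console.warn(`Over budget: ${entries.length} requests (limit ${BUDGET.requestCount})`);
}
```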
With or without field data, the quest for a faster, more responsive user experience remains a journey worth taking.
You don’t have to go it alone, though. 180K+ site owners like you delegate performance optimization to the most comprehensive tool on the market – NitroPack.
With advanced features that work on autopilot, you can have optimized images, code, and fonts to offer a lightning-fast user experience and grow your business sustainably.
Lora has spent the last 8 years developing content strategies that drive better user experiences for SaaS companies in the CEE region. In collaboration with WordPress subject-matter experts and the 2024 Web Almanac, she helps site owners close the gap between web performance optimization and real-life business results.