DEV Community


The Core Web Vital “Largest Contentful Paint” just changed dramatically

Felix Gessert
CEO of Baqend, a technology for faster websites. Background in distributed database systems and caching research.

In this quick summary we present a finding about the central Google Core Web Vital, the Largest Contentful Paint (LCP), which changed drastically with the recent release of Chrome 88. Overall, we found that the LCP is likely to drop by as much as 20%. Given Google’s upcoming ranking changes for May 2021, the LCP is more important than ever.

Background: What are the Core Web Vitals?

The Core Web Vitals are an initiative by Google to provide better metrics for evaluating page performance and user experience. It had previously been possible to approximate the “felt page speed” of websites using metrics such as the Speed Index (the average time to visibility of page elements) pioneered by WebPagetest, or the First Meaningful Paint (the time of the biggest visual change during page rendering). But these methods have one severe problem: they rely on a synthetic testing environment and video analysis, so they cannot be measured through browser APIs with actual “field data”. Metrics that were available from the browser and the Navigation Timing API, such as the time to first byte and the first paint, on the other hand do not fully reflect the perceived speed.

Google introduced three Core Web Vitals: the Largest Contentful Paint (LCP), the First Input Delay (FID), and the Cumulative Layout Shift (CLS). For each of these metrics, Google also proposes a “traffic light”: thresholds that put page views into three buckets, good, needs improvement, and poor.

The three Core Web Vitals

This is what the Core Web Vitals capture:

  • LCP: measures how fast the page loads. For this, it captures the moment during the rendering process when the largest content element (e.g. text or images) is rendered on the user’s screen.

  • FID: measures how quickly the page responds to user interaction. It is evaluated as the time between the first interaction (e.g. a click or screen tap) and the moment the event is processed by a JavaScript handler.

  • CLS: measures how stable the page rendering is. The calculation is a bit complicated but revolves around how much content jumps or moves, and by what distance, during the rendering process. There are ongoing discussions on how to optimize the calculation to best reflect user-felt “annoyingness”.
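The cut-offs Google publishes for the traffic light can be expressed as a small classifier. The thresholds below are Google’s published ones (LCP and FID in milliseconds, CLS unitless); the code itself is just an illustrative sketch:

```javascript
// Google's published Core Web Vitals thresholds:
// LCP and FID in milliseconds, CLS is a unitless score.
const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 },
  FID: { good: 100, poor: 300 },
  CLS: { good: 0.1, poor: 0.25 },
};

// Bucket a single measurement into the traffic-light categories.
function rate(metric, value) {
  const t = THRESHOLDS[metric];
  if (value <= t.good) return "good";
  if (value <= t.poor) return "needs improvement";
  return "poor";
}

console.log(rate("LCP", 1800)); // "good"
console.log(rate("LCP", 3000)); // "needs improvement"
console.log(rate("CLS", 0.3));  // "poor"
```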

Naturally, many consider the LCP the central metric, as it reflects the overall loading time of a page. Currently, the Largest Contentful Paint API is available in Chromium-based browsers, covering around 70% of end users. In a moment, we will explain why many websites may have scored badly on the LCP without actually doing anything wrong or being slow.

Measuring the Web Vitals

Today the main sources for the Web Vitals are Lighthouse and PageSpeed Insights for synthetic (“bot”) performance measurements, as well as the Chrome User Experience Report (CRUX) for field data. CRUX is a highly valuable public data set that contains anonymized real-user performance data from Chrome browsers for millions of top domains. It is not only a great tool for performance analysis but also the basis for Google incorporating end-user speed as a ranking factor into its search.

For example, one interesting thing you can do is to rank the top 10 global e-commerce companies by their LCP:

Ranking of the top 10 e-commerce players (by revenue) ordered by share of fast LCP page views (source: PageSpeed Insights CRUX data as of Feb 4, 2021)
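The “share of fast LCP page views” used for this ranking is simply the fraction of measured page views at or below Google’s 2.5-second “good” threshold. A minimal sketch over made-up samples (the values are hypothetical; only the threshold is Google’s):

```javascript
// Hypothetical LCP field samples for one origin, in seconds.
const lcpSamples = [1.1, 1.9, 2.2, 2.4, 2.8, 3.5, 4.2, 1.6, 2.0, 5.1];

// Fraction of page views at or below the "good" LCP threshold (2.5 s).
function shareOfFastLcp(samples, thresholdSeconds = 2.5) {
  const fast = samples.filter((s) => s <= thresholdSeconds).length;
  return fast / samples.length;
}

console.log(shareOfFastLcp(lcpSamples)); // 0.6 -> 60% "fast" page views
```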

Google Page Experience Update is around the corner

The Web Vitals were presented in May 2020. At the end of last year, Google announced that starting in May 2021 the Web Vitals will become part of the Google Search ranking. Since its Mobile Speed Update in 2018, Google has considered speed in the ranking. Unlike in the old days, speed is no longer measured by the Google bot but is instead based on CRUX. Besides the organic ranking, speed also plays a role in SEA/ad placement.

While the quantitative impact of speed on SEO is of course a well-kept secret, there are some studies. For example, Pinterest has seen an additional 15% organic search engine traffic after improving page speed by 40%.

If speed already played such a crucial role even before the Web Vitals update, it is very likely to become a lot more important. Many of our e-commerce customers at Baqend, for example, have recently invested in getting as fast as possible on the LCP before the May update arrives.

The finding: for JavaScript-heavy websites, the LCP was not accurate

For a while, we had noticed that the LCP seemed unreasonably high on some sites that rely heavily on JavaScript for client-side rendering. We then noted that a change to the LCP was launching in Chrome 88 that could change a lot.

Essentially, what we observed was this: when the element that makes up the LCP is changed by JavaScript in the DOM, the LCP is reset, even if the page does not change visually at all. For example, as on this test page, if we change the hero image after 3 seconds, the LCP is reset and reported as more than 3 seconds, even though visually nothing has changed!
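The behavior described above can be modeled as a toy simulation. This is a deliberately simplified, hypothetical model (not Chrome’s actual implementation): a list of candidate paints for the largest element, one of which only swaps the DOM node without changing what is on screen:

```javascript
// Candidate paints for the largest element on a hypothetical page.
const paints = [
  { time: 1.2, visualChange: true },  // hero image first rendered
  { time: 3.0, visualChange: false }, // JS swaps the same image in the DOM
];

// Pre-Chrome-88 behavior: any DOM change to the LCP element resets the
// candidate, so the last paint wins even if nothing visibly changed.
function lcpPre88(paints) {
  return paints[paints.length - 1].time;
}

// Chrome 88+ behavior: DOM changes that do not alter the rendered
// content no longer reset the LCP candidate.
function lcpPost88(paints) {
  const visible = paints.filter((p) => p.visualChange);
  return visible[visible.length - 1].time;
}

console.log(lcpPre88(paints));  // skewed by the invisible swap
console.log(lcpPost88(paints)); // reflects what the user actually saw
```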

Why is this important?

The problem is that many websites rely on client-side JavaScript to modify the elements that define the LCP. For example, if a shop changes a product image slider during rendering, the LCP will still be bad no matter how early the image is displayed.

The issue is even more systematic for websites built on modern frontend frameworks such as React, Angular, Svelte, and Vue: as soon as server-side rendering is used (usually for performance reasons!), the client-side rehydration changes the DOM and resets the LCP. Rehydration is simply the process of “booting up JavaScript views on the client such that they reuse the server-rendered HTML’s DOM tree and data” (see this article for more details).

Bottom line: if your JavaScript modifies your frontend, you might have been looking at very skewed LCP numbers.

Give me data: what has changed?

Since Chrome 88 has been out for a few weeks now, we actually have the data to underpin our hypothesis. At Baqend, we build Speed Kit, a SaaS plugin that speeds up websites using JavaScript and Service Workers. Our approach relies on measuring the achieved performance uplift through real-user monitoring and A/B testing. We therefore have lots of data: Speed Kit ships with a performance monitoring tool that measures speed metrics for every page view and in particular captures the Core Web Vitals.

Here are the changes we observed between January 20 and February 4, comparing Chrome browsers with version <88 to the current version 88. The data set consists of 122 million page views from Chrome browsers across roughly 45 e-commerce customer websites. Around 70% of the traffic comes from Europe; the rest is spread across all continents. Overall, 51.1% of the traffic came from Chrome browsers with a version <88 and 48.9% from Chrome 88.
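The quantiles reported below (median, p75, p90, p95, p99) can be computed from such a sample with a simple nearest-rank percentile. A sketch over made-up LCP values (the samples are illustrative, not our actual data):

```javascript
// Nearest-rank percentile over a sample of LCP values in seconds.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(rank - 1, 0)];
}

// Illustrative LCP samples; real RUM data has millions of entries.
const lcpSeconds = [1.2, 1.5, 1.8, 2.0, 2.3, 2.6, 3.1, 3.8, 4.6, 7.9];
for (const p of [50, 75, 90, 95, 99]) {
  console.log(`p${p}: ${percentile(lcpSeconds, p)}`);
}
```

Note how the long tail dominates the high percentiles: a handful of very slow page views drives p95 and p99 far above the median, which is why the Chrome 88 change shows up most strongly there.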

Analysis of the LCP changes across 122 million page views and different quantiles.

As expected, the impact varies heavily across customers. This seems logical, since not all of them modify LCP-relevant elements with JavaScript. The following scatter plot shows the impact across all analyzed customers.

Scatter plot of the LCP change across customers.

Key Findings:

  • Across the median, 75th, 90th, and 95th percentiles, the LCP has consistently become ~20% faster in Chrome 88.

  • For the 99th percentile (i.e. the slowest 1% of page views), the change is a whopping 43.75%.

  • Not all websites are alike; however, with very few exceptions, the new LCP calculation has led to an improvement. There seems to be no correlation between the traffic/size of a site and the LCP change.


From this preliminary analysis, we expect drastic changes in the LCP. Whether this warrants adapting the already quite generous threshold of 2.5 seconds for a “fast LCP” remains to be seen. For now, the good news is that JavaScript-heavy sites no longer need to fear a negative skew in their LCP. We will follow up with an update once the CRUX data (updated once per month) allows a first confirmation of our findings in the “official” data sources.
