Performance monitoring

Summary

We monitor the overall performance of Purple Websites using both synthetic measurement of the Core Web Vitals and real-user monitoring data provided by CrUX, which reports the same Core Web Vitals metrics.

Each of your websites is included in our monitoring, so that we can take the correct action whenever a change in software or configuration affects performance.

While setting up your website, you'll mostly rely on synthetic data. Below you'll find a summary of the different measures, including their challenges.

Core Web Vitals

Core Web Vitals are a set of user-centric performance metrics introduced by Google to assess and optimize the user experience on a webpage. These metrics focus on three critical aspects of how users perceive their interaction with a site:

  1. Largest Contentful Paint (LCP): Measures loading performance, specifically the time it takes for the largest content element (text or image) to render. A good LCP score is 2.5 seconds or less.
  2. Interaction to Next Paint (INP): Measures responsiveness, specifically how quickly the page responds to user interactions (such as clicks, taps, and key presses). A good INP score is 200 milliseconds or less. INP replaced First Input Delay (FID) as a Core Web Vital in March 2024; FID measured only the delay before the first interaction was handled.
  3. Cumulative Layout Shift (CLS): Measures visual stability by tracking unexpected layout shifts, which can frustrate users as they try to interact with a site. A good CLS score is less than 0.1.

Together, these metrics are essential in evaluating how quickly a website loads, how responsive it is, and how visually stable the elements are. They contribute directly to user experience, search engine rankings, and conversion rates.
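As an illustration, the "good" thresholds listed above can be encoded in a small helper that classifies a measured value into the rating buckets Google publishes (good / needs improvement / poor). This is a hedged sketch: the threshold pairs follow Google's published ranges, but the function and variable names are illustrative and not part of any Purple tooling.

```python
# Sketch: classify Core Web Vitals values into Google's rating buckets.
# Threshold pairs are (good_max, needs_improvement_max); names are
# hypothetical, not part of any Purple tooling.

THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Return the Core Web Vitals rating bucket for a measured value."""
    good_max, ni_max = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    if value <= ni_max:
        return "needs improvement"
    return "poor"
```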

What is CrUX?

The Chrome User Experience Report (CrUX) is a public dataset that Google collects from real users who opt in to share their browsing data. This data is gathered from users of the Chrome browser and includes information on how websites perform in real-world conditions, across various network speeds, devices, and locations. CrUX utilizes the data points of the Core Web Vitals.
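CrUX data can also be queried programmatically through the public CrUX API (`records:queryRecord` endpoint). The sketch below only builds such a request; the helper name and the metric selection are illustrative, and an actual call requires your own API key and an HTTP client.

```python
# Sketch: build a request for the public CrUX API
# (https://chromeuxreport.googleapis.com/v1/records:queryRecord).
# Helper name and metric selection are illustrative; a real call
# needs an API key and an HTTP client such as requests or urllib.
import json

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_query(origin: str, form_factor: str = "PHONE") -> tuple[str, str]:
    """Return (url, json_body) for a CrUX queryRecord request."""
    url = f"{CRUX_ENDPOINT}?key=YOUR_API_KEY"  # replace with a real key
    body = json.dumps({
        "origin": origin,
        "formFactor": form_factor,
        "metrics": ["largest_contentful_paint", "cumulative_layout_shift"],
    })
    return url, body
```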

Synthetic vs. Real-User Monitoring

When evaluating website performance, there are two main approaches: synthetic performance measurement and real-user monitoring (RUM), such as CrUX. Both methods offer valuable insights but come with distinct differences and challenges.

Synthetic Performance Measurement

Synthetic monitoring simulates user interactions with a website by running a series of predefined tests in controlled environments. These tests are often executed in specific locations, using defined network conditions, devices, and browsers.

Tools like Google Lighthouse, WebPageTest, and DebugBear typically use synthetic measurement for their reports.
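These tools can emit machine-readable reports (for example, Lighthouse's JSON output), which makes synthetic results easy to post-process. The sketch below pulls LCP and CLS out of such a report; the audit IDs (`largest-contentful-paint`, `cumulative-layout-shift`) follow Lighthouse's report schema, while the function name and the stand-in report are illustrative assumptions.

```python
# Sketch: extract Core Web Vitals from a Lighthouse JSON report.
# Audit IDs follow Lighthouse's report schema; the function name
# and the stand-in sample report are illustrative.

def extract_vitals(report: dict) -> dict:
    """Map metric names to Lighthouse's numeric audit values."""
    audits = report.get("audits", {})
    return {
        "LCP_ms": audits.get("largest-contentful-paint", {}).get("numericValue"),
        "CLS": audits.get("cumulative-layout-shift", {}).get("numericValue"),
    }

# Minimal stand-in for a real report, for demonstration only:
sample_report = {
    "audits": {
        "largest-contentful-paint": {"numericValue": 2140.5},
        "cumulative-layout-shift": {"numericValue": 0.04},
    }
}
```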

Advantages

  1. Controlled Environment: It provides consistency, making it easier to reproduce results and debug performance issues.
  2. Granular Insights: Developers can set specific conditions (e.g., test on a 3G network, on a mobile device), offering detailed insights into performance under various setups.
  3. Customization: Synthetic tests allow the use of custom test scripts, simulating specific user journeys on the site.

Challenges

  1. Limited Realism: Since synthetic tests run under pre-set conditions, they don't capture the diversity of real-world scenarios. Network conditions, user device specifications, and geographic location variations aren't fully replicated.
  2. No Human Behavior: Synthetic tests can't account for real-world behavior like users interacting with multiple tabs, multitasking, or using slow and congested networks.
  3. Idealized Conditions: Synthetic tests are often run on clean, high-performance systems, which don't reflect the real-world devices that actual users are likely to use.

Real-User Monitoring

CrUX uses real-user monitoring (RUM) to collect performance data from real visitors, offering a more accurate representation of how a site performs under various real-world conditions.

Advantages

  1. Real-World Data: CrUX captures performance metrics from actual users, providing insights into how a website performs across different devices, networks, and locations.
  2. Diverse User Conditions: The data reflects the wide variety of real-world conditions, including slow connections, lower-powered devices, and varying user interactions.
  3. Accurate User Experience: Data is captured directly from the browser, so it gives a true picture of user-perceived performance. This helps to understand how a site performs in real-world scenarios that synthetic testing can't replicate.

Challenges

  1. Less Control: Unlike synthetic testing, developers have no control over the conditions under which data is collected. This can make it harder to pinpoint specific issues or recreate scenarios for debugging.
  2. No Granular Testing: Since RUM focuses on aggregate data, developers don't have the same level of control to run specific, repeatable tests or simulate ideal performance conditions.
  3. Data Delays: CrUX reports are typically delayed, as it takes time to gather and process real-user data, meaning it's not always the best option for immediate feedback.
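Because RUM works on aggregate data, field tools evaluate a high percentile rather than an average; CrUX, for instance, reports the 75th percentile of each metric, so most visits perform at least that well. A minimal sketch of that aggregation, using made-up LCP samples:

```python
# Sketch: aggregate real-user LCP samples the way field tools do,
# taking the 75th percentile (nearest-rank method) instead of the
# mean. Sample values below are made up for illustration.
import math

def percentile_75(samples: list[float]) -> float:
    """75th percentile of a non-empty sample via nearest rank."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]
```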