Today at Google I/O 2023, it was announced that Interaction to Next Paint (INP) is no longer an experimental metric. INP will replace First Input Delay (FID) as a Core Web Vital in March of 2024.
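If you want to start collecting INP from your own users ahead of the switch, here's a minimal sketch using Google's web-vitals JavaScript library (assuming version 3 or later, which exports onINP; the /analytics endpoint is just a placeholder for wherever you send your RUM beacons):

```typescript
// npm install web-vitals
import { onINP, type Metric } from 'web-vitals';

// Send each metric to a (placeholder) beacon endpoint as it's reported.
function sendToAnalytics(metric: Metric) {
  // sendBeacon survives page unloads, which matters for INP because its
  // final value is often only known late in the page lifecycle.
  navigator.sendBeacon('/analytics', JSON.stringify({
    name: metric.name,   // 'INP'
    value: metric.value, // milliseconds
    id: metric.id,
  }));
}

onINP(sendToAnalytics);
```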
It's been three years since the Core Web Vitals initiative was kicked off in May 2020. In that time, we've seen people's interest in performance dramatically increase, especially in the world of SEO. It's been hugely helpful to have a simple set of three metrics – focused on loading, interactivity, and visual stability – that everyone can understand and focus on.
During this time, SpeedCurve has stayed objective when looking at the CWV metrics. When it comes to new performance metrics, it's easy to jump on hype-fuelled bandwagons. While we definitely get excited about emerging metrics, we also approach each new metric with an analytical eye. For example, back in November 2020, we took a closer look at one of the Core Web Vitals, First Input Delay, and found that it was sort of 'meh' overall when it came to meaningfully correlating with actual user behavior.
Now that INP has arrived to dethrone FID as the responsiveness metric for Core Web Vitals, we've turned our eye to scrutinizing its effectiveness.
In this post, we'll take a closer look and attempt to answer:
Onward!
It's easier to make a fast website than it is to keep a website fast. If you've invested countless hours in speeding up your pages, but you're not using performance budgets to prevent regressions, you could be at risk of wasting all your efforts.
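To make that concrete, here's a minimal sketch of a budget check that could run in CI. The metric names, thresholds, and measured values below are hypothetical; wire in whatever your monitoring tool reports:

```typescript
// A performance budget is just a set of thresholds you refuse to regress past.
interface Budget {
  metric: string;
  max: number; // milliseconds, or a unitless score for metrics like CLS
}

const budgets: Budget[] = [
  { metric: 'Start Render', max: 1500 },
  { metric: 'Largest Contentful Paint', max: 2500 },
  { metric: 'Cumulative Layout Shift', max: 0.1 },
];

// Compare measured values against the budgets and flag any regression.
function checkBudgets(measured: Record<string, number>): boolean {
  let passed = true;
  for (const { metric, max } of budgets) {
    const value = measured[metric];
    if (value !== undefined && value > max) {
      console.error(`Budget exceeded: ${metric} = ${value} (budget: ${max})`);
      passed = false;
    }
  }
  return passed;
}

// Example: fail the build when a budget is blown (values here are made up).
if (!checkBudgets({ 'Start Render': 1800, 'Largest Contentful Paint': 2300 })) {
  throw new Error('Performance budget exceeded');
}
```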
In this post we'll cover how to:
The bottom of this post also contains a collection of case studies from companies that are using performance budgets to stay fast.
Let's get started!
There is a lot of excitement in the world of web performance these days, and April has been no exception! At SpeedCurve, we've been focused on staying on top of the items that affect you the most.
Here is a look at what's new in SpeedCurve:
All of this community-driven work is having a big impact on our collective goal of making performance accessible to everyone.
Read on to learn more about these exciting changes!
Things have been busy over here at SpeedCurve HQ! Coming off the back of our latest RUM Compare dashboard release, we are super excited to launch four new dashboards to make your life better, your work easier, and your websites faster.
Let's take a look!
"I made my pages faster, but my business and user engagement metrics didn't change. WHY???"
"How do I know how fast my pages should be?"
"How can I demonstrate the business value of performance to people in my organization?"
If you've ever asked yourself any of these questions, then you could find the answers in identifying and understanding the performance plateau for your site.
The performance plateau is the point at which changes to your website’s rendering metrics (such as Start Render and Largest Contentful Paint) cease to matter because you’ve bottomed out in terms of business and user engagement metrics.
In other words, if your performance metrics are on the performance plateau, making them a couple of seconds faster probably won't help your business.
The concept of the performance plateau isn't new. I first encountered it more than ten years ago, when I was looking at data for a number of sites and noticed that not only was there a correlation between performance metrics and business/engagement metrics, there was also a noticeable plateau in almost every correlation chart I looked at.
A few months ago someone asked me if I'd done any recent investigation into the performance plateau, to see if the concept still holds true. When I realized how much time had passed since my initial research, I thought it would be fun to take a fresh look.
In this post, I'll show how to use your own data to find the plateau for your site, and then what to do with your new insights.
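As a preview, here's a minimal sketch of the approach, assuming you can export raw RUM sessions as pairs of an LCP value and a conversion flag (the 500ms bin size is arbitrary):

```typescript
interface Session {
  lcp: number;        // Largest Contentful Paint, in milliseconds
  converted: boolean; // did this session reach your business goal?
}

// Bucket sessions by LCP and compute the conversion rate for each bucket.
function conversionRateByBin(sessions: Session[], binSizeMs = 500): Map<number, number> {
  const totals = new Map<number, { sessions: number; conversions: number }>();
  for (const s of sessions) {
    const bin = Math.floor(s.lcp / binSizeMs) * binSizeMs;
    const t = totals.get(bin) ?? { sessions: 0, conversions: 0 };
    t.sessions += 1;
    if (s.converted) t.conversions += 1;
    totals.set(bin, t);
  }
  const rates = new Map<number, number>();
  for (const [bin, t] of totals) rates.set(bin, t.conversions / t.sessions);
  return rates;
}
```

Plot the bins against their conversion rates: the plateau is the flat region at the fast end of the chart, where making LCP even faster no longer moves the business metric.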
Every year feels like a big year, and 2022 has been no exception. Not only did we celebrate our ninth birthday (!!!), we also:
Keep reading for a full recap of the past year...
Exploring real user monitoring (RUM) data can be a hugely enlightening process. It uncovers things about your users and their behavior that you might never have suspected. That said, it's not uncommon to spend precious time peeling back the layers of the onion, only to find false positives or uncertainty in all that data.
At SpeedCurve, we believe a big part of our job is making your job easier. This was a major driver behind the Synthetic Compare dashboard we released last year, which so many of you have given us great feedback on.
As you may have guessed, since then we've been hard at work coming up with the right way to explore and compare your RUM datasets using a similar design pattern. Today, we are thrilled to announce your new RUM Compare dashboard!
With your RUM Compare dashboard, you can easily generate side-by-side comparisons for any two cohorts of data. Some of the many reasons you might want to do this include:
Let's take a tour...
Labeling your pages in your synthetic and real user monitoring (RUM) tools is a crucial step in your performance monitoring setup. We recently released some exciting new capabilities for labeling your RUM pages that we want to share with you. This is also a great opportunity to reiterate why page labels are important, and to show you how easy it is to apply labels to your pages.
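For RUM, a page label is typically set with a single line of JavaScript on each page. Here's a minimal sketch assuming SpeedCurve's lux.js snippet is already on the page; the routing rules below are hypothetical, and you should check the SpeedCurve documentation for the exact property supported by your snippet version:

```typescript
// Assumes the global LUX object from the standard lux.js snippet.
declare const LUX: { label: string };

// Group URLs into a small set of human-readable page labels.
function labelCurrentPage(): void {
  const path = window.location.pathname;
  if (path === '/') {
    LUX.label = 'Home';
  } else if (path.startsWith('/product/')) {
    LUX.label = 'Product Detail'; // all product URLs roll up into one label
  } else if (path.startsWith('/checkout')) {
    LUX.label = 'Checkout';
  } else {
    LUX.label = 'Other';
  }
}

labelCurrentPage();
```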
There are so many benefits to labeling your pages in both your synthetic and real user monitoring (RUM) tools. Page labels give you the ability to:
Ready to learn more? Let's get to it!
One of the great things about Google's Core Web Vitals is that they provide a standard way to measure our visitors’ experience. Core Web Vitals can answer questions like:
Sensible defaults, such as Core Web Vitals, are a good start, but one pitfall of standard measures is that they can miss what’s actually most important.
Largest Contentful Paint (LCP) makes the assumption that the largest visible element is the most important content from the visitors’ perspective; however, we don’t have a choice about which element it measures. LCP may not be measuring the most appropriate – or even the same – element for each page view.
In the case of a first-time visitor, the largest element might be a consent banner. On subsequent visits to the same page, the largest element might be an image for a product or a photo that illustrates a news story.
The screenshots from What Hifi (a UK audio-visual magazine) illustrate this problem. When the consent banner is shown, one of its paragraphs is the LCP element. When the consent banner is not shown, an article title becomes the LCP element. In other words, the LCP timestamp varies depending on which of these two experiences the visitor had!
What Hi Fi with and without the consent banner visible
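If you want to see which element LCP is actually measuring for a given page view, a quick way is to log the LCP candidates with a PerformanceObserver (Chromium-based browsers expose the element on the entry):

```typescript
// Log every LCP candidate the browser reports, along with the element it measured.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // The element property is exposed on largest-contentful-paint entries in
    // Chromium browsers; the last candidate before user input "wins".
    const lcpEntry = entry as PerformanceEntry & { element?: Element };
    console.log('LCP candidate at', lcpEntry.startTime, 'ms:', lcpEntry.element);
  }
});

observer.observe({ type: 'largest-contentful-paint', buffered: true });
```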
It's been another busy month here at SpeedCurve! Check out our latest product updates below.