We've come a long way since the early days of performance monitoring and optimization. When I look at the articles and videos in this month's newsletter, what they all have in common is that they're about fine-tuning how we measure and fix performance. Whether we're refining the metrics we focus on – or how we use the tools we have at our disposal – it's wonderful to me that we're able to look at web performance in nuanced ways that simply weren't possible ten years ago.
It's also deeply satisfying to be reminded that the work we all do actually helps users (and our employers). I'm very excited to be including a batch of fresh new case studies in this month's newsletter. I hope you find them as inspiring as I do.
Until next time,
Lighthouse v6 has arrived!
The much-anticipated update to Lighthouse is now available to SpeedCurvers as part of our latest test agent updates. In this support article, Joseph (@Joseph_Wynn) explains what this update means and how it may affect your performance metrics.
Are JS long tasks frustrating your users?
- Long Tasks
- Number of Long Tasks
- Longest Task
- Total Blocking Time
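The metrics above all derive from the browser's Long Tasks API, which reports main-thread tasks longer than 50 ms. Here's a minimal sketch of how they relate to one another (this is an illustration, not SpeedCurve's actual implementation; the `summarizeLongTasks` helper is hypothetical, while the 50 ms threshold and the "duration beyond 50 ms" rule for Total Blocking Time follow the standard definitions):

```javascript
// Summarise a set of task durations (in ms) into long-task metrics.
// A "long task" is any task over 50 ms; Total Blocking Time counts
// only the portion of each long task beyond that 50 ms threshold.
function summarizeLongTasks(durations) {
  const longTasks = durations.filter((d) => d > 50);
  return {
    numberOfLongTasks: longTasks.length,
    longestTask: longTasks.length ? Math.max(...longTasks) : 0,
    totalBlockingTime: longTasks.reduce((sum, d) => sum + (d - 50), 0),
  };
}

// In a browser, durations could be collected via a PerformanceObserver
// watching 'longtask' entries (guarded so this sketch also runs elsewhere):
if (typeof window !== "undefined" && typeof PerformanceObserver !== "undefined") {
  const durations = [];
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) durations.push(entry.duration);
  }).observe({ type: "longtask", buffered: true });
}
```

For example, tasks of 120 ms, 30 ms, and 75 ms give two long tasks, a longest task of 120 ms, and a Total Blocking Time of 95 ms (70 + 25).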
Subscribe to our YouTube channel
We're proud of our Support Centre, which contains tonnes of useful articles to help you get the most out of using SpeedCurve. But as reader-friendly as it is, we totally understand that videos work better than articles for some folks.
That's why Cliff (@cliffcrocker) has been busy creating awesome explainer videos that demonstrate everything from tracking third parties to monitoring Core Web Vitals. The videos manage to convey a lot of helpful information in a concise format.
Check them out on our YouTube channel – and feel free to reply to this email and suggest other videos you'd like to see.
Sometimes things go wrong or are unclear. That's why we compiled this collection of troubleshooting FAQs for Synthetic and LUX, including:

- Why are my WebPageTest scripts not working?
- What can I do about outlier test results in my charts?
- Why do I see different results depending on which tool I use?
What do the different SpeedCurve metrics represent?
We track a LOT of metrics for you in both Synthetic and LUX. Keeping them all straight can be challenging! We maintain a glossary of the most popular metrics for your reference. You can also click on the question mark icon in any chart to see a definition of all the metrics represented in that chart.
ICYMI: Bookmark & compare tests
Last month, we rolled out our new Compare dashboard, which makes it easy for you to bookmark and compare different Synthetic tests in your test history. Now you can generate side-by-side comparisons that let you not only spot regressions, but easily identify what caused them.
From the #webperf community...
- Manuel Garcia shares how his team at Farfetch analyzed more than a million pageviews and found that conversion rate decreased by 2.7% for each additional second of TTI on mobile, and 2.1% on desktop.
- Chris Coyier at CSS-Tricks discusses how to go beyond generic metrics to measure what really matters on your pages.
- The performance team at GOV.UK enabled HTTP/2 and were able to measure that this significantly improved performance. Among other things, they found that, for users with slower devices and connections, start render improved by 2.5 seconds and page load time by 6 seconds.
- Matt Hobbs shares a straightforward, pragmatic walkthrough of how he set up the performance monitoring dashboards – including configuring test settings and creating performance budgets and alerts – for GDS.
- In the latest episode of the Chasing Waterfalls podcast, Tim Kadlec chats with Sharell Bryant. She shares how Teachers Pay Teachers went from treating performance as an afterthought to a company that puts a priority on performance from start to finish.