Site speed and performance have been a focus for most website owners for many years, and there are case studies demonstrating the link between speed and conversion. For example, COOK increased conversions by 7% by reducing page load time by 0.85 seconds, and Mobify found that each 100ms improvement in their homepage’s load time resulted in a 1.11% increase in conversion.
Google’s decision to make speed and user experience a core part of its algorithm sharpened many businesses’ focus in this area. Whilst it’s a shame this is the driver (we should always ensure sites work well and are speedy), the result is that customers experience faster websites.
Technology has come a long way, and developers and ecommerce leads now have a new set of metrics to obsess over: Core Web Vitals.
What are Core Web Vitals (CWV)?
Core Web Vitals comprises three metrics, covering loading speed, interactivity and visual stability:
- LCP: Largest Contentful Paint
- FID: First Input Delay
- CLS: Cumulative Layout Shift
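To make the three metrics a little more concrete, here is a rough browser-side sketch (TypeScript) of the raw performance entries they are built from, using the native PerformanceObserver API. It is illustrative only: production measurement (for example Google’s web-vitals library or RUM tooling) also handles browser support, session windowing for CLS, background tabs and final-value reporting.

```ts
// Largest Contentful Paint: render time of the largest element seen so far.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const last = entries[entries.length - 1];
  console.log('LCP candidate (ms):', last.startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });

// First Input Delay: gap between the first user input and its handler starting.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const e = entry as PerformanceEventTiming;
    console.log('FID (ms):', e.processingStart - e.startTime);
  }
}).observe({ type: 'first-input', buffered: true });

// Cumulative Layout Shift: running sum of layout-shift scores
// that were not caused by recent user input.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const shift = entry as PerformanceEntry & { value: number; hadRecentInput: boolean };
    if (!shift.hadRecentInput) cls += shift.value;
  }
  console.log('CLS so far:', cls);
}).observe({ type: 'layout-shift', buffered: true });
```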
Our podcast explores each of these in more detail to help you understand what they really mean. Interestingly, a site that feels slow can still score better on CWV metrics than a site that feels faster. Customer perception is important, but so is technical delivery.
To help ecommerce teams learn more, we interviewed a recognised expert in this area. His LinkedIn headline, “Empowering agencies & e-commerce to address Pagespeed + Core Web Vitals”, gives you an idea of just how focused he is!
For our 74th episode, we’re joined by Erwin Hofman, a technical UX/CRO specialist. Erwin explains the three measures in CWV, discusses what impacts performance for each and advises how ecommerce teams can improve their scores.
You can also listen to our episode on CWV via the following:
Key discussion points
- Let’s start with the basics: what are Core Web Vitals and why is everyone talking about them?
- How impactful is the update likely to be and why?
- Talk us through each measure and what is likely to impact performance
- How should businesses and developers be approaching this update?
- Is there a sensible scale for benchmarking performance? (A short threshold sketch follows this list.)
- What are your recommended tools for measuring performance on an ongoing basis?
- What’s your view on how ready people are currently?
- Google has said the metrics will evolve over time – where do you see this going?
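On the benchmarking question above, Google publishes fixed “good” / “needs improvement” / “poor” thresholds for each metric, assessed at the 75th percentile of real-user page loads. The sketch below simply encodes those published thresholds; the `rate` helper and its names are ours, for illustration only.

```ts
// Google's published Core Web Vitals thresholds, evaluated at the
// 75th percentile of real-user page loads.
type Rating = 'good' | 'needs improvement' | 'poor';

const thresholds = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  FID: { good: 100,  poor: 300 },  // milliseconds
  CLS: { good: 0.1,  poor: 0.25 }, // unitless score
};

function rate(metric: keyof typeof thresholds, value: number): Rating {
  const t = thresholds[metric];
  if (value <= t.good) return 'good';
  if (value <= t.poor) return 'needs improvement';
  return 'poor';
}

// Example: a 3.1s LCP falls into "needs improvement".
console.log(rate('LCP', 3100));
```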