Performance is UX: What Web Performance Optimization Can Do for UX Researchers

Understand how web performance metrics connect to UX research outcomes.

In my talk, Making Performance Allies, I outline how certain job roles align closely with what I do as a web performance specialist. I single out the alignment between UX Research (UXR) and Web Performance Optimization (WPO) as one of the strongest in the technology space.

Unfortunately, we are often separated by a language barrier of jargon. While researchers talk about cognitive load and friction, performance specialists talk about main-thread blocking and speed metrics at the 75th percentile and above. This article explores that relationship, because we ultimately care about the same thing: the psychological contract between a human and a screen.

The two lenses: lab vs. field

To collaborate, we first need to understand how we see the user. One of the main differences between UXR and WPO is that UXR deals with both qualitative and quantitative data, whereas WPO deals with quantitative data at a large scale. Performance specialists use two primary lenses:

  1. Lab data (aka synthetic data): these are controlled experiments. We run a test on a specific device and connection (e.g., a mid-tier Android on a slow 4G network in the western US). It’s the usability lab of performance. These tests are repeatable, clinical, and perfect for benchmarking.
  2. Field data: this data comes from specialized analytics packages known as Real User Monitoring (RUM). Field data collection is handled by adding a small snippet of code to a website or app, similar to how you’d install Google Analytics. Field data measures the real-world experience of actual users and includes large sample sizes, often millions of users, over a diverse range of devices, connection speeds, and geographic locations.

The UXR connection: if your usability study shows that ease of use is high, but your field data shows a breakdown in performance for users in rural areas, your study may have a sampling bias. Performance data provides the macro-context for your UX observations.

Field and lab data are not an either-or choice. We rely on both to diagnose and analyze where an application is performing well and where it isn't.

An explanation of performance metrics for UX Researchers

Google has a set of metrics known as Web Vitals that are applicable to almost every type of site or web application. These are a great place to start with understanding the basics.

  • Largest Contentful Paint (LCP) measures how long it takes for the largest content element on the page to render. We track this because the largest element on screen is typically the most important piece of content for the user.
  • Cumulative Layout Shift (CLS) measures how much the layout of the page moves around on the user. If you’ve ever inadvertently clicked on an ad because it pushed the page down at the same moment you tried to click something else, you have experienced a layout shift. This is the UI equivalent of the timeless uncle trick of “up high, down low, too slow” where your uncle pulls his hand away as you go to give him five.
  • Interaction to Next Paint (INP) measures how long it takes for a visual update to happen after a user interacts with the page. Think of it this way: a user clicks a button, taps a form field, or types in a search box. INP measures how long it takes from that moment until something visible changes on the screen, such as a dropdown opening, text appearing, or any other visual feedback. Sites with a high INP feel laggy, slow, and frustrating.

LCP and INP are both time-based metrics, while CLS is a decimal number. In all cases, the lower the number, the better the user’s experience. There are many more metrics, but these are the most important ones for UX Researchers to understand.
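To make these numbers concrete, Google publishes "good" / "needs improvement" / "poor" thresholds for each of these metrics: good means LCP ≤ 2.5 s, INP ≤ 200 ms, and CLS ≤ 0.1, while poor means LCP > 4 s, INP > 500 ms, and CLS > 0.25. A minimal sketch of rating a measurement against those published thresholds (the function name and shape are my own, not a standard API):

```typescript
// Google's published Core Web Vitals thresholds (good / poor boundaries).
// LCP and INP values are in milliseconds; CLS is a unitless score.
type Rating = 'good' | 'needs-improvement' | 'poor';

const THRESHOLDS: Record<string, [good: number, poor: number]> = {
  LCP: [2500, 4000],
  INP: [200, 500],
  CLS: [0.1, 0.25],
};

export function rate(metric: 'LCP' | 'INP' | 'CLS', value: number): Rating {
  const [good, poor] = THRESHOLDS[metric];
  if (value <= good) return 'good';
  if (value <= poor) return 'needs-improvement';
  return 'poor';
}
```

For example, an LCP of 2.0 s rates as "good", while an INP of 300 ms rates as "needs-improvement".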

Translating UXR metrics to web performance

UXR metrics are often aligned with web performance metrics, meaning that improvements in performance metrics have an impact on the things UX Researchers care about too.

  • Time on Task (aligns with LCP and INP): users cannot begin a task until they see meaningful, important content. High LCP means high friction for task initiation, and high INP causes further delays.
  • User Error Rate (aligns with CLS): when a page jumps while loading and a user mis-clicks, that is a system-induced user error. Measuring CLS is essentially measuring the probability of a physical slip.
  • Task Success Rate (aligns with INP): if a user taps "Submit" and the UI doesn't respond instantly, they often abandon the flow or double-tap. Poor INP leads directly to lower task completion and higher abandonment.
  • Cognitive Load (aligns with Long Tasks, Long Animation Frames (LoAF), and Total Blocking Time): when a browser is "busy" with code, the user experiences micro-delays. These breaks in the interface flow increase the mental effort required to stay focused on the goal.
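The Cognitive Load connection deserves a concrete example. Total Blocking Time is defined as the sum of each main-thread task's duration beyond 50 ms, the point at which a task is considered "long" enough to delay a user's input. A sketch of that calculation from a list of task durations (the sample numbers in the usage note are made up for illustration):

```typescript
// Total Blocking Time (TBT): for each main-thread task longer than 50 ms,
// the portion beyond 50 ms counts as "blocking" time. Shorter tasks count
// as zero because input can still be serviced quickly between them.
const LONG_TASK_THRESHOLD_MS = 50;

export function totalBlockingTime(taskDurationsMs: number[]): number {
  return taskDurationsMs.reduce(
    (total, duration) => total + Math.max(0, duration - LONG_TASK_THRESHOLD_MS),
    0,
  );
}
```

For instance, tasks of 30 ms, 120 ms, and 250 ms yield 0 + 70 + 200 = 270 ms of blocking time: over a quarter of a second in which the interface cannot respond to the user.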

Considerations for UX Researchers

Most UX Researchers spend their days obsessing over friction. They study user flows, information architecture, and ideal page composition. However, the user’s perception of your site or application starts before anything is shown on screen. There is a silent friction that kills UX faster than a confusing navigation menu: the “waiting for the world to load” tax. Fortunately, web performance specialists have the expertise to identify, prioritize, and fix these issues.

Why web performance specialists are a UXer’s secret weapon

If UX Researchers seek to reduce friction, Web Performance Optimization can be the grease that makes sure the user doesn’t get stuck in the first place.

We are often stereotyped as the “speed geeks” obsessed with meticulously shaving off fractions of a second, but the truth is that the best performance specialists are actually high-empathy advocates. We aren’t just trying to make the site faster; we are trying to make it accessible to everyone, everywhere.

If you’re a UX researcher, and you care about equality, we’re on the same page. We want to make sure that a user in a rural area on a five-year-old Android phone has the same access to information as a user in a tech hub on the latest iPhone. We don’t just provide metrics; we provide visibility into the digital divide.

From technical logs to human stories

Performance specialists can help UX Researchers look past the typical user. In performance, we often look at the 75th, 90th, or 95th percentile (p75, p90, p95) because these are the users who are having the worst experience.

When we show you that your p95 LCP is 7.1 seconds, we aren’t just giving you a number. We are giving you a segment of users who are likely feeling negative emotions toward your product. We can identify these pain points and turn them into inclusion (and revenue) wins, and this small percentage of users can make a big difference.
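Percentiles like this are straightforward to compute from field samples. A sketch using the nearest-rank method (real RUM tools may interpolate between ranks, so their numbers can differ slightly):

```typescript
// Nearest-rank percentile: sort the samples, then take the value at the
// smallest rank that covers the requested fraction of the population.
export function percentile(samples: number[], p: number): number {
  if (samples.length === 0) throw new Error('no samples');
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(0, rank - 1)];
}
```

With LCP samples in hand, `percentile(lcpSamples, 95)` gives the experience of the user at the edge of your worst 5%, which is often a very different story from the average.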

If 5% of your page views deliver a poor experience, and your site gets 10M page views a month, that's 500k page views a month where a user is likely feeling abandoned. By working together, we can make sure they have a better experience. If I can reduce the time it takes for every user to see the content they want by 0.1s, that's 11.6 days of collective time saved per month. These small wins can translate into meaningful impact on user experience, revenue, conversions, page views, session depth, and frequency of visit.
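The arithmetic behind those numbers is worth showing, because this kind of back-of-the-envelope framing travels well in stakeholder meetings:

```typescript
// Back-of-the-envelope impact math for the figures above.
const pageViewsPerMonth = 10_000_000;
const poorExperienceRate = 0.05;

// Page views per month that land in the "poor" bucket: 500,000.
export const poorPageViews = pageViewsPerMonth * poorExperienceRate;

// Collective time saved if every page view shows content 0.1 s sooner:
// 10M views x 0.1 s = 1,000,000 s, or roughly 11.6 days per month.
const savedSecondsPerView = 0.1;
const secondsPerDay = 86_400;
export const daysSavedPerMonth =
  (pageViewsPerMonth * savedSecondsPerView) / secondsPerDay;
```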

Engineering a gentler web

At the end of the day, we both want the same thing: a web that respects the user’s time and dignity.

By combining the qualitative why of UX research with the quantitative what of web performance, we stop seeing users as data points and start seeing them as people. We move from being engineers expressing concerns through math and statistics to being partners building a more inclusive digital world.

The next time you see an engineer worrying about a long task, don’t see it as a technical hurdle. See it as an opportunity for an ally to help you create a more empathetic experience.

I'm Ethan Gardner. I help organizations turn web performance into a competitive advantage and improve developer velocity with design systems. Interested in consulting, audits, or workshops? Reach out.