
Laurie

Originally published at laurieontech.com

Performance Tools

If you've worked on browser-based apps before, you may be familiar with a tool called Lighthouse.

Lighthouse is an auditing tool that gives you a series of "scores" for various metrics, e.g. Accessibility, Performance, and SEO. It's available in Chrome DevTools and can also be run via the CLI (command line interface).
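If you want to poke at it yourself, here's a minimal sketch of running Lighthouse from Node, which is the same engine the DevTools panel and the CLI use. The URL is a placeholder, and the exact import style depends on your Lighthouse version:

```js
// Minimal sketch: running Lighthouse programmatically. Assumes the
// `lighthouse` and `chrome-launcher` packages are installed locally.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  // Launch a headless Chrome instance for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

  // Audit only the performance category of a placeholder page.
  const result = await lighthouse('https://example.com', {
    port: chrome.port,
    output: 'json',
    onlyCategories: ['performance'],
  });

  // Scores come back on a 0–1 scale.
  console.log('Performance score:', result.lhr.categories.performance.score * 100);

  await chrome.kill();
})();
```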

In this post we're going to focus on how Lighthouse measures performance and how that differs from other tools.

Lighthouse in DevTools

Lighthouse auditing tool in Chrome DevTools.

Lighthouse runs your site to calculate metrics and judge how performant it is. However, there are different ways to run Lighthouse reports, and Lighthouse itself provides different throttling modes!

1 - DevTools throttling (sometimes referred to as request-level throttling)

In this mode, Lighthouse attempts to mimic your site's behavior on a slow device. It accomplishes this by throttling the connection and CPU, replicating something like a Nexus-class phone on a slow 4G connection. It does this via the Chrome browser (this is a Google tool, so it only tests in Google's browser). While this helps test site performance on a slow device, it isn't an exact simulation. That's because the "slowness" is relative to the speed of your local device.

If you're running the audit on a high-powered Mac with a really strong internet connection, it's going to register a better score than the same audit run from an older, slower device.
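To make "throttling the connection and CPU" a bit more concrete, here's a rough sketch of the kind of thing request-level throttling does under the hood, using Puppeteer and the Chrome DevTools Protocol. The numbers are illustrative, not Lighthouse's exact defaults:

```js
// Illustrative sketch of request-level (DevTools) throttling via the
// Chrome DevTools Protocol, driven by Puppeteer.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const client = await page.target().createCDPSession();

  // Throttle the network inside the browser (latency in ms, throughput in bytes/sec).
  await client.send('Network.emulateNetworkConditions', {
    offline: false,
    latency: 150,
    downloadThroughput: (1.6 * 1024 * 1024) / 8, // roughly 1.6 Mbps
    uploadThroughput: (750 * 1024) / 8,          // roughly 750 Kbps
  });

  // Slow the CPU down by a multiplier. This is relative to whatever machine
  // the test runs on, which is why scores still vary with your hardware.
  await client.send('Emulation.setCPUThrottlingRate', { rate: 4 });

  await page.goto('https://example.com'); // placeholder URL
  // ...collect whatever timings you care about, then clean up.
  await browser.close();
})();
```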

2 - Simulated throttling

The aim of this mode is the same as DevTools throttling: mimic your site's behavior on a slow device and connection. However, Lighthouse runs against a fast device and then calculates what experience a slow device would have. We'll dive into this more in the next section on PageSpeed Insights.

3 - Packet-level throttling

In this mode, Lighthouse does not throttle at all; it expects that the operating system (or the network itself) is doing it. We'll explain this mode more in the section on WebPageTest.

What is interesting about these modes is that depending on which tool you're using to access Lighthouse reports, you may be running a different mode.

By default, running a Lighthouse audit in Chrome DevTools uses the first mode. Running via the Chrome extension uses the second. The CLI version of Lighthouse allows you to pass a flag, --throttling-method, to specify which mode you'd like to use; it uses simulated throttling by default.
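For example, when Lighthouse is run from Node the same choice is exposed as the throttlingMethod option. This is just a sketch, with a placeholder URL:

```js
// Sketch: the --throttling-method CLI flag maps to the throttlingMethod
// option when Lighthouse is run programmatically.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

  const result = await lighthouse('https://example.com', {
    port: chrome.port,
    // 'simulate' (the default), 'devtools', or 'provided'
    // ('provided' means Lighthouse applies no throttling itself, as in mode 3 above).
    throttlingMethod: 'simulate',
  });

  console.log(result.lhr.categories.performance.score * 100);
  await chrome.kill();
})();
```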

PageSpeed Insights

PSI landing page screenshot.

PageSpeed Insights (PSI) is another Google-provided tool. It uses the simulated throttling mode mentioned above.

PSI uses lab data, which means the audit runs on Google's servers instead of being influenced by the specs of your local machine. It gathers metrics using a fast device and then artificially calculates what a slow device would experience. Of the three throttling methods above, this is the fastest way to calculate performance metrics.

Why does it matter if it's fast? Well, PSI is run for millions of pages in order to support SEO. We'll talk about that at the end.

Because of this, the calculations need to be fast rather than perfect. The simulated approach is cheaper to run than DevTools throttling and is typically just as accurate or better, though it can be worse in certain edge cases.

Another thing about PSI is that, for some sites, you can get a CrUX (Chrome User Experience Report) section. This report uses real user monitoring (RUM) and bases the page metrics on how real users actually interact with the page. This is the most accurate type of data and produces the metrics that most directly reflect the user experience of performance.
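If you'd rather pull this data programmatically than use the web UI, PSI also exposes an API. Here's a rough sketch; the URL is a placeholder, and field data only appears for sites with enough CrUX traffic:

```js
// Rough sketch: querying the PageSpeed Insights v5 API directly.
// The response contains Lighthouse "lab" data and, when available, CrUX field data.
(async () => {
  const target = encodeURIComponent('https://example.com'); // placeholder URL
  const endpoint =
    `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${target}&strategy=mobile`;

  const data = await (await fetch(endpoint)).json();

  // Lab data: the simulated-throttling Lighthouse run performed on Google's servers.
  console.log('Lab performance score:',
    data.lighthouseResult.categories.performance.score * 100);

  // Field data (CrUX): real-user metrics, only present for sites with enough traffic.
  if (data.loadingExperience) {
    console.log('CrUX metrics:', data.loadingExperience.metrics);
  }
})();
```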

WebPageTest

WebPageTest landing page screenshot.

The last tool we'll look at is WebPageTest. It uses packet-level throttling, which means it runs performance benchmarks against real hardware in a real location. As a result, it isn't influenced by your local machine the way DevTools throttling is.

It still simulates the connection, but it does so at the operating system level, making it more accurate. However, it can also introduce more variance between runs.
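WebPageTest has an API too, so you can kick off runs automatically. A very rough sketch, assuming you have a WebPageTest API key; locations, connectivity profiles, and result polling are all covered in their docs:

```js
// Very rough sketch: starting a WebPageTest run via its HTTP API.
// Assumes a WebPageTest API key in WPT_API_KEY; error handling omitted.
(async () => {
  const params = new URLSearchParams({
    url: 'https://example.com',   // placeholder page to test
    k: process.env.WPT_API_KEY,   // your API key
    f: 'json',                    // ask for a JSON response
  });

  const response = await fetch(`https://www.webpagetest.org/runtest.php?${params}`);
  const { data } = await response.json();

  // The test runs on real hardware with packet-level throttling; poll the
  // returned URL (or use their official wrapper) to fetch results when it's done.
  console.log('Results will appear at:', data.jsonUrl);
})();
```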

Why does this matter?

It seems like there are a lot of tools to test performance, but why does this matter? Do milliseconds really make a difference?

Well, Google is an ecosystem. And most of us are familiar with it because of Google Search. Ranking highly in Google Search is important for a lot of websites. Per Google, site performance impacts a site's ranking.

Specifically, site performance and its impact on ranking is measured through Core Web Vitals, so we'll talk about those in the next post.

Top comments (8)

Todd H. Gardner

Nice work, thanks for writing this Laurie. One important point is that all of the tools you mentioned, except for the CrUX data, are "lab testing" performance. They give you an approximation of what users experience, but not the actual experience you could gather with field data.

CrUX data is great, but it's super aggregated and slow. I spent last year working on something to make field performance data faster and more accessible with Request Metrics, and we've been writing about performance on DEV. I hope you'll check it out!

Laurie

Absolutely! And so excited to read this!

Michael Currin

Thanks. I added the WebPageTest one to my list. I'd like to call out window.performance as a way of getting timings in the browser out of the box.

michaelcurrin.github.io/dev-resour...
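For example, something like this gives you rough navigation timings with no tooling at all (just a sketch):

```js
// Quick sketch: navigation timings straight from the browser, no tooling needed.
const [nav] = performance.getEntriesByType('navigation');

if (nav) {
  console.log('Time to first byte (ms):', nav.responseStart - nav.requestStart);
  console.log('DOMContentLoaded (ms):', nav.domContentLoadedEventEnd - nav.startTime);
  console.log('Full load (ms):', nav.loadEventEnd - nav.startTime);
}
```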

I have resources around Lighthouse tests here, both manual and cloud-automated:

michaelcurrin.github.io/dev-resour...

Some screenshots of each tool would be nice in your post. I'll think about adding some to my own site too.

Laurie

That's a good idea! I'd originally planned to do screenshots and forgot.

GrahamTheDev • Edited

For clarity, PageSpeed Insights is just a web interface for Lighthouse, so you get very similar data from both at default settings (the "lab data" is powered by Lighthouse).

The source code for Lighthouse is on GitHub, which is how you can run the Command Line Interface, as Laurie said in the article.

It's fairly easy to install locally, even if you are a complete noob when it comes to Node like I was.

See this Stack Overflow answer I gave for more info on the benefits of the CLI over other methods:

An alternative way to run Lighthouse

Although this is an old question there is an alternative way to run Lighthouse (the engine behind Page Speed Insights) locally that may be useful to people in some circumstances.

You can install the Lighthouse Command Line Interface (CLI) locally on your machine quite…

Finally, you really should be collecting your own Real User Monitoring (RUM) data. If you don't want to build your own solution, I would recommend the web-vitals library from Google; you can pipe it to your Google Analytics to make life easy and keep all your data together too!
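Something roughly like this (just a sketch; the exact function names depend on your web-vitals version, older releases use getCLS/getFID/getLCP, and gtag() assumes the gtag.js snippet is already on the page):

```js
// Sketch: reporting Core Web Vitals to Google Analytics with the web-vitals library.
import { onCLS, onINP, onLCP } from 'web-vitals';

function sendToGoogleAnalytics({ name, delta, value, id }) {
  // Send each metric as its own event so GA can aggregate them.
  gtag('event', name, {
    value: delta,        // the change since the metric was last reported
    metric_id: id,       // lets you group events from the same page load
    metric_value: value, // the current total value of the metric
    metric_delta: delta,
  });
}

onCLS(sendToGoogleAnalytics);
onINP(sendToGoogleAnalytics);
onLCP(sendToGoogleAnalytics);
```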

P.S. Any questions on PageSpeed Insights, Lighthouse or Web Vitals, just ask. I know a reasonable amount about them, as you can see from the all-time ranking below on SO (I am Graham Ritchie, for reference πŸ˜‹):

Screenshot: all-time ranking for the "pagespeed-insights" tag on Stack Overflow, showing me at the top.

Laurie

Yup! This post talks about the CLI and the various flags.

GrahamTheDev

Thanks, I rephrased it to share the link to the CLI and reference that you had it in your post. Not sure how I missed your references twice!

Great article BTW!