
Rui Wu


Make your Vite applications run a little faster


Background

While Vite runs fast by default, performance issues can creep in as a project's requirements grow. This article presents optimizations from several perspectives to make your Vite application run a little faster, addressing problems such as slow server startup, slow page loads, and slow builds.

Browser

Extensions

Most of the time we view and debug our Vite applications in a browser. Some browser extensions (e.g. ad blockers such as uBlock Origin and AdGuard) may interfere with requests, resulting in slower startups and refreshes. In that case, it is recommended to use an extension-free browser profile dedicated to development, or to use incognito (private) mode, to get faster speeds.

Developer Tools

In addition, the Vite development server relies on strong caching of pre-bundled dependencies and fast 304 responses for source code. Disabling the cache while the Developer Tools are open can significantly slow down startup and full page reloads, so it is recommended to keep the "Disable cache" option turned off to ensure faster startup and full page refresh.

(Image: the "Disable cache" option in the browser Developer Tools)

Vite configuration

Plugins

Official Vite plugins are continuously optimized for performance. For example, vitejs/vite-plugin-react reduces Node.js startup time by dynamically importing large dependencies.

Community plugins may not be as concerned about performance, which in turn affects the developer experience.

As an example, vite-plugin-eslint is still recommended in many community articles, but as of today it hasn't been updated in two years, and support for newer versions of Vite and ESLint is mediocre.

(Image: the vite-plugin-eslint repository, last updated two years ago)

On the other hand, it forces ESLint checks to run in the buildStart and transform hooks. Checks in buildStart can cause long waits while the development server starts, delaying the moment the site can be opened in the browser. Checks in the transform hook make some files load more slowly than others, which shows up as a more pronounced request waterfall when the browser loads the site and as noticeable lag during development.

Although ESLint validation is still achieved, the development experience degrades and development slows down. Is it worth it? That is a question worth considering.

There are many ways to identify this kind of performance issue, such as running vite --debug plugin-transform or using vite-plugin-inspect to see how long each plugin spends transforming files. Alternatively, you can visit the site after running vite --profile, press p + enter in the terminal to record a .cpuprofile, and then use a tool like speedscope to inspect the profile and identify bottlenecks.
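For example, a minimal vite-plugin-inspect setup looks roughly like this (a sketch assuming the plugin is installed as a dev dependency; it exposes its UI at /__inspect/ on the dev server):

// vite.config.ts
import { defineConfig } from 'vite';
import inspect from 'vite-plugin-inspect';

export default defineConfig({
  plugins: [
    // Visit /__inspect/ on the dev server to see per-plugin transform times
    inspect(),
  ],
});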

Once you've identified a performance issue, you can consider forking and improving the plugin in question, or simply using a replacement. For the example above, you might replace it with vite-plugin-checker, @nabla/vite-plugin-eslint, or vite-plugin-eslint2, all of which run ESLint checks asynchronously, so they barely impact performance and the development experience while still serving the purpose of linting.
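As a sketch, here is roughly how vite-plugin-checker runs ESLint outside the transform pipeline; the lint command and globs are placeholders to adapt to your project:

// vite.config.ts
import { defineConfig } from 'vite';
import checker from 'vite-plugin-checker';

export default defineConfig({
  plugins: [
    checker({
      // ESLint runs in a separate worker, so page loads are not blocked
      eslint: {
        lintCommand: 'eslint "./src/**/*.{ts,tsx}"',
      },
    }),
  ],
});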

Disclosure: I am the author of vite-plugin-eslint2; you're welcome to reach out and discuss.

Toolchain

Streamlining your toolchain is a great way to speed things up.

As an example, many people use SCSS primarily for variables and nesting, but native CSS variables and CSS nesting have both officially landed in browsers.

(Image: browser support for CSS variables)

(Image: browser support for CSS nesting)

If production needs to support slightly older browsers that don't understand nesting, it's entirely possible to use PostCSS to handle nesting only in the production build, so it doesn't slow down the Vite development server. Very old browsers are neither secure nor fast and offer a poor experience; users should be gradually guided to move off them to improve both the user experience and the development experience. Check out browser-update if you're interested.
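A minimal sketch of that idea, assuming postcss-nesting is installed: the nesting plugin only runs for the production build, while the dev server serves native nesting untouched.

// vite.config.ts
import { defineConfig } from 'vite';
import postcssNesting from 'postcss-nesting';

export default defineConfig(({ command }) => ({
  css: {
    postcss: {
      // Flatten native CSS nesting only when building for production;
      // the dev server keeps serving the nested CSS as-is.
      plugins: command === 'build' ? [postcssNesting()] : [],
    },
  },
}));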

Using lower-level, more "raw" tooling is also a good way to speed things up. The SWC website shows it to be 20 to 70 times faster than Babel, and the advantage holds up in complex real-world applications. You can use @vitejs/plugin-react-swc instead of @vitejs/plugin-react, Lightning CSS instead of PostCSS, SWC or esbuild instead of Babel, and so on, to achieve better performance.

(Image: SWC speed comparison)

(Image: Lightning CSS speed comparison)
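For a React project, swapping in these tools might look roughly like this (a sketch assuming @vitejs/plugin-react-swc and lightningcss are installed; css.transformer is still experimental in Vite):

// vite.config.ts
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react-swc';

export default defineConfig({
  // SWC replaces Babel for React transforms
  plugins: [react()],
  css: {
    // Lightning CSS replaces PostCSS for CSS processing (experimental)
    transformer: 'lightningcss',
  },
  build: {
    // Lightning CSS also handles CSS minification in the build
    cssMinify: 'lightningcss',
  },
});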

Dependency pre-bundling

Vite converts dependencies provided as CommonJS or UMD into ES modules, and also converts ESM dependencies with many internal modules into a single module - this is Vite's dependency pre-bundling.

The former is for compatibility reasons, which we won't discuss in detail here. The latter is for performance reasons; without this step, when we execute import { debounce } from 'lodash-es', the browser will make more than 600 HTTP requests at the same time, corresponding to more than 600 modules of lodash-es! Even though the server can handle them easily, the large number of requests causes network congestion on the browser side and makes the page load significantly slower.

This is why we need dependency pre-bundling. Vite does it automatically and transparently, and we can also explicitly specify which dependencies should be pre-bundled via configuration.

// vite.config.ts
import { defineConfig } from 'vite';
import pkg from './package.json';

export default defineConfig({
  optimizeDeps: {
    // Explicitly pre-bundle every direct dependency up front,
    // instead of letting Vite discover them on demand while browsing
    include: Object.keys(pkg.dependencies),
  },
});

Warming up

Warming up is an often overlooked optimization; I've rarely seen it mentioned in articles about Vite, yet it can effectively improve the development experience and reduce waiting time.

By default, the Vite development server only transforms files on demand, as the browser requests them, which lets it start quickly and only do work for files that are actually used.

However, this approach can leave the Vite development server idle and then make you wait when switching pages, leading to a poor development experience. If certain files are expected to be requested soon, they can be transformed and cached in advance to improve page load speed. This practice is known as warming up.

Frequently used files can be found by running vite --debug transform and examining the logs, then adding them to the Vite configuration.

vite:transform 28.72ms /@vite/client +1ms
vite:transform 62.95ms /src/components/BigComponent.vue +1ms
vite:transform 102.54ms /src/utils/big-utils.js +1ms
// vite.config.ts
import { defineConfig } from 'vite';

export default defineConfig({
  server: {
    warmup: {
      clientFiles: [
        './src/components/BigComponent.vue',
        './src/utils/big-utils.js',
      ],
    },
  },
});

Note that only frequently used files should be warmed up, to avoid overloading the Vite development server at startup. That said, I still prefer to just warm them all up straight away. 🤪
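If you do want to warm up broadly, clientFiles accepts glob patterns; the globs below are placeholders for your own project structure:

// vite.config.ts
import { defineConfig } from 'vite';

export default defineConfig({
  server: {
    warmup: {
      // Warming many files trades extra startup work
      // for fewer on-demand transform waits later
      clientFiles: ['./src/components/**/*.vue', './src/utils/**/*.js'],
    },
  },
});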

The latest version of Nuxt already configures this feature; you can also check Vite / Configuration / server.warmup for more information.

For 4.3 <= Vite < 5, vite-plugin-warmup is required; lower versions are not supported.
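Based on that plugin's README, its usage is roughly as follows (a sketch; check the README for the exact API):

// vite.config.ts
import { defineConfig } from 'vite';
import { warmup } from 'vite-plugin-warmup';

export default defineConfig({
  plugins: [
    // Same idea as server.warmup, provided as a plugin for older Vite versions
    warmup({
      clientFiles: ['./src/components/BigComponent.vue', './src/utils/big-utils.js'],
    }),
  ],
});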

Project

Path resolution

It may be hard to anticipate how expensive resolving import paths can be.

When you write import './Component' to import ./Component.jsx, Vite runs the following steps to resolve it:

  • Check if ./Component exists; it does not.
  • Check if ./Component.mjs exists; it does not.
  • Check if ./Component.js exists; it does not.
  • Check if ./Component.mts exists; it does not.
  • Check if ./Component.ts exists; it does not.
  • Check if ./Component.jsx exists; it does!

Resolving this one simple import path takes six filesystem checks! The more implicit imports there are, the more time is spent resolving paths.

Therefore, it's usually better to write import paths explicitly, e.g. import './Component.jsx'. You can also narrow down resolve.extensions to minimize the overall number of filesystem checks, but make sure the narrowed list still works for files in node_modules.
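For instance, a narrowed list might look like the sketch below. Vite's documented default is ['.mjs', '.js', '.mts', '.ts', '.jsx', '.tsx', '.json'], so only trim it if nothing in your project or its dependencies relies on the removed extensions.

// vite.config.ts
import { defineConfig } from 'vite';

export default defineConfig({
  resolve: {
    // Fewer extensions means fewer filesystem checks per implicit import,
    // but the list must still cover everything node_modules needs
    extensions: ['.js', '.ts', '.jsx', '.tsx'],
  },
});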

If you want to use ESLint to enforce explicit import paths, you can configure the import/extensions or import-x/extensions rule, as shown below.

"import/extensions": ["warn", "ignorePackages"]
"import-x/extensions": ["warn", "ignorePackages"]

Barrel files

Barrel files are files that re-export the APIs of other files in the same directory. For example:

// src/utils/index.js
export * from './color.js';
export * from './dom.js';
export * from './slash.js';

When you only import a single API, such as import { slash } from './utils', all the files referenced by the barrel file still have to be fetched and transformed, because they may contain the slash API and may also contain side effects that run at initialization time. This means that on initial page load you are loading more files than you need, which leads to slower page loads. As the number of modules grows, the loading time increases, and tests have shown it growing near-exponentially with the number of modules.

(Image: load time growing with module count across JavaScript tools)

If possible, you should avoid barrel files and import individual APIs directly. You can read Speeding up the JavaScript ecosystem - The barrel file debacle for more details and data; the image above is from that article.
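Using the example above, that simply means importing from the concrete module instead of the barrel:

// Before: goes through the barrel and loads color.js, dom.js and slash.js
import { slash } from './utils';

// After: loads only the module that actually exports slash
import { slash } from './utils/slash.js';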

Summary

This article was written with reference to Vite Official Documentation / Performance, and I would like to express my heartfelt thanks to the Vite team. 🙏

This article describes different optimization approaches from three aspects: browser, Vite configuration, and project, with the goal of making Vite apps run a little bit faster and solving problems such as slow server startups, slow page loads, and slow builds.

However, it's important to note that these methods are not a panacea, and they mostly target the development experience rather than performance in a production environment. To optimize production performance, you need to consider code splitting, static asset optimization, SSR, CDNs, preloading, prefetching, CSS optimization, caching, Web Workers, reducing DOM operations, and so on. These topics are not expanded on here; if you're interested, leave a comment asking me to write about them.

It is worth mentioning that Vite is not everything. As the Farm documentation points out, issues such as inconsistencies between development and production and difficulties with chunk-splitting optimization still need to be resolved, which requires us to wait patiently or to participate actively. If you can't wait any longer, we recommend checking out Farm or Rsbuild.

I hope this article can bring you some inspiration and help!
