How to Speed Up Your VS Code Extension - Not Only Webpack

Introduction

Extensions let users add languages, debuggers, and tools to VS Code to support their development workflow. VS Code has a rich extensibility model that lets extensions access the UI and contribute functionality.

Generally, more than one extension is installed in VS Code, so as extension developers we should always care about the performance of our extension to avoid slowing down other extensions or even the main process of VS Code.

Here are some rules we should follow when developing an extension:

  1. Avoid sync methods. Sync methods block the entire Node process until they return. Instead, use async/await as much as possible. If you find it hard to replace sync methods with async ones, consider refactoring (see the small async sketch after this list).

  2. Only require what you need. Some dependencies may be very large; lodash, for example. Usually we do not need every method of lodash, so requiring the entire lodash library doesn't make sense. Every method of lodash has a standalone module, and you can require only the part you need (see the lodash example after this list).

  3. Take the activation rules seriously. In most cases, your extension does not need to activate right away. Do not use * as an activation event. If your extension really needs to be always activated to listen for some events, consider executing the main code within a setTimeout so that it runs as a low-priority job (see the deferral sketch after this list).

  4. Load modules on demand. Using import ... from ... is the common way to require modules; however, it is not always a good choice. For example, a module called request-promise may take too much time to load (1 to 2 seconds on my machine), but we only need to fetch remote resources when certain conditions are met, such as when the local cache has expired.
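
To illustrate rule 1, here is a minimal before/after sketch of replacing a blocking file read with its promise-based counterpart (configPath is just a placeholder):

const fs = require('fs');

// Blocking: freezes the extension host until the whole file has been read
// const data = fs.readFileSync(configPath, 'utf8');

// Non-blocking: VS Code and other extensions keep running while the file is read
async function readConfig(configPath) {
  return fs.promises.readFile(configPath, 'utf8');
}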
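
For rule 2, requiring a single lodash method instead of the whole library looks like this:

// Instead of loading all of lodash:
// const _ = require('lodash');

// Require only the method you need (a single module inside the lodash package):
const cloneDeep = require('lodash/cloneDeep');

// or use the standalone per-method package:
// const cloneDeep = require('lodash.clonedeep');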
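
And for rule 3, deferring the heavy part of activation could look roughly like this (initializeHeavyFeatures is a hypothetical helper for the expensive setup):

export function activate(context) {
  // register cheap commands and listeners right away
  // ...

  // defer the heavy startup work so it runs as a low-priority job
  setTimeout(() => {
    initializeHeavyFeatures(context); // hypothetical helper
  }, 0);
}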

The first three rules above are already followed by many developers. In this article, we will discuss a way to load modules on demand that fits the way we usually write TS or JS imports, and that requires as little change to existing code as possible.

Load Modules on Demand

Meet the habits

Commonly, we use an import statement to load modules at the top of the script, as the code below shows:

import * as os from 'os';

Node loads the specified module synchronously as soon as we import it, and blocks the rest of the code from running.

What we need is a new method, called impor for example, that imports a module without loading it:

const osModule = impor('os'); // osModule is inaccessible as the os module is not loaded yet

To reach this goal, we need to use Proxy. The Proxy object is used to define custom behavior for fundamental operations.

We can customize the get handler to load the module only when one of its properties is accessed.

get: (_, key, receiver) => {
    // load the real module the first time any property is accessed
    if (!mod) {
        mod = require(id);
    }
    return Reflect.get(mod, key, receiver);
}

With the use of Proxy, osModule becomes a Proxy instance, and the os module will be loaded only when we call one of its methods.

const osModule = impor('os'); // os module is not loaded
...
const platform = osModule.platform() // os module loads here
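
Putting the pieces together, a minimal sketch of such a helper could look like this (this is an illustration, not the actual implementation of the published impor module):

function impor(id) {
  let mod; // the real module, cached after the first access
  return new Proxy({}, {
    get: (_, key, receiver) => {
      if (!mod) {
        mod = require(id); // load the real module on first property access
      }
      return Reflect.get(mod, key, receiver);
    }
  });
}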

import {...} from ... is widely used when we only want part of a module. However, it makes Node access the module to read its properties right away. Thus the getter is executed and the module gets loaded at that moment.
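
The same applies to the lazy helper; a quick illustration:

// Destructuring reads the property immediately, which triggers the get handler,
// so the module is loaded right away:
const { platform } = impor('os'); // os is loaded here

// Keeping the namespace object defers loading until the first real use:
const osModule = impor('os');     // os is NOT loaded yet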

Load modules with a background job

Load on demand is still not enough. We can go one step further to improve the user experience: between extension startup and the moment the user executes a command that requires a module, we have enough time to load that module in advance.

An obvious idea is to run a background job that loads modules from a queue.
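
A minimal sketch of such a job (the module ids below are placeholders for your own heavy dependencies):

function preloadInBackground(moduleIds, startDelayMs = 500) {
  const queue = moduleIds.slice();
  const loadNext = () => {
    const id = queue.shift();
    if (!id) {
      return;
    }
    require(id);             // require() caches the module for later use
    setTimeout(loadNext, 0); // yield back to the event loop between loads
  };
  setTimeout(loadNext, startDelayMs); // start after activation has settled
}

// e.g. at the end of activate():
preloadInBackground(['request-promise']);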

Timings

We built an extension called Azure IoT Device Workbench, which makes it easy to code, build, deploy, and debug IoT projects with multiple Azure services and popular IoT development boards.

Because of the big scope Azure IoT Device Workbench covers, the extension is very heavy to activate. Also, it needs to be always activated to listen for USB events and take action when IoT devices connect to the computer.

Figure 1 Activation timing of Azure IoT Device Workbench with lazy load and normal load

We compared lazy load and normal load for Azure IoT Device Workbench in different cases. From top to bottom in Figure 1, the charts are for launch without a workspace open, with a non-IoT project workspace open, and with an IoT project workspace open. The left charts are for cold boot and the right ones for warm boot. Cold boot only happens right after the extension is installed for the first time. After VS Code builds some caches, the extension always launches with warm boot. The X-axis is time in milliseconds, and the Y-axis is the number of loaded modules.

With normal load, the extension is activated at the end of the chart. With lazy load, the extension is activated much earlier for both cold boot and warm boot, especially when VS Code launches without a workspace open.

For cold boot without a workspace open, lazy load starts up ~30x faster, and ~20x faster for warm boot. With a non-IoT project open, lazy load is ~10x faster than normal load for cold boot, and ~20x faster for warm boot. When VS Code opens an IoT project, Azure IoT Device Workbench needs to require a large number of modules to load the project; even so, we still get ~2x speed with cold boot and ~3x speed with warm boot.

Here are the complete timing charts for lazy load:

Figure 2 Complete timing of Azure IoT Device Workbench with lazy load

As in Figure 1, the charts in Figure 2 cover both cold and warm boot with no workspace open, a non-IoT project workspace open, and an IoT project workspace open.

The stage where modules are loaded by the background job after activation shows up very clearly in the charts. The user can hardly notice this small action, and the extension launches quite smoothly.

To make this performance improvement available to all VS Code extension developers, we have published a Node module called impor and use it in Azure IoT Device Workbench. You can apply it to your project with very little code change.
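
For illustration, replacing an eager require with the lazy helper could look roughly like this; the call shape mirrors the impor('os') examples above (check the impor package README for its exact setup):

// Before: loaded eagerly while the extension activates
// const request = require('request-promise');

// After: loaded only on first use
const request = impor('request-promise');

async function fetchDeviceList(url) { // hypothetical command handler
  return request.get(url);            // request-promise is actually loaded here
}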

Module Bundle

Almost all VS Code extensions have Node module dependencies. Because of the way Node modules work, the dependency tree may be very deep. Also, the structure of node_modules may be complex. That is what the Node module black hole is about.

To clean up the Node modules, we need an awesome tool: webpack.

Webpack is a static module bundler for modern JavaScript applications, such as VS Code extensions. When webpack processes an application, it internally builds a dependency graph that maps every module the project needs and generates one or more bundles.

Tree shaking

Tree shaking is a term commonly used in the JavaScript context for dead-code elimination. It relies on the static structure of ES2015 module syntax, i.e. import and export. The name and concept have been popularized by the ES2015 module bundler rollup.

It is very easy to do tree shaking with webpack. The only thing we need is to specify an entry file and its output name; webpack handles the rest.

With tree shaking, untouched files, including JavaScript code, markdown files, etc., are removed. Then webpack merges all code into a single bundled file.
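
A minimal "classic" single-bundle config for a VS Code extension could look like the sketch below (adjust the entry point and output folder to your project layout):

// webpack.config.js
const path = require('path');

module.exports = {
  target: 'node',               // VS Code extensions run in a Node.js context
  entry: './src/extension.ts',  // the extension's entry point
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'extension.js',
    libraryTarget: 'commonjs2'
  },
  externals: {
    vscode: 'commonjs vscode'   // the vscode module is provided by the runtime
  },
  resolve: {
    extensions: ['.ts', '.js']
  },
  module: {
    rules: [{ test: /\.ts$/, exclude: /node_modules/, use: 'ts-loader' }]
  }
};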

Code splitting

Merging all code into one file is not a good idea. To work with load on demand, we should split the code into different parts and only load the part we need.

Now, finding a way to split the code is another problem we need to solve. A feasible solution is to split every Node module into its own file. It would be unacceptable to write every Node module path in the webpack configuration file by hand. Fortunately, we can use npm ls to get all Node modules used in production mode. Then, in the output section of the webpack configuration, we use [name].js as the output to compile every module separately.

Apply bundled modules

When we require a module, happy-broccoli for example, Node will try to find happy-broccoli.js in the node_modules folder. If that file doesn't exist, Node will look for the happy-broccoli folder in node_modules and use the main entry from its package.json, falling back to index.js.

To apply the bundled modules, we can put them into the node_modules folder in the tsc output directory.

If a module is incompatible with webpack bundling, it can simply be copied into the output directory's node_modules folder.

Here's an example of extension project structure:

|- src
|  |- extension.ts
|
|- out
|  |- node_modules
|  |  |- happy-broccoli.js
|  |  |- incompatible-with-bundle-module
|  |     |- package.json
|  |
|  |- extension.js
|
|- node_modules
|  |- happy-broccoli
|     |- package.json
|
|  |- incompatible-with-bundle-module
|     |- package.json
|
|- package.json
|- webpack.config.js
|- tsconfig.json

Without bundling Node modules, there are 4368 files in Azure IoT Device Workbench; only 343 files are left after applying bundled modules.

Webpack config example

'use strict';

const cp = require('child_process');
const fs = require('fs-plus');
const path = require('path');

function getEntry() {
  const entry = {};
  // list all production dependencies as a JSON tree
  const npmListRes = cp.execSync('npm list --only=prod --json', {
    encoding: 'utf8'
  });
  const mod = JSON.parse(npmListRes);
  // modules that are incompatible with bundling are copied as-is
  const unbundledModule = ['impor'];
  for (const unbundled of unbundledModule) {
    const p = 'node_modules/' + unbundled;
    fs.copySync(p, 'out/node_modules/' + unbundled);
  }
  // flatten the dependency tree, then drop duplicates, unbundled modules,
  // and @types/* packages
  const list = getDependenciesFromNpm(mod);
  const moduleList = list.filter((value, index, self) => {
    return self.indexOf(value) === index &&
        unbundledModule.indexOf(value) === -1 &&
        !/^@types\//.test(value);
  });

  // one webpack entry per module, so each module compiles to its own file
  for (const moduleName of moduleList) {
    entry[moduleName] = './node_modules/' + moduleName;
  }

  return entry;
}

function getDependenciesFromNpm(mod) {
  let list = [];
  const deps = mod.dependencies;
  if (!deps) {
    return list;
  }
  // walk the npm list tree recursively and collect every dependency name
  for (const m of Object.keys(deps)) {
    list.push(m);
    list = list.concat(getDependenciesFromNpm(deps[m]));
  }
  return list;
}

/**@type {import('webpack').Configuration}*/
const config = {
    target: 'node',
    entry: getEntry(),
    output: {
        path: path.resolve(__dirname, 'out/node_modules'),
        filename: '[name].js',
        libraryTarget: "commonjs2",
        devtoolModuleFilenameTemplate: "../[resource-path]",
    },
    resolve: {
        extensions: ['.js']
    }
}

module.exports = config;

Comparison with the classic webpack solution

Instead of bundling the entire extension, bundling each module separately brings a big benefit. It is very possible that the extension throws dozens of errors after being webpacked. Splitting every module into its own file makes those errors easier to debug. Also, loading specific bundled modules on demand minimizes the impact on performance.

Experiment Results

Module bundling is applied to Azure IoT Device Workbench together with lazy load, and compared with normal load.

Figure 3 Activation timing of Azure IoT Device Workbench with lazy load with bundled modules and normal load

Module bundling has decreased the activation time sharply. For cold boot, lazy load in some cases even takes less time than normal load needs just to load all modules completely.

                              Normal Load   Webpack Classic Solution*   Lazy Load   Lazy Load with Bundled Modules**
No workspace, cold boot       19474 ms      1116 ms                     599 ms      196 ms
No workspace, warm boot       2713 ms       504 ms                      118 ms      38 ms
Non-IoT workspace, cold boot  11188 ms      1050 ms                     858 ms      218 ms
Non-IoT workspace, warm boot  4825 ms       530 ms                      272 ms      102 ms
IoT workspace, cold boot      15625 ms      1178 ms                     7629 ms     2001 ms
IoT workspace, warm boot      5186 ms       588 ms                      1513 ms     517 ms

*, ** Some modules required by Azure IoT Device Workbench are incompatible with webpack and are not bundled.
Table 1 Activation time of Azure IoT Device Workbench in different statuses

The activation time shown in Table 1 is measured from the very beginning of the entry into the extension to the end of the activate function:

// start of timing
import * as vscode from 'vscode';
...
export async function activate(context: vscode.ExtensionContext) {
    ...
    // end of timing
}
...

Usually, the time before activation is longer than the startup time shown in the VS Code Running Extensions page. For example, when opening an IoT workspace with warm boot, the activation time is 517 ms in the table, but the startup time is ~200 ms in the Running Extensions page.

The activation time of the classic webpack solution only depends on the boot mode, because all modules are always loaded in the same way. When applying lazy load to Azure IoT Device Workbench, it starts up much faster without a workspace open than with an IoT workspace open, with or without bundled modules. When we open an IoT workspace, most modules are required, so the benefit of lazy load is not obvious, and lazy load with bundled modules has a similar activation time to the classic webpack solution.

Conclusion

In this article, a method to load bundled modules on demand is proposed. A heavy extension, Azure IoT Device Workbench, is tested with this method in multiple cases, and its startup speed increases by up to dozens of times. In some cases, this method also performs better than the classic webpack solution.
