Best Practice Tips for Writing Modern NodeJS


Note to future readers: this article is circa Node version 15.

NodeJS has been around for a while, but the recent push toward standardizing with browsers and an explosion of new web APIs mean the way we write JS is changing. Unfortunately, not all of the resources out there have caught up, so here are a couple of tips to make your Node code fresh and up-to-date. Note that this won't be a post about build tools and libraries, but rather native functionality.

Use ESM modules

This is as easy as setting "type": "module" in your package.json. This allows you to use ESM, which has a number of advantages and lets you more easily share code with the browser and modern bundlers, with less overhead.
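A minimal package.json opting in to ESM looks like this:

```json
{
  "type": "module"
}
```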

"But does that mean I can still produce a CJS package for downstream clients?" Yes, but it's a little more involved. There are 2 fields that indicate the entrypoint to your package main and module. main will be used by clients that want CJS (eg using require), module will be used by clients that want ESM. By defining both you can have a module that exports both. A common way to do this is to write the module in ESM and then use a tool like rollup to export two bundles:

```js
// rollup.config.js
export default {
    input: "src/foo.js",
    output: [
        {
            dir: "dist",
            format: "esm",
            entryFileNames: "[name].esm.js"
        },
        {
            dir: "dist",
            format: "cjs",
            entryFileNames: "[name].js"
        }
    ]
};
```

Then you can run rollup -c and set the main and module entrypoints in package.json:

  "name": "multi-module",
  "version": "1.0.0",
  "description": "",
  "main": "dist/foo.js",
  "module": "dist/foo.esm.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "rollup": "^2.33.1"
Enter fullscreen mode Exit fullscreen mode

Now downstream users can choose which type of module they prefer.

If you choose not to use the bundler approach, you may need to duplicate work and figure out how to get your dependencies working in each type of module. This can be especially hard if you are doing weird things with the module cache (e.g. re-routing modules for test stubs), as ESM modules do not give you access to it. If you are starting a new project and can only pick one, I'd still suggest ESM for its ease of conversion.

You can use ESM when dealing with the node standard library too:

```js
import { join } from "path";
```

Newer versions of node even allow you to directly use CJS modules via import, which is extremely nice as there are still tons of packages that haven't converted yet. There are some limitations, though: packages using dynamic require will not work, but try it first and see how it goes.
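For instance, core modules (which are CJS under the hood) can be imported either way: Node exposes the CJS module.exports object as the default export, and it also detects statically analyzable properties as named exports. A small sketch:

```js
// A CJS module's exports object becomes the default export...
import fs from "fs";
// ...and statically detectable properties are also available as named imports.
import { readFileSync } from "fs";

console.log(typeof fs.readFileSync); // "function"
console.log(fs.readFileSync === readFileSync); // true
```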

When you can't use ESM

There's still a way to get those packages working! NodeJS provides a module called module (yes, you'll probably forget it because the name is so generic). It has some useful stuff, namely createRequire. createRequire is a factory function that creates the require function you previously used for CJS, but it works inside of ESM modules. You can use it like this:

```js
import { createRequire } from "module";
const require = createRequire(import.meta.url);

const react = require("react");
```

Use import.meta.url to get the current file path

As you saw above, we needed to pass import.meta.url to the createRequire function. So what is this? import is a special keyword used for dynamic imports, but it also has a property called meta that holds info about the current module. As of this writing, the only thing there is the url the module came from. This is not node-specific; it works in browsers too, unlike globals like __dirname. createRequire needs the path of the current module and is set up so you can pass the value of import.meta.url directly. This is also how you can reconstruct __dirname and __filename in an ESM module:

```js
import { dirname } from "path";
import { fileURLToPath } from "url";

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
```
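With those two lines in place, existing path-based code keeps working. For example, resolving a sibling file (the file name here is hypothetical):

```js
import { dirname, join } from "path";
import { fileURLToPath } from "url";

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

// "config.json" is a hypothetical sibling file of this module.
const configPath = join(__dirname, "config.json");
console.log(configPath.endsWith("config.json")); // true
```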

Keep your paths as URLs

Another new thing above was fileURLToPath. import.meta.url gives you a URL, not a plain string. URL is an object for building and manipulating URLs and is standard in browsers. This is a good way to handle paths in a platform-agnostic way, since *NIX and Windows aren't friends when it comes to paths, but everyone understands URLs! The problem is that many legacy node APIs still depend on string paths, such as the APIs found in fs. To get around this, the url module has some handy functions to convert between path strings and URLs:

```js
import { fileURLToPath, pathToFileURL } from "url";
```
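A quick sketch of the round trip (output shown for a *NIX machine; Windows paths convert differently):

```js
import { fileURLToPath, pathToFileURL } from "url";

// path string → URL
const url = pathToFileURL("/tmp/example.txt");
console.log(url.href); // "file:///tmp/example.txt"

// URL → path string
console.log(fileURLToPath(url)); // "/tmp/example.txt"
```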

If you are curious what happens when you toString() a URL: they are file URLs (the file:// scheme). You can even add query strings, though I don't know what purpose that would serve for local files. URLs also have a neat trick in the constructor:

```js
const myFile = new URL("path/to/file.ext", import.meta.url);
```

This constructs the path off of the base URL, which is handy for getting relative paths. Just remember the base needs to be a URL (or an absolute URL string), not a plain file path.
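For example, assuming the current module lives at file:///app/src/index.js (a stand-in for import.meta.url so the example is deterministic), resolution works just like relative URLs in a browser:

```js
// Stand-in base URL for the example.
const base = new URL("file:///app/src/index.js");

// Sibling file in the same directory.
const sibling = new URL("./helper.js", base);
console.log(sibling.href); // "file:///app/src/helper.js"

// One directory up, then back down.
const upOne = new URL("../data/config.json", base);
console.log(upOne.href); // "file:///app/data/config.json"
```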

Use async fs

fs is super useful but writing callbacks is painful. Instead use the async API! This can be done like so:

```js
import { promises as fs } from "fs";
```

The import is a little weird because the fs namespace has a property called promises which contains the whole promisified API. This means you can't grab single methods directly. Still, it's much cleaner to write:

```js
import { promises as fs } from "fs";
const contents = await fs.readFile("my-file.ext", "utf-8");
```

Note that you should always use the async versions, both for performance and for possible browser polyfills. It's so close to writing readFileSync that you should never, ever need the sync version.
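A small sketch of a write/read round trip with the promise API (this uses top-level await, covered in the next section; the temp file name is arbitrary):

```js
import { promises as fs } from "fs";
import { tmpdir } from "os";
import { join } from "path";

// Write then read back a temp file, all without callbacks.
const file = join(tmpdir(), "fs-promises-demo.txt");
await fs.writeFile(file, "hello");
const text = await fs.readFile(file, "utf-8");
console.log(text); // "hello"
await fs.unlink(file); // clean up
```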

Top-level await

Before, if you wanted to use something async at the top level, you needed to wrap it in an async IIFE:

```js
(async () => {
    const result = await asyncFunction();
})();
```

Modern node has top-level await meaning even things at the top level can be awaited. No more boilerplate!

Be aware that this only works for ESM modules.
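A minimal sketch, where a setTimeout-based delay stands in for any real async work:

```js
// Any promise can be awaited directly at the top level of an ESM module.
const delay = (ms, value) =>
    new Promise(resolve => setTimeout(() => resolve(value), ms));

const result = await delay(10, "done");
console.log(result); // "done"
```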

Dealing with JSON

JSON is a very useful format; node recognized this, but unfortunately things are a little weird here. You might have previously used plain-old require to import it:

```js
const config = require("./config.json");
```

This doesn't work with ESM. Node does have an --experimental-json-modules flag you can enable. This lets you do things like:

```js
import config from "./config.json";
```

It's limited in that only the default export can be imported; you can't pull out individual keys. I'd also suggest not using it at this time, as it's unlikely to line up with the browser implementation and will probably change a little.

You can use createRequire as shown above to import JSON the way you always have, but as with all things require, the main problem is that it's synchronous.

What I typically do instead is read and parse it manually and asynchronously:

```js
import { promises as fs } from "fs";

function readJson(path) {
    return fs.readFile(path, "utf-8").then(x => JSON.parse(x));
}

function writeJson(path, obj) {
    return fs.writeFile(path, JSON.stringify(obj));
}
```

These small utilities are useful to have around. The main benefit is that if we are in an environment like a browser that wants to polyfill fs (maybe using the file picker API?), it needs to be async to do so properly.
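A usage sketch of those helpers, round-tripping a config object through a temp file (the file name is arbitrary):

```js
import { promises as fs } from "fs";
import { tmpdir } from "os";
import { join } from "path";

function readJson(path) {
    return fs.readFile(path, "utf-8").then(x => JSON.parse(x));
}

function writeJson(path, obj) {
    return fs.writeFile(path, JSON.stringify(obj));
}

// Write a config object, then read it back as parsed JSON.
const file = join(tmpdir(), "demo-config.json");
await writeJson(file, { debug: true, retries: 3 });
const config = await readJson(file);
console.log(config.retries); // 3
await fs.unlink(file); // clean up
```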

Try Deno

If you are looking to write modern, browser-standard JS on the backend and in the terminal, also take a look at Deno. It's very similar to node but lacks a lot of the legacy node-specific features and adds a few new ones based on browser APIs. You might find that most of Deno's APIs look like what you would expect from future node code, including standard ESM, promises, native fetch, etc. In fact, even if you don't want to use Deno yourself, writing modern code increases the chances that it will also work for others using Deno with less fuss.
