Note this post is circa Node version 15. Content has not been updated. There have been several changes/improvements and I've learned quite a bit more since then.
NodeJS has been around for a while, but recently the push for standardizing with browsers and an explosion of new web APIs mean the way we write JS is changing. Unfortunately, not all of the resources out there have caught up, so here are a couple of tips for making your node code fresh and up-to-date. Note that this won't be a post about build tools and libraries but rather native functionality.
Use ESM modules
This is as easy as setting `"type": "module"` in your package.json. This allows you to use ESM, which has a number of advantages and lets you more easily share code with the browser and modern bundlers with less overhead.
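For example, a minimal package.json opting into ESM (the name and entrypoint here are placeholders):

```json
{
  "name": "my-package",
  "version": "1.0.0",
  "type": "module",
  "main": "index.js"
}
```

With this in place, `.js` files in the package are treated as ESM by default.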
"But does that mean I can still produce a CJS package for downstream clients?" Yes, but it's a little more involved. There are two fields that indicate the entrypoint to your package: `main` and `module`. `main` will be used by clients that want CJS (eg using `require`), while `module` will be used by clients that want ESM. By defining both you can have a package that exports both. A common way to do this is to write the module in ESM and then use a tool like rollup to output two bundles:
//rollup.config.js
export default {
  input: "src/foo.js",
  output: [
    {
      dir: "dist",
      format: "esm",
      entryFileNames: "[name].esm.js",
    },
    {
      dir: "dist",
      format: "cjs",
      entryFileNames: "[name].js",
    }
  ]
};
Then you can run `rollup -c` and set the entrypoints `main` and `module`:
//package.json
{
  "name": "multi-module",
  "version": "1.0.0",
  "description": "",
  "main": "dist/foo.js",
  "module": "dist/foo.esm.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "rollup": "^2.33.1"
  }
}
Now downstream users can choose which type of module they prefer.
If you choose not to use the bundler approach you may need to duplicate work and figure out how to get your dependencies working in each type of module. This can be especially hard if you are doing weird things with the module cache (eg re-routing modules for test stubs) as ESM modules do not give you access to it. If you are starting a new project and can only pick one, I'd still suggest ESM for its ease of conversion.
You can use ESM when dealing with the node standard library too:
import { join } from "path";
Newer versions of node even allow you to directly use CJS modules via `import`, which is extremely nice as there are still tons of packages that haven't converted yet. There are some limitations though: packages using dynamic `require` will not work, but try it first and see how it goes.
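As a quick sketch of what this looks like (the CJS file is generated on the fly here as a stand-in for any unconverted package on disk):

```javascript
// ESM module demonstrating importing CommonJS code.
import { promises as fs } from "fs";
import { pathToFileURL } from "url";
import { resolve } from "path";

// Write a tiny CJS module to disk as a stand-in for an unconverted package:
const cjsPath = resolve("math-cjs.cjs");
await fs.writeFile(cjsPath, "module.exports = { double: n => n * 2 };");

// Dynamic import understands CJS; module.exports becomes the default export:
const { default: math } = await import(pathToFileURL(cjsPath));
console.log(math.double(21)); // 42
```

For a published package you'd simply write `import pkg from "some-cjs-package";` instead of the dynamic import.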
When you can't use ESM
There's still a way to get those packages working! NodeJS provides a module called `module` (yeah, you'll probably forget it because the name is so generic). However, it has some useful stuff, namely `createRequire`. `createRequire` is a factory function that creates the `require` function you used previously for CJS, but it works inside of ESM modules. You can use it like this:
import { createRequire } from "module";
const require = createRequire(import.meta.url);
const react = require("react");
Use import.meta.url to get the current file path
As you saw above we needed to pass `import.meta.url` to the `createRequire` function. So what is this? `import` is a special keyword used for dynamic imports, but it also has a property called `meta` that holds info about the current module. As of this writing the only thing there is the URL the module came from. This is not node-specific: it works in browsers too, unlike globals like `__dirname`.
`createRequire` needs the path of the current module and is set up so you can use the value of `import.meta.url` directly. This is also how you can get `__dirname` and `__filename` from an ESM module:
import { dirname } from "path";
import { fileURLToPath } from "url";
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
Keep your paths as URLs
Another new thing above was `fileURLToPath`. `import.meta.url` gives you a file URL rather than a platform path string. `URL` is an object for building and manipulating URLs and is standard in browsers. This is a good way to handle paths in a platform-agnostic way since *NIX and Windows aren't friends when it comes to paths, but everyone understands URLs! The problem is that many legacy node APIs still expect strings for paths, such as the APIs found in `fs`. To get around this, the `url` module has some handy functions to convert between path strings and URLs:
import { fileURLToPath, pathToFileURL } from "url";
If you are curious what you get when you `toString()` a URL:
file:///User/ndesmic/path/to/file.ext
They are file URLs. You can even add query strings, though I don't know what purpose this would serve for local files. URLs also have a neat trick in the constructor:
const myFile = new URL("path/to/file.ext", import.meta.url);
This constructs the path off of the base url, which is handy for getting relative paths. Just remember you need a URL for the base, not a string.
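Putting these together, here's a small sketch of round-tripping between URLs and path strings (the file names are hypothetical):

```javascript
import { fileURLToPath, pathToFileURL } from "url";

// Build a URL relative to the current module, then convert it to a
// platform path string for legacy string-based APIs:
const dataUrl = new URL("data/items.json", import.meta.url);
const dataPath = fileURLToPath(dataUrl);

// Converting the path back yields an equivalent URL:
console.log(pathToFileURL(dataPath).href === dataUrl.href); // true
```

`dataPath` will be a `/`-separated path on *NIX and a `\`-separated one on Windows, while the URL form is the same everywhere.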
Use async fs
`fs` is super useful, but writing callbacks is painful. Instead, use the async API! It can be imported like so:
import { promises as fs } from "fs";
The import is a little weird because the `fs` namespace has a property called `promises` which contains the whole promisified API. This also means you can't just grab single methods from the import. Still, it's much cleaner to write:
import { promises as fs } from "fs";
const text = await fs.readFile("my-file.ext", "utf-8");
Note that you should always use the async versions, both for performance and for possible browser polyfills. It's so close to writing `readFileSync` that you should never, ever need to use that.
Top-level await
Before, if you wanted to use something async at the top level of a module, you needed to wrap it in an async IIFE:
(async () => {
  const result = await asyncFunction();
})();
Modern node has top-level await meaning even things at the top level can be awaited. No more boilerplate!
Be aware that this only works for ESM modules.
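For instance, the same pattern with top-level await (using a hypothetical stand-in for any promise-returning call):

```javascript
// Top-level await in an ESM module: no wrapper function required.
// asyncFunction is a made-up placeholder for any async work.
const asyncFunction = () => Promise.resolve(42);

const result = await asyncFunction();
console.log(result); // 42
```

`result` is now available to the rest of the module, unlike the IIFE version where it was trapped in the wrapper's scope.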
Dealing with JSON
JSON is a very useful format; node recognized this, but unfortunately things are a little weird. You might have just used plain-old `require` to import it:
const config = require("./config.json");
This doesn't work with ESM. Node does have an `--experimental-json-modules` flag you can enable, which lets you do things like:
import config from "config.json";
It's limited in that only the `default` export can be imported; you can't pull out individual keys. I'd also suggest not using it at this time as it's unlikely to line up with the browser implementations and will probably change a little.
You can use `createRequire` as shown above to import JSON the way you always have, but as with all things `require`, the main problem is that it's synchronous.
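A minimal sketch of the `createRequire` approach (the config file is generated first just to make the snippet self-contained; the name and contents are made up):

```javascript
import { createRequire } from "module";
import { writeFileSync } from "fs";
import { resolve } from "path";

// Stand-in for a config file you'd already have on disk:
const configPath = resolve("sample-config.json");
writeFileSync(configPath, JSON.stringify({ port: 8080 }));

// A CJS-style require understands JSON out of the box:
const require = createRequire(import.meta.url);
const config = require(configPath);
console.log(config.port); // 8080
```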
What I typically do though is to manually import it properly and asynchronously.
import { promises as fs } from "fs";

function readJson(path){
  return fs.readFile(path, "utf-8").then(x => JSON.parse(x));
}
function writeJson(path, obj){
  return fs.writeFile(path, JSON.stringify(obj));
}
These small utilities are useful to have around. The main benefit is that if we are in an environment like a browser that wants to polyfill `fs` (maybe using the file picker API?), the polyfill needs to be async to work properly.
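To make the usage concrete, here's a self-contained sketch (it redefines the same helpers so the snippet runs on its own; the file name is arbitrary):

```javascript
import { promises as fs } from "fs";

// Same shape as the utilities above:
const readJson = (path) => fs.readFile(path, "utf-8").then((x) => JSON.parse(x));
const writeJson = (path, obj) => fs.writeFile(path, JSON.stringify(obj));

await writeJson("settings.json", { theme: "dark" });
const settings = await readJson("settings.json");
console.log(settings.theme); // "dark"
```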
Try Deno
If you are looking to write modern, browser-standard JS on the backend and in the terminal, also take a look at Deno. It's very similar to node but lacks a lot of the legacy node-specific features and adds a few new ones based on browser APIs. You might find that most of Deno's APIs look like what you would expect from future node code, including standard ESM, promises, native fetch, etc. In fact, even if you don't want to use Deno yourself, by writing modern code you increase the chances that your code will also work for others using Deno with less fuss.