Build TypeSafe Node API using tRPC, Fastify, Kysely and Atlas CLI

Francisco Mendes

Introduction

In today's article we are going to create a fully type-safe CRUD API. We will cover not only the development environment but also the production environment, using some tooling to help with the build process, linting, formatting, and more.

The idea is that by the end of the article you will have a base that you can easily extend, adding more procedures without having to worry about other configuration.

Prerequisites

Before going further, you need:

- Node.js and Yarn installed
- The Atlas CLI installed
- A PostgreSQL database up and running

In addition, you are expected to have basic knowledge of these technologies.

Getting Started

API Setup

Our first step will be to create the project folder:

mkdir api
cd api

Then let's start a new project:

yarn init -y

Now we need to install the base development dependencies:

yarn add -D @types/node typescript

Now let's create the following tsconfig.json:

{
  "compilerOptions": {
    "target": "esnext",
    "module": "CommonJS",
    "allowJs": true,
    "removeComments": true,
    "resolveJsonModule": true,
    "typeRoots": ["./node_modules/@types"],
    "sourceMap": true,
    "outDir": "dist",
    "strict": true,
    "lib": ["esnext"],
    "baseUrl": ".",
    "forceConsistentCasingInFileNames": true,
    "esModuleInterop": true,
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "moduleResolution": "Node",
    "skipLibCheck": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules"]
}

With TypeScript configured, we can install the tooling dependencies:

yarn add -D tsx tsup rome

Let's initialize the Rome configuration:

yarn rome init

After running the init command, let's make the following changes to rome.json:

{
  "$schema": "./node_modules/rome/configuration_schema.json",
  "linter": {
    "enabled": true,
    "rules": {
      "recommended": true
    }
  },
  "formatter": {
    "enabled": true,
    "formatWithErrors": false,
    "indentStyle": "space",
    "indentSize": 2,
    "lineWidth": 80,
    "ignore": []
  }
}

Now in package.json let's add the following scripts:

{
  "scripts": {
    "dev": "tsx watch src/main.ts",
    "build": "tsup src",
    "lint": "rome check src --apply",
    "format": "rome format src --write",
    "start": "node dist/main.js"
  }
}

Database Setup

Inside our project directory, let's create a folder called schema/:

mkdir schema
cd schema

Assuming you have a PostgreSQL database up and running, let's run the following command to inspect it:

atlas schema inspect -u "postgres://docker:docker@localhost:5432/whale?sslmode=disable" > schema.hcl

The above command inspects the database and creates a file called schema.hcl, to which we then add the schema of our tables:

# @/schema/schema.hcl
schema "public" {
}

table "dogs" {
  schema = schema.public
  column "id" {
    null = false
    type = uuid
    default = sql("gen_random_uuid()")
  }
  column "name" {
    null = false
    type = varchar(100)
  }
  column "isGoodBoy" {
    null = false
    type = boolean
  }
  column "breed" {
    null = false
    type = varchar(100)
  }
  primary_key {
    columns = [column.id]
  }
}

With the database schema defined, we need to apply the migrations to the database by running the following command:

atlas schema apply \
  -u "postgres://docker:docker@localhost:5432/whale?sslmode=disable" \
  --to file://schema.hcl

After confirming that we want to apply the migrations, we can move on to the next step.

Build Database Connector

First, let's install the following dependencies:

yarn add kysely pg
yarn add -D kysely-codegen @types/pg

Then let's create a .env file with a variable containing the database connection string:

DATABASE_URL=postgres://docker:docker@localhost:5432/whale?sslmode=disable

Again in package.json let's add a new script:

{
  "scripts": {
    // ...
    "generate": "kysely-codegen"
  }
}

And run the following command:

yarn generate

The above command will generate the data types inside the node_modules/ folder (in the kysely-codegen package), based on the schema of the database pointed to by the DATABASE_URL environment variable.
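
To give an idea of the result, the generated file should contain something along these lines (an illustrative sketch; the exact kysely-codegen output may differ slightly):

// Illustrative sketch of the generated types; real output may differ slightly
import type { ColumnType } from "kysely";

// Columns with a database default (like our uuid primary key) become optional on insert
export type Generated<T> = T extends ColumnType<infer S, infer I, infer U>
  ? ColumnType<S, I | undefined, U>
  : ColumnType<T, T | undefined, T>;

export interface Dogs {
  id: Generated<string>;
  name: string;
  isGoodBoy: boolean;
  breed: string;
}

export interface DB {
  dogs: Dogs;
}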

Now let's create the src/ folder and, inside it, a db/ folder, in which we define our database connector:

// @/src/db/index.ts
import { Kysely, PostgresDialect } from "kysely";
import { DB } from "kysely-codegen";
import { Pool } from "pg";

import { env } from "../env";

export const db = new Kysely<DB>({
  dialect: new PostgresDialect({
    pool: new Pool({
      connectionString: env.DATABASE_URL,
    }),
  }),
});

In the code snippet above we imported the env variable, but it has not been created yet, which is exactly what we will do in the next step.

Build API

First, let's install the remaining dependencies:

yarn add fastify @fastify/cors envalid zod @trpc/server

Now let's set some API defaults by creating the env.ts file:

// @/src/env.ts
import { cleanEnv, str, num } from "envalid";

export const env = cleanEnv(process.env, {
  PORT: num({
    default: 3333,
  }),
  DATABASE_URL: str({
    default: "postgres://docker:docker@localhost:5432/whale?sslmode=disable",
  }),
});

Next, let's define the tRPC context, in which we'll return the request and response objects, as well as the database connector instance:

// @/src/context.ts
import { inferAsyncReturnType } from "@trpc/server";
import { CreateFastifyContextOptions } from "@trpc/server/adapters/fastify";

import { db } from "./db";

export const createContext = ({ req, res }: CreateFastifyContextOptions) => {
  return {
    req,
    res,
    db,
  };
};

export type Context = inferAsyncReturnType<typeof createContext>;

Now we can define the router and create the CRUD procedures:

// @/src/router.ts
import { initTRPC } from "@trpc/server";
import { z } from "zod";

import { Context } from "./context";

export const t = initTRPC.context<Context>().create();

export const appRouter = t.router({
  getDogs: t.procedure.query(async ({ ctx }) => {
    return await ctx.db.selectFrom("dogs").selectAll().execute();
  }),
  getDogById: t.procedure
    .input(
      z.object({
        id: z.string().uuid(),
      }),
    )
    .query(async ({ input, ctx }) => {
      return await ctx.db
        .selectFrom("dogs")
        .selectAll()
        .where("id", "=", input.id)
        .executeTakeFirstOrThrow();
    }),
  createDog: t.procedure
    .input(
      z.object({
        name: z.string(),
        breed: z.string(),
        isGoodBoy: z.boolean(),
      }),
    )
    .mutation(async ({ input, ctx }) => {
      return await ctx.db
        .insertInto("dogs")
        .values(input)
        .returningAll()
        .executeTakeFirstOrThrow();
    }),
  updateDog: t.procedure
    .input(
      z.object({
        id: z.string().uuid(),
        name: z.string(),
        breed: z.string(),
        isGoodBoy: z.boolean(),
      }),
    )
    .mutation(async ({ input, ctx }) => {
      return await ctx.db
        .insertInto("dogs")
        .values(input)
        .onConflict((oc) => oc.column("id").doUpdateSet(input))
        .returningAll()
        .executeTakeFirstOrThrow();
    }),
  removeDog: t.procedure
    .input(
      z.object({
        id: z.string().uuid(),
      }),
    )
    .mutation(async ({ input, ctx }) => {
      return await ctx.db
        .deleteFrom("dogs")
        .where("id", "=", input.id)
        .returningAll()
        .executeTakeFirstOrThrow();
    }),
});

export type AppRouter = typeof appRouter;

Last but not least, we have to create the entry file, where we are going to set up the HTTP server, among other things:

// @/src/main.ts
import fastify from "fastify";
import cors from "@fastify/cors";
import { fastifyTRPCPlugin } from "@trpc/server/adapters/fastify";

import { appRouter } from "./router";
import { createContext } from "./context";
import { env } from "./env";

(async () => {
  try {
    const server = await fastify({
      maxParamLength: 5000,
    });

    await server.register(cors, {
      origin: "http://localhost:5173",
    });

    await server.register(fastifyTRPCPlugin, {
      prefix: "/trpc",
      trpcOptions: {
        router: appRouter,
        createContext,
      },
    });

    await server.listen({
      port: env.PORT,
    });
  } catch (err) {
    console.error(err);
    process.exit(1);
  }
})();

If you are using a monorepo, yarn link, or another method of sharing the package, you can go to package.json and add the following key:

{
  "main": "src/router"
}

This way, when the tRPC client imports the router's type, the import resolves directly to the router file.
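
As a quick illustration, here is how a hypothetical frontend could then consume the API with @trpc/client (a minimal sketch; the "api" package name and the frontend setup are assumptions, not part of this project):

// Hypothetical client-side usage; "api" is assumed to resolve to this server package
import { createTRPCProxyClient, httpBatchLink } from "@trpc/client";
import type { AppRouter } from "api";

const client = createTRPCProxyClient<AppRouter>({
  links: [
    httpBatchLink({
      url: "http://localhost:3333/trpc",
    }),
  ],
});

// Inputs and outputs are fully typed, inferred from the AppRouter type
const dogs = await client.getDogs.query();
const newDog = await client.createDog.mutate({
  name: "Bobby",
  breed: "Labrador",
  isGoodBoy: true,
});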

Conclusion

I hope you found this article helpful, whether you're using the information in an existing project or just giving it a try for fun.

Please let me know if you notice any mistakes in the article by leaving a comment. And, if you'd like to see the source code for this article, you can find it in the GitHub repository linked below.

Github Repo

Top comments (15)

 
Clay Ferguson

If you take a very large project in JS and convert it to TS you'll discover that MOST of the bugs that come out of the woodwork are simple typos that can be caught by TypeScript at compile time. Most bugs are simple misspellings, wrong arguments passed, wrong type passed, wrong properties on an object, and dumb things like that.

Also when I code all day TS will catch like 10 typos per day, and everything else just always works. I have 30yrs exp so frankly I don't make many mistakes, but the ones I do make are like 99% of the sort that a typesafe language can catch at compile time...so I VERY RARELY even need to debug code or troubleshoot, because everything always just works the first time I run it.

Francisco Mendes

I understand.

I think it depends a lot on the approach. If we take into account the context of the article, the fact that we have inference and that what we have in the backend is reflected in the frontend, I feel more confident about the data types.

Whereas if the backend and frontend use different languages, I would look to other technologies like GraphQL to handle these issues. Otherwise, from my own experience, I would prefer it to be mandatory to have multiple JSON Schema definitions reused in the frontend and backend, to ensure that most of the problems that could reach production are caught during development.

Having static type safety is ideal for development, validation at runtime helps ensure that the frontend and backend teams are on the same page, and transpiling the code from TS to JS helps ensure that the code ships without any inconsistencies.

At the end of the day, everything is important and every approach is a way to fix problems, but depending on when they are used, they may or may not matter more. It's all a matter of perspective.

Thank you for sharing your perspective 💪

Clay Ferguson

The goal I've successfully achieved in my own project using TypeScript, is that if I open some arbitrary source file, pick some arbitrary thing to "break" by making a typo in a variable, parameter, class, property, argument list, etc., the compiler will catch it immediately 99.9% of the time.

There's no possible way someone can claim that waiting until that line of code happens to execute (whether during development or production) is anywhere near as good as the compile-time detection. I've had lots of people start that debate with me, however. :)

Francisco Mendes

Interesting. But how could you guard against a property of an HTTP response that is expected to be an array, empty or filled, suddenly coming back as null, without any prior communication from the backend team? How could I prevent this without executing that line of code?

I think that everything that is client state is easy to manage, but when it comes to something external, like data coming from an API we have no control over, it ends up being necessary to have a small "shield".

I believe that the simplest solution would be to always handle null values in every component, but even so, imagine a property being removed from a query without us knowing about it, and it ends up remaining in the code without any use.

As with everything in life, it depends on the use case.

Clay Ferguson

I've found the best practice for avoiding unexpected "types" (i.e. anything wrong with the format of results sent down to a browser) is to use a code generator. In my particular case the backend is Java, so I'm using "cz.habarta.typescript-generator", which generates TypeScript POJOs (from Java POJOs) that I can use on the client side (browser), and since I compile my front-end and back-end into the same deployable spring-boot-based Docker fat-JAR, there is, once again, NEVER any risk that I have any typing issues, even as it pertains to the type of problem you mentioned.

For systems where the back and front end can't be compiled at once, the best practice is a version number approach, where the front end can detect a mismatch in type definitions at runtime by noticing that the version on the object being sent back is newer or older than what the client is expecting, and throw an error that says this explicitly rather than trying to contend with the chaos of just some property being null or missing.

Francisco Mendes

Now that I've read the messages more carefully, I think we have similar ideas. Again, thanks for sharing your knowledge, this has been a nice thread.

Can you elaborate further on this point?: "version number approach, where the front end can detect a mismatch in type definitions at runtime", I'm curious to know more.

Clay Ferguson

By versions I just mean that if you're consuming some public REST service, there are two ways to be sure that changes in the "API" are matched up between server and client. One way is to put a version directly into the endpoint URL, like "weather.com/v1api/temperatures", which might switch to "v2api" some day. The other way is just to send back a version in the response, which can be anywhere really, and not necessarily in the JSON. For large corporations where multiple different teams are consuming APIs and having them coordinate perfectly is impossible, this versioning can help.

Francisco Mendes

In the procedures, we define the input parameters as zod schemas, which are validated at runtime. The return values of the procedures are not validated at runtime by default, but this can be done using the .output() method, in which we define the return schema.

This way zod infers data types and even validates them at runtime.

Here is an example in the documentation:
trpc.io/docs/output-validation
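
For reference, a minimal sketch of what that could look like on the getDogs procedure from the article (the output schema shape is assumed from the dogs table):

// Sketch: getDogs with runtime output validation added via .output()
// (the schema shape is an assumption based on the dogs table)
export const appRouter = t.router({
  getDogs: t.procedure
    .output(
      z.array(
        z.object({
          id: z.string().uuid(),
          name: z.string(),
          breed: z.string(),
          isGoodBoy: z.boolean(),
        }),
      ),
    )
    .query(async ({ ctx }) => {
      return await ctx.db.selectFrom("dogs").selectAll().execute();
    }),
  // ...remaining procedures unchanged
});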

Clay Ferguson

Compile-time type safety is far superior to runtime type safety, because you DO want to find your bugs WHEN you compile rather than letting users find your bugs when stuff fails for them at runtime in a production environment.

 
Comment deleted
 
Francisco Mendes

I'm glad to know 🙏

Shoubhit Dash

Amazing article, very informative!

I really like the kysely + atlas cli combo with the codegen for the types. Very solid alternative to prisma.

Francisco Mendes

Thanks for the feedback! 🙏 I share the same opinion, I have been using this combo for some time now and I love it. That's why I decided to document it, so that others can try it.

 
Francisco Mendes

I am glad to help 😁

Lars Rye Jeppesen

I didn't know Rome - thanks for that tip - looks awesome honestly