
Adnan Rahić

Originally published at hackernoon.com

Migrating your Node.js REST API to Serverless

I’ve dabbled a fair share in the dark arts of Serverless, digging into the various pros and cons of not having dedicated servers, or instances you can call your own. Not that they ever technically were your own; they’re just in some undisclosed server farm somewhere, floating in the cloud.

In many use cases it makes sense to let the cloud provider handle the server management, scaling and uptime. You’re a developer; why should you get your hands dirty with the horror of the command line? Ew, the terminal! How do you exit Vim again? *shivers*

Learning new things is not easy, believe me. I’m not in any way an above-average developer, and learning is hard even when you’re a dev who’s used to picking up new things. Shifting your mindset to Serverless Architecture is no small feat. Here’s my take on starting slow: I’ll show you how to take the code you’re already used to and apply it in a Serverless environment.

If you have an app in production, you can cut costs drastically. With the auto-scaling properties of Serverless Architecture, you can rest assured it will always serve every user hitting your API. So if you ever make it big and get featured on TechCrunch, the influx of users won’t break your servers and leave your users hanging. Pun intended.

From server to Serverless

The goal is to take an existing Express API and edit it slightly so it can be deployed to AWS through the Serverless framework. I’ll assume you already have an AWS account and a working installation of the Serverless framework set up on your machine. If not, please check this out and follow the steps to get the Serverless framework installed. Otherwise, if you prefer screencasts, here’s a course where I explain it through video.

Let’s set up an old-school server

I’ve taken the liberty of creating a small repo with an Express REST API. It’s from one of my previous articles, which you may have read. My point in taking an existing Express API is to show how easy it is to migrate it to Serverless.

First, let’s clone the repo to our machine. We’re grabbing the dev branch where I’ve set up all the necessary modules and configurations.

$ git clone -b dev https://github.com/adnanrahic/nodejs-restful-api.git

This will clone the repo into a directory named nodejs-restful-api. Open it up in a code editor of choice. We have some work to do.

First things first: installing the node modules.

$ npm install

Running npm install will install all modules from the package.json file. That shouldn’t take longer than a few seconds.

Once that’s done, we need to configure the database connection. We keep it in the db.js file. Opening it up, you can see Mongoose connecting to a database connection URL that we keep in an environment variable.

// db.js

var mongoose = require('mongoose');
mongoose.connect(process.env.DB, { useMongoClient: true });

We set this environment variable in an .env-style file. A sample file is provided, named sample.variables.env. Let’s open it up and rename it to variables.env.

// variables.env

DB=mongodb://localhost:27017/test

The default connection points to a local instance of MongoDB. You can use any connection URL you want; MongoDB Atlas or mLab both work fine.

Note: If you want to follow along with the coding in this tutorial, please create a MongoDB Atlas database cluster. It will be used once we deploy the application to AWS. You can follow the tutorial here to learn how to create an Atlas cluster, or this tutorial to create an mLab instance.

What’s left to do is just run the server. Jump back to the terminal.

$ node server.js

If you added a valid database connection URL, it should log Express server listening on port 3000 back to the command line.

Using Insomnia, I’ll just quickly add a new user to the database.

Don’t forget to pick “Form URL Encoded” as the content type. Once the user has been created, change the method to GET and remove the request body to check whether the user was added correctly.

Seems right. John is alive and well.
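If you’d rather test from the terminal than from Insomnia, the same two requests look roughly like this; curl sends -d data as form URL encoded by default, and I’m assuming a /users route that accepts a name field, so adjust the route and fields to whatever the repo actually defines.

$ curl -d "name=John" http://localhost:3000/users
$ curl http://localhost:3000/users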

Using this traditional approach, with a server and a running Express API, is great for various use cases. But you have to pay for it even when you don’t have any real user throughput. What’s worse, if you suddenly got a large influx of users, you’d have to scale it manually. That’s no fun. Serverless does that for you, automagically!

Migrate to Serverless

Guess what, you can use the code above and deploy it to AWS using the Serverless framework with just a couple of minor changes. Actually, you’re just replacing a couple of lines in the server.js file and installing one more module. Lastly, you add a Serverless configuration file named serverless.yml. That’s it!

// server.js

// before

require('dotenv').config({ path: './variables.env' });
var app = require('./app');

var port = process.env.PORT || 3000;
var server = app.listen(port, function() {
  console.log('Express server listening on port ' + port);
});

// after

require('dotenv').config({ path: './variables.env' });
var app = require('./app');

var serverless = require('serverless-http');
module.exports.handler = serverless(app);

We’re replacing the server with the serverless-http module. We hand this module the whole Express app object and export it as a handler. We’ll configure that handler in the serverless.yml file. But first, install the module.

$ npm install --save serverless-http

There we go. Create a new serverless.yml file in the root of the project directory and paste in the configuration. YAML is whitespace-sensitive, so it’s very important to keep the indentation correct.
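Here’s a minimal sketch of what the file could look like for this setup. Treat the service name, region and runtime as placeholders and adjust them to your own account; the parts that matter are the handler pointing at what we exported from server.js, the catch-all HTTP events, and the plugins section.

# serverless.yml

service: express-sls-app         # placeholder name, call it whatever you like

provider:
  name: aws
  runtime: nodejs8.10            # pick a Node.js runtime your AWS account currently supports
  stage: dev
  region: us-east-1              # placeholder region

functions:
  app:
    handler: server.handler      # the handler we exported from server.js
    events:
      - http:
          path: /
          method: ANY
          cors: true
      - http:
          path: /{proxy+}        # catch-all so every Express route hits this one function
          method: ANY
          cors: true

plugins:
  - serverless-offline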

What’s happening here is that you’re hooking the handler function from the server.js file to the / endpoint. On AWS, this means the whole app object will be deployed as a single Lambda function with one main API Gateway route. How cool is that!?

Test and deploy

You may have noticed the plugins section in the serverless.yml file. It lists one plugin, serverless-offline, which we need in order to run a local emulation of Lambda and API Gateway.

$ npm install --save-dev serverless-offline

There we have it. Now just spin up the emulation.

$ sls offline start --skipCacheInvalidation

Test out the same endpoints as we did above and you should see they work exactly the same. Now comes the fun part. Deploying all of this is a breeze. One command and that’s it.

$ sls deploy

The deploy command will return an endpoint URL. This is your deployed API’s root path.

Would you believe me if I told you that this is all that’s required? Well, it is. Feel free to try the endpoint out. It’ll behave just as the local instance did. What’s even cooler is that it’s all packaged into a single function. Let me show you.
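For example, assuming the same /users route as before, hitting the deployed API could look something like this, where the ID, region and stage are placeholders for whatever your own deploy printed out.

$ curl https://abc123xyz.execute-api.us-east-1.amazonaws.com/dev/users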

Do you even log bro?

What does it mean that it’s all just one Lambda function? Most importantly for us, we only have one cold start to worry about, which makes it a lot more manageable to keep the Lambda warm. Whatever the request method, it hits the same function. For a small project this is fine, but not that great for larger ones. Here’s the kicker, though: you can split this up on a microservice level, as sketched below. The /users route can have one dedicated Lambda while other features get their own. And all of this is doable with the same code and modules you’re already used to!
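To make that concrete, here’s a hypothetical functions section showing how /users could get its own dedicated Lambda while another feature gets another one. The users.js and orders.js handlers don’t exist in this repo; they stand in for separate serverless-http wrappers around smaller Express apps.

# serverless.yml (hypothetical per-feature split)

functions:
  users:
    handler: users.handler       # serverless-http wrapper around only the user routes
    events:
      - http:
          path: /users
          method: ANY
      - http:
          path: /users/{proxy+}
          method: ANY
  orders:
    handler: orders.handler      # another feature with its own function
    events:
      - http:
          path: /orders
          method: ANY
      - http:
          path: /orders/{proxy+}
          method: ANY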

Check this out. I’ve been using Dashbird for some time now to monitor my Lambdas and I could not be any happier. I’d never be able to see all of this through CloudWatch alone.

All of the requests are made to the same function even though the methods differ. Some of them are POSTs, others are GETs, but they all fire the same Lambda. I can’t be the only one here hyped about the fact that you can write all the code you’re already used to and deploy it to Lambda instead.

Wrapping up

Today we’ve seen that learning Serverless isn’t that big of a deal. It’s rather easy to migrate an existing app, so why wouldn’t you? If you don’t want to pay for your server around the clock and would rather pay only for what you use, it makes perfect sense. It’s literally almost free to run a small- to average-sized REST API with Serverless Architecture. That alone makes it viable, not to mention the autoscaling. Maybe it’s time to rethink the tech stack for your next project. I hope I’ve made you a believer.

If you want to take a look at all the code we wrote above, here’s the repository. Or if you want to read my latest articles, head over here.

If I’ve intrigued you to learn more about Serverless, feel free to take a peek at a course I authored on the subject.

Hope you guys and girls enjoyed reading this as much as I enjoyed writing it.

Do you think this tutorial will be of help to someone? Do not hesitate to share. If you liked it, smash the unicorn below so other people will see this here on DEV.to.



