Adnan Rahić

Originally published at hackernoon.com

A crash course on Serverless with Node.js

Regardless of your developer background, it’s inevitable you’ve heard the term Serverless in the past year. The word has been buzzing around in my ears for longer than I dare say. For too long have I been putting off looking into it in more detail. Well here goes nothing.

Ready? Today we’ll go through the awesomeness and pain points of using Serverless. We’ll also define the main keywords and topics that are crucial in getting started with the technology. Lastly, we’ll go ahead and jump right into some code, write our own Serverless functions, emulate the environment locally and monitor performance! Keep in mind, you will need an AWS account to follow along with the code examples. Luckily, they have incredible free tiers so you don’t need to worry about breaking the bank when playing around with new things.

You were saying?

How come going Serverless is so cool all of a sudden? Is it not good to use servers anymore? I love servers, why not use them? Servers are awesome. You use the command line to tell them what to do. Why would anybody want to give that up? I was genuinely flabbergasted. But hey, taking a step back, I realize they’re not optimal. They’re a pain to manage in clusters. They don’t scale gracefully. And these are only the first things that come to mind.

Let’s switch our mindset completely. Think about only using functions. No more managing servers. You only care about the code. Sounds rather cool. We as developers shouldn’t need to do the tedious work on the command line. Let the ops guys handle that. What do we even call this type of architecture? Only using functions? Small functions? Tiny services?

Functions as a Service (FaaS)

It’s called Functions as a Service, and it’s amazing. The concept is based on Serverless computing. It gives us the ability to deploy any individual piece of code, or function. The code runs and returns a value, in turn ending the process. Sounds simple, right? Well, it is. If you’ve ever written a REST API you’ll feel right at home. All the services and endpoints you would usually keep in one place are now sliced up into a bunch of tiny snippets, microservices. The goal is to completely abstract servers away from the developer and only bill based on the number of times the functions have been invoked. Meaning services such as these are easy to scale.

But, all is not so sunny on this side of the fence. FaaS has been going through some teething issues. How do you think errors are handled? Not having a physical server to monitor is a bit of a mind-bending experience. Getting insight into your system is reasonably hard, especially on a larger scale.

Wading into shallow waters

To get an understanding of how to write Serverless applications we first need to touch on what lies behind it all: the tools and services at our disposal that make it all possible.

AWS Lambda

AWS Lambda is a compute service that lets you run code without provisioning or managing servers. – AWS Documentation

Lambda is an event-based system for running code in the cloud. You don’t worry about servers, only the code you write. It scales automatically and only charges you for the time it actually spends running your code, the compute time. But, most importantly, it scales automatically! How awesome is that? No more worrying about whether the EC2 instance you spun up is large enough to serve all your users.

AWS API Gateway

Lambda would be incomplete without the API Gateway. Every Lambda function requires an event to trigger it. API Gateway provides the REST endpoints which trigger the functions. Imagine you have the average Express app. You would usually create an app.get() method for a particular route, like this:

app.get('/', function(req, res, next) { /* execute some code */ });

When a user hits the '/' route, an event triggers the callback function. Gateway is the route, Lambda is the callback function.
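
To make the analogy concrete, here’s a minimal sketch of what that same callback looks like when written as a Lambda handler. Treat it as an illustration of the mapping rather than finished code; we’ll generate and wire up a function just like this a bit further down.

// A sketch of the Lambda equivalent of the Express callback above.
// API Gateway plays the role of the route, this handler plays the role
// of the callback. The event holds the incoming request data, and the
// callback sends the response back.
module.exports.hello = (event, context, callback) => {
  const response = { statusCode: 200, body: 'Hello from a function!' };
  callback(null, response);
};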

The Serverless framework

Managing all of this is a pain. The first time I tried it out, it took me the better part of a day to figure out what does what. With good reason. The docs are too complex, and not beginner-friendly at all. Serverless to the rescue!

Serverless is your toolkit for deploying and operating serverless architectures. Focus on your application, not your infrastructure. – Serverless.com

The Serverless framework bundles all the tools you need into a manageable package, making it simple and straightforward to create and deploy serverless applications. It’s so awesome, it abstracts away all the tedious tasks you would otherwise do in the AWS Console, such as creating functions and connecting them to events. The only downside is that you have to push code to AWS every time you wish to test your functions, while emulating the environment locally is a bit of a pain.

The use cases where Serverless is the better choice are vast. Because of the easy scaling and low maintenance, any application you have in production where your user throughput varies rapidly is a valid contender for serverless architecture. Lastly, if you suck at the Linux shell, and if DevOps is not your thing, you have every reason to try Serverless.

A new mindset

Serverless architecture is unforgiving. That’s a fact. Only setting it up takes a fair share of mental power. I’m not counting emulating it locally. That’s a whole other beast altogether.

The hostility requires us to change our approach. We have to live with the fact that we do not have an overview of our whole system. But, humans adapt and overcome. In comes the Serverless framework like a knight in shining armor.

Let’s jump in and create a simple Serverless function.

Setting up Serverless is simple. You need to install it through npm and hook up your AWS account. Don’t worry if you get intimidated by the AWS Console, it’s perfectly fine. I’ll break down the process and we’ll go through everything step by step.

1. First of all you need to install Serverless globally.

Fire up a terminal window and run:

$ npm install -g serverless

You’ve now installed the Serverless framework globally on your machine. The Serverless commands are available to you from anywhere in the terminal.

Note: If you’re using Linux, you may need to run the command as sudo.

2. Create an IAM User in your AWS Console

Open up your AWS Console and press the services dropdown in the top left corner. You’ll see a ton of services show up. Go ahead and type IAM in the search box and click on it.

You’ll be redirected to the main IAM page for your account. Proceed to add a new user.

Pick a funky name for your new IAM user and give the user programmatic access. Proceed to the next step.

Now you can add a set of permissions to the user. Because we are going to let Serverless create and delete various assets on our AWS account, go ahead and check AdministratorAccess.

Proceeding to the next step you will see the user was created. Now, and only now, will you have access to the user’s Access Key ID and Secret Access Key. Make sure to write them down or download the .csv file. Keep them safe, don’t ever show them to anybody. I’ve pixelized them even though this is a demo, to make sure you understand the severity of keeping them safe.

With that done we can finally move on to entering the keys into the Serverless configuration.

3. Enter IAM keys in the Serverless configuration

Awesome! With the keys saved you can set up Serverless to access your AWS account. Switch back to your terminal and type all of this in one line:

$ serverless config credentials --provider aws --key xxxxxxxxxxxxxx --secret xxxxxxxxxxxxxx

Hit enter! Now your Serverless installation knows what account to connect to when you run any terminal command. Let’s jump in and see it in action.

4. Create your first service

Create a new directory to house your Serverless application services. Fire up a terminal in there. Now you’re ready to create a new service. What’s a service you ask? View it like a project. But not really. It's where you define AWS Lambda Functions, the events that trigger them and any AWS infrastructure resources they require, all in a file called serverless.yml.

Back in your terminal type:

$ serverless create --template aws-nodejs --path my-service

The create command will create a new service. Shocker! But here’s the fun part. We need to pick a runtime for the function. This is called the template. Passing in aws-nodejs will set the runtime to Node.js. Just what we want. The path will create a folder for the service, in this example naming it my-service.

5. Explore the service directory with a code editor

Open up the my-service folder with your favorite code editor. There should be three files in there. The serverless.yml contains all the configuration settings for this service. Here you specify both general configuration settings and per-function settings. Your serverless.yml looks like this, only with a load of comments.

# serverless.yml 
service: my-service

provider:   
  name: aws   
  runtime: nodejs6.10

functions:
  hello:
    handler: handler.hello

The functions property lists all the functions in the service. You can see hello is the only function at the moment, and it lives in the handler.js file. The handler property points to the file and module containing the code you want to run in your function. By default this handler file is named handler.js. Very convenient indeed.

Opening up the handler.js you’ll see the handler module and a function named hello. The function takes three parameters. The event parameter represents the event data passed to the function. The context tells us about the context of the function: its running time, state and other important info. The last parameter is a callback function which will send data back. In this example the response is sent back as the second parameter of the callback function. The first always represents an error; if there is no error, null is passed along.

// handler.js
module.exports.hello = (event, context, callback) => {
  // build the response object to send back
  const response = { statusCode: 200, body: 'Go Serverless!' };
  // first argument is the error (null means success), second is the result
  callback(null, response);
};
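
A quick aside, and this is my own addition rather than part of the generated template: if you want to send JSON back through the endpoint instead of plain text, the body still has to be a string, so you stringify the payload yourself. A minimal sketch:

// handler.js: a sketch of returning JSON instead of plain text
module.exports.hello = (event, context, callback) => {
  const response = {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    // the body must be a string, so stringify the JSON payload
    body: JSON.stringify({ message: 'Go Serverless!' })
  };
  callback(null, response);
};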

This is all great, but we still can’t trigger the function. There is no event connected to it, hence no way to trigger it. Let’s fix this. Jump back to the serverless.yml and uncomment the lines where you see events.

# serverless.yml 
service: my-service

provider:   
  name: aws   
  runtime: nodejs6.10

functions:
  hello:
    handler: handler.hello
    events: # uncomment these lines
      - http:
          path: hello/get
          method: get

Watch out that you don’t mess up the indentation of the file; events should be directly beneath handler. The http event tells Serverless to hook the function up to an API Gateway endpoint: path is the URL route (hello/get) and method is the HTTP verb that triggers it. Great, with that done we can finally deploy the function to AWS.

6. Deploying to AWS

The deployment process is very straightforward. Within the service directory run this command in your terminal:

$ serverless deploy -v

You’ll see the terminal light up with a ton of messages. That’s the -v doing its magic. Gotta love those verbose logs!

But, most importantly for us, it will log back the endpoint. Serverless has automagically created an API Gateway endpoint and connected it to the Lambda function. How awesome is that!? Hitting the endpoint in the browser will send back the text Go Serverless!

Note: If you want to test the function through the command line you can run:

$ serverless invoke -f hello -l

This will return the full response object as well as info regarding the state of the Lambda function, such as duration and memory usage.

Relieving the pain

It sucks that I have to deploy the function to AWS every time I want to test it out. Wouldn’t it be awesome if there was a way to emulate the environment locally?

With that awkward digression, voilà, Serverless Offline! Now I can finally test all the code locally before pushing it to AWS. That relieves a lot of stress on my back.

It’s surprisingly easy to add Serverless Offline to your services. Installing one npm module and adding two lines to the serverless.yml is all you need.

No better way to prove it to you than to show you.

1. Initialize npm in the service directory

Now you need to step inside the my-service directory and open up a terminal window in there. Once inside you can run:

$ npm init

2. Install Serverless Offline

With npm initialized, there’s nothing more to do than run the installation.

$ npm install serverless-offline --save-dev

The --save-dev flag will save the package as a development dependency.

Before moving on, you first need to let the terminal know it has a new command available. So within the serverless.yml file add two new lines.

# serverless.yml 
service: my-service

provider:   
  name: aws   
  runtime: nodejs6.10

functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: hello/get
          method: get

# adding these two lines
plugins:
  - serverless-offline

3. Run it locally

To make sure you’ve installed everything correctly run:

$ serverless

You should see an option named offline among the various choices listed. If you do, you’re good to go.

Note: If you want to see more helpful information about Serverless Offline, run serverless offline --help in your terminal window.

With all that out of the way, go ahead and spin up the local emulation of Lambda and API Gateway.

$ serverless offline start

You’ll see all your routes listed in the terminal. Your Lambdas are now running on your localhost. The default port is 3000. Feel free to open up a browser and check it out. Hitting the endpoint http://localhost:3000/hello/get will send back the same text as in the example above with the deployed function.
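
If you’d rather test it from code than from the browser, a quick sketch using Node’s built-in http module works too. Call the file whatever you like; test-local.js below is just a placeholder name, and it assumes serverless-offline is running on its default port 3000:

// test-local.js: a quick sketch for hitting the locally emulated endpoint
const http = require('http');

http.get('http://localhost:3000/hello/get', (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => {
    console.log(res.statusCode); // should be 200
    console.log(body);           // the body returned by the hello function
  });
}).on('error', (err) => console.error(err));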

How awesome is this? Now we don’t need to constantly push code to AWS to see if it’s working. We can test it locally and only push when we’re sure it works.

Watching my back

In traditional applications, when something breaks, you know about it. You also know where it broke. Got to love those stack traces! Regardless, the monitoring process of such apps is pretty straightforward. How does this relate to using Serverless? The logs on AWS CloudWatch are horrible. It took me an eternity to find failing functions for simple applications; imagine the horror with large-scale applications.

What I found as a great alternative is Dashbird. It’s free and seems promising. They’re not asking for a credit card either, making it a “why not try it out” situation.

It takes five minutes to get up and running with the service, mainly because of the great getting-started tutorial they have.

Hooking Dashbird up with Serverless finally lets me see what’s going on in my app. Pretty cool to have someone watching your back.

Errors are highlighted, and I can see the overall health of my system. What a relief. It also tracks the cost. Don’t worry about blowing the budget. Even real-time monitoring is included. Now that’s just cool.

Tools like this make it a walk in the park to manage large scale applications.

Wrapping up

What a journey. You have now witnessed the transition from traditional web development into the serverless revolution. With these simple tools we now have everything we need to create awesome, scalable, and reliable applications.

The only thing left holding us back is our own mindset. Realizing that functions are not equal to servers will be the turning point. But, we are going in the right direction. Tools like Serverless and Dashbird ease the painful transition incredibly well. They have helped me a great deal on my path down the great unknown of serverless architecture.

I urge you to continue playing with these tools. Try to include them into your existing development process. You’ll feel relieved with how much support you suddenly have. It does wonders for the nerves as well.

If you want to take a look at all the code we wrote above, here’s the repository. Or if you want to read my latest articles, head over here.


Hope you guys and girls enjoyed reading this as much as I enjoyed writing it.

Do you think this tutorial will be of help to someone? Do not hesitate to share. If you liked it, smash the heart below so other people will see this here on Dev.to.


Top comments (4)

Dan Dascalescu

Nice article! A couple questions:

  1. Can you edit step 5 to explain what - http:\n path: hello/get means in the .yml?
  2. you first need to let the terminal know it has a new command available - which terminal?
  3. Could you add some notes about Lambda drawbacks? The article sounds so rosy, but for example, a 5-second startup time can be unacceptable.
Adrian B.G.

Very nice! We, the new AWS users, need all the help we can get; it looks very complex at first glance.

Only 2 remarks for the readers:

  1. serverless == a centralized set of servers that we don't control
  2. Google released their own Lambda with nodejs, called Functions

Adnan Rahić

I'm glad you liked it. Yes, all three major cloud providers have their version of FaaS architecture. AWS has Lambda, Azure has Azure functions, and Google Cloud has Functions. The cool thing is that the Serverless framework supports all three! And even more cool, every deployed function runs inside a container. So every function gets its own environment and runtime, isolated within this container. This is why the Lambda functions respond faster after an initial couple of requests. Because the container is created only for the first invocation, and then re-used for the following requests. This just blows my mind. It must have been very hard to create.

Guddeti Ajay Manikanta ⭐⭐⭐

Great Article.