DEV Community



From Localhost to the Cloud: Deploying my First Node.js App with Docker

As a developer, one of the most satisfying moments is finally getting your web app live on the internet for the world to see! However, turning your locally running code into an accessible web app can be tricky sometimes. I learned this the hard way when I tried to deploy my first Node.js app.

After weeks of late nights and endless debugging, I had a web app that ran flawlessly on my own machine; for once I felt like a genius 😁. But the happiness was short-lived. When it came time to launch it on a cloud server, things started breaking left and right! After banging my head against the desk troubleshooting deployment issues, I knew there had to be a better way. That's when I discovered Docker, and it ended up being the magical solution I needed to easily deploy my Node app, and many more after it!

In this post, I'll walk through how taking the time to Dockerize my application gave me the keys to rapidly deploying it to the cloud with minimal fuss. I hope my experience will convince you to embrace Docker for your next Node.js project! Let's get started on this journey from localhost to the cloud!

My Application Architecture

First, let me tell you a bit about the simple web application I had built. It was called CatsGram, and it allowed users to post pictures of their cats and leave comments for other fuzzy felines. The app had:

  • A frontend written in React that let users upload cat photos + comments
  • A backend REST API written in Node.js and Express that handled the data and storage
  • A MongoDB database to store the cat profiles and comments

Here is what the React frontend looked like:

// Frontend in React

import React from 'react';

const App = () => {
  const handleSubmit = (e) => {
    e.preventDefault();
    // Call API to submit form data
  };

  return (
    <form onSubmit={handleSubmit}>
      <input type="file" />
      <button type="submit">Add Photo</button>
    </form>
  );
};

export default App;

And here is part of the Express backend that handled the API calls from the frontend:

// Backend API in Express

const express = require('express');
const mongoose = require('mongoose');
const cors = require('cors');

const app = express();
app.use(cors());
app.use(express.json());

// Connect to MongoDB
mongoose.connect('mongodb://localhost/catsgram', { useNewUrlParser: true });

// Cat profile model
const Cat = mongoose.model('Cat', new mongoose.Schema({
  name: String,
  picUrl: String
}));

// Create new cat profile
app.post('/cats', async (req, res) => {
  const newCat = await Cat.create(req.body);
  res.json(newCat);
});


The app worked flawlessly on my local machine, but deploying it to an online server was another story...

The Deployment Headache

I decided to deploy my app to a popular cloud provider and rented a Linux server. After SSH-ing in, I hit my first roadblock - the server was running an older version of Node than the one I used for development. My app promptly crashed with errors about missing modules!

After fumbling with NVM to try and install the right Node version, I finally got the backend API running. But then the React frontend failed to build due to mismatching webpack versions with the create-react-app starter I used.

Each error I fixed seemed to unveil yet another environmental issue between my local machine and the server. Path issues, missing dependencies, environment variables - you name it!

I was tearing my hair out trying to get things working. I finally conceded defeat and turned to my savior... Docker!

Docker to the Rescue!

Docker is a tool that allows you to package applications into standardized units called containers. These containers bundle up the code, dependencies, system libraries, and settings into an isolated executable package.

The key benefit is that this container will run the same way regardless of the underlying environment. No more worrying about compatibility issues across different machines!

Some other awesome benefits of Docker:

  • Cross-platform portability - Ship your containers to any Linux or Windows machine, any cloud provider, etc
  • Environment consistency - Containers include everything needed to run the app
  • Isolation - Apps run in isolated environments without conflicting with other apps
  • Speed - Containers start instantly compared to virtual machines

Docker seemed like the perfect solution to my deployment woes. By Dockerizing my app, I could neatly package it up with all its needed dependencies and specs into a standardized container. This container could seamlessly run on my local machine for development, then be deployed to the cloud server without any environment mismatches!

Let's look at how I Dockerized the CatsGram app.

Dockerizing the Backend API

The first step was containerizing my Express backend API. Docker uses special Dockerfile configuration files to build container images. Here is the Dockerfile for my backend:

# Dockerfile

FROM node:16-alpine
WORKDIR /app
COPY package*.json .
RUN npm install
COPY . .
CMD ["node", "server.js"]

This does the following:

  • Starts with a Node.js base image
  • Sets the working directory to /app
  • Copies the backend code into the image
  • Installs dependencies with npm install
  • Specifies the command to run the app - node server.js
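One addition worth pairing with this Dockerfile (it wasn't in my original setup, so treat it as a suggestion): a .dockerignore file, so that COPY . . doesn't drag the host's node_modules or other local clutter into the image:

```
# .dockerignore - keep host-only files out of the build context
node_modules
npm-debug.log
.git
.env
```

Without it, the host's node_modules (possibly built for a different OS) would be copied over the freshly installed one inside the image.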

With this Dockerfile, I could build a container image for my backend:

$ docker build -t catsgram-api .

This built an image tagged catsgram-api based on my Dockerfile. I could then run a container from that image:

$ docker run -p 4000:3000 catsgram-api

This started a container with host port 4000 mapped to the container's internal port 3000, making the API accessible externally. My backend API was now running in an isolated Docker container!

Containerizing the Frontend

For my React frontend, I used a multi-stage Docker build:

# Stage 1 - Build
FROM node:16 AS build
WORKDIR /app
COPY package*.json .
RUN npm install
COPY . .
RUN npm run build

# Stage 2 - Run
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html

This first installs Node to build the React app, then copies the built artifacts to an Nginx image for the runtime. This gave me a lean production image!

Again, I could run docker build on this and start a container to serve my frontend on port 3000.

Defining Services with Docker Compose

At this point, I had two containers - one for the backend API and one for the frontend. To link them together, I used Docker Compose to define the app services:

# docker-compose.yml

version: "3"
services:
  backend:
    build: ./backend
    ports:
      - "4000:3000"
  frontend:
    build: ./frontend
    ports:
      - "3000:80"

Running docker-compose up would now start both containers and wire them together!
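One detail worth knowing: Compose puts both services on a shared network where containers can reach each other by service name (e.g. http://backend:3000 for server-to-server calls), while requests from the user's browser still go through the host-mapped ports. Here's a minimal sketch of making the frontend's API base URL configurable; the REACT_APP_API_URL variable name and the apiUrl helper are my assumptions, not part of the original app:

```javascript
// Hedged sketch: resolve the backend base URL from an environment
// variable, falling back to the host-mapped port for local development.
// REACT_APP_API_URL and apiUrl are assumed names, not from the original app.
const API_BASE = process.env.REACT_APP_API_URL || 'http://localhost:4000';

// Build a full API URL for a given path
function apiUrl(path) {
  return `${API_BASE}${path.startsWith('/') ? path : '/' + path}`;
}

module.exports = { apiUrl };
```

With something like this in place, the same code works on localhost and inside Compose; you set the environment variable per environment instead of hard-coding addresses.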

Deploying to the Cloud

With Docker, deploying these containers to the cloud was a breeze! I pushed my images up to a registry:

$ docker push catsgram-api
$ docker push catsgram-frontend

Then on the server I just had to run:

$ docker pull catsgram-api
$ docker pull catsgram-frontend
$ docker-compose up -d

The containers started up just as they did locally and my app was live on the internet! πŸŽ‰

Docker is Deployment Magic

No more fussing with dependencies, runtimes, builds, etc across different environments. Docker let me develop my app locally as I normally would, then package everything needed up into portable containers ready for deployment anywhere.

Some of the key benefits I saw:

  • Consistent environments - Containers included the exact dependencies and Node runtime needed
  • Cross-platform - I could develop on macOS but deploy the same containers to Linux servers
  • Lightweight - Containers are much more efficient than VMs
  • Modular - Services like frontend and backend were compartmentalized into separate containers

Docker really is a game-changer when it comes to deploying applications. I can now develop apps faster without worrying about environment differences between my machine and servers.

If you're struggling to deploy Node apps, I highly recommend exploring Docker! It will save you those late night "works on my machine" debugging sessions when you'd rather be sleeping.

Let me know if you have any questions! I'm happy to chat more about my experience Dockerizing my first Node app. Wishing you happy coding and smooth deploying!

Top comments (13)

Anneta Wamono

I really liked the way you broke down the process of the front-end, backend and linking them together. I'm pretty new to docker so I didn't quite understand what is docker compose compared to just docker?

IkemHood

Docker Compose is a Docker tool that lets you manage and deploy more than one container at once, which is handy when they depend on each other.

fool-cats • Edited

It's a bit like package.json in npm: a declarative way to use Docker. With Docker Compose, we don't need to run the commands one by one.

Jake Lundberg

I too love Docker, especially for development environments! No more fussing trying to get all our team members setup on different computers. Just spin up a container, and magically they're working in the same environment!

Nice article! Keep up the great work!

wakywayne

How does this work with IPs/domains? For example, are the frontend and backend at different HTTP addresses? And how do you link custom domains to the frontend and backend separately?

IkemHood • Edited

Great question! Here's how Docker containers work with IPs, domains, and linking frontends and backends:

By default, Docker containers get their own virtual IP address assigned by Docker's networking. So your frontend and backend containers would each have a different internal IP address.

You don't necessarily need to expose these default IPs publicly. Instead, you can map custom ports on the host machine to the internal container ports.

For example:

# Frontend container
docker run -p 8080:80 frontend-image

# Backend container 
docker run -p 3000:3000 backend-image

This would expose the frontend on the host's port 8080 and backend on port 3000.

To link them, the frontend would just need to make requests to host-ip:3000 to hit the backend API.

Now to use custom domains, you'd point the domains to the host IP, and route traffic to the mapped ports.

For example:

  • Map the frontend domain to the host IP
  • Route traffic from that domain to host port 8080

  • Map the backend domain to the host IP
  • Route traffic from that domain to host port 3000

This way your domains are abstracted from the internal Docker ports/IPs.

You can also use a reverse proxy like Nginx to handle routing requests from custom domains to your Docker containers.
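As a rough sketch of that Nginx approach (the domain names and ports here are placeholders matching the example mappings above, not a tested config):

```nginx
# Placeholder domain routed to the frontend container on host port 8080
server {
    listen 80;
    server_name cats.example.com;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

# Placeholder API domain routed to the backend container on host port 3000
server {
    listen 80;
    server_name api.cats.example.com;

    location / {
        proxy_pass http://127.0.0.1:3000;
    }
}
```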

Hope this helps explain how to handle networking and domains with Docker! Let me know if you have any other questions.

wakywayne

Excellent answer to my question really appreciate it.

hari124

How will you host your app?

IkemHood

Most server providers offer Docker as an option.

When hosting on your own VPS, you'd install Docker just like you would on your own machine.

Zach Sosana

Which server host did you deploy your docker image to?

Akalonu Chukwuduzie Blaise

This is really cool. I have faced the same deployment issues since I first tried to deploy my Express application. Are there recommended resources one can use to learn Docker? Thanks!

IkemHood

Yes, the one from Net Ninja on YouTube.