
Docker For Frontend Developers

Akanksha Sharma ・5 min read

This is a short and simple guide to Docker, useful for frontend developers.

Why should you use Docker?

Long ago, when a business needed a new application, the ops team would go out and buy a server without knowing the performance requirements of the new app. This involved a lot of guesswork and wasted capital and resources that could have been used for other apps.

Enter virtual machines, or VMs. They allowed us to run multiple apps on the same server, but there is a drawback: every VM needs an entire OS to run. Every OS needs CPU, RAM, etc., plus patching and licensing, which in turn increases cost and reduces resiliency.

Google started using the container model long ago to address the shortcomings of the VM model. Essentially, the container model means that multiple containers on the same host share that host's OS, freeing up CPU and RAM to be used elsewhere.

But how does it help us developers?

It ensures that the working environment is the same for all developers and all servers, i.e. production, staging and testing.

Anyone can set up the project in seconds: no need to mess with config, install libraries or set up dependencies.

In simple terms, Docker is a platform that enables us to develop, deploy and run applications with containers.

Let’s take a step back: what does a container system look like physically, and how is it different from a VM?

1.1 Difference between VM and Docker

As you can see, the host and its resources are shared between containers, but not between virtual machines.

With that out of the way, let’s dive in.

How to use Docker?

For that we need to familiarise ourselves with certain terminology.

1.2 Visualisation of Docker images and Docker containers

Docker image: an executable package that contains a cut-down operating system and all the libraries and configuration needed to run the application. It is made of multiple layers stacked on top of each other and represented as a single object. A Docker image is created using a Dockerfile; we will get to that in a bit.

Docker container: a running instance of a Docker image. There can be many containers running from the same Docker image.
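For example, once we have built an image (which we do below), that one image can back any number of containers. A quick sketch using the standard Docker CLI; <image-name> is a placeholder, and web-1/web-2 are just example names:

    # Run two containers from the same image; each gets its own id
    docker container run -d --name web-1 <image-name>
    docker container run -d --name web-2 <image-name>

    # Both show up here, created from one image
    docker container ls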

Containerise a Simple Node.js App

We will containerise a very simple Node.js app and create an image:

Your Node.js App

Let’s start by creating a folder my-node-app:

mkdir my-node-app  
cd my-node-app

Let’s create a simple node server in index.js and add the following code there:

//Load express module with `require` directive
var express = require('express')
var app = express()

//Define request response in root URL (/)
app.get('/', function (req, res) {
  res.send('Hello World!')
})

//Launch listening server on port 8081
app.listen(8081, function () {
  console.log('app listening on port 8081!')
})

and save this file inside your my-node-app folder.

Now we create a package.json file and add the following code there:

    {
      "name": "helloworld",
      "version": "1.0.0",
      "description": "Dockerized node.js app",
      "main": "index.js",
      "author": "",
      "license": "ISC",
      "dependencies": {
        "express": "^4.16.4"
      }
    }

At this point you don’t need express or npm installed on your host, because, remember, the Dockerfile handles setting up all the dependencies, libraries and configuration.

Dockerfile

Let’s create a Dockerfile and save it inside our my-node-app folder. This file has no extension and is simply named Dockerfile. Let’s go ahead and add the following code to it.

    # Dockerfile  
    FROM node:8  
    WORKDIR /app  
    COPY package.json /app  
    RUN npm install  
    COPY . /app  
    EXPOSE 8081  
    CMD node index.js

Now, what are we doing here?

FROM node:8 — pulls the node.js Docker image from Docker Hub, which can be found here: https://hub.docker.com/_/node/

WORKDIR /app — this sets the working directory for our code in the image; it is used by all subsequent commands such as COPY, RUN and CMD.

COPY package.json /app — this copies our package.json from the host's my-node-app folder into the image's /app folder.

RUN npm install — we are running this command inside our image to install the dependencies (node_modules) for our app.

COPY . /app — we are telling Docker to copy our files from the my-node-app folder into /app in the Docker image (the optional .dockerignore sketch after this list keeps unwanted files out of that copy).

EXPOSE 8081 — we are exposing a port on the container with this command. Why this port? Because our server in index.js is listening on 8081. By default, containers are not reachable from outside, so requests to our app would be ignored; exposing (and later publishing) this port is what lets them through.
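One optional addition, not part of the original setup: since COPY . /app copies everything in the folder into the image, you can add a .dockerignore file next to the Dockerfile so local clutter such as node_modules never ends up in the build context. A minimal sketch:

    # .dockerignore (optional; lives next to the Dockerfile)
    node_modules
    npm-debug.log
    .git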

Build Docker Image

Showtime. Open a terminal, go to your folder my-node-app and type the following command:

    # Build an image: docker build -t <image-name> <relative-path-to-your-dockerfile>
    docker build -t hello-world .

This command creates a hello-world image on our host.

-t is used to give a name to our image, which is hello-world here.

. is the path to the build context; Docker looks for the Dockerfile there by default. Since we are already inside my-node-app, we use a dot.
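If your Dockerfile ever lives somewhere else or has a different name, you can point docker build at it explicitly with the -f flag. A sketch, equivalent to the command above:

    # Same build, with the Dockerfile path given explicitly
    docker build -t hello-world -f ./Dockerfile .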

You will see output on your command line, something like this:

    Sending build context to Docker daemon  4.096kB  
    Step 1/7 : FROM node:8  
     ---> 4f01e5319662  
    Step 2/7 : WORKDIR /app  
     ---> Using cache  
     ---> 5c173b2c7b76  
    Step 3/7 : COPY package.json /app  
     ---> Using cache  
     ---> ceb27a57f18e  
    Step 4/7 : RUN npm install  
     ---> Using cache  
     ---> c1baaf16812a  
    Step 5/7 : COPY . /app  
     ---> 4a770927e8e8  
    Step 6/7 : EXPOSE 8081  
     ---> Running in 2b3f11daff5e  
    Removing intermediate container 2b3f11daff5e  
     ---> 81a7ce14340a  
    Step 7/7 : CMD node index.js  
     ---> Running in 3791dd7f5149  
    Removing intermediate container 3791dd7f5149  
     ---> c80301fa07b2  
    Successfully built c80301fa07b2  
    Successfully tagged hello-world:latest

As you can see, it ran the steps in our Dockerfile and output a Docker image. The first time you try it, it will take a few minutes; after that it starts using the cache and builds much faster, with output like the one shown above. Now, try the following command in your terminal to see whether your image is there:

    # Get a list of images on your host 
    docker images

It should list the images on your host, something like this:

    REPOSITORY    TAG      IMAGE ID      CREATED         SIZE  
    hello-world   latest   c80301fa07b2  22 minutes ago  896MB
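896MB is quite large for a hello-world; most of that is the node:8 base image (the discussion below mentions slimmer bases). If you rebuild a lot and want to reclaim space later, the standard cleanup command is:

    # Remove the image when you no longer need it
    docker rmi hello-world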

Run Docker Container

With our image created, we can spin up a container from it.

    # Default command for this is: docker container run <image-name>
    docker container run -p 4000:8081 hello-world

This command is used to create and run a docker container.

-p 4000:8081 — this is the publish flag; it maps host port 4000 to container port 8081, which we opened through the EXPOSE command in the Dockerfile. Now all requests made to host port 4000 will be forwarded to container port 8081.

hello-world — this is the name we gave our image earlier when we ran the docker build command.

You will receive some output like this :

    app listening on port 8081!
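Note that this command keeps the container attached to your terminal, which is why we open another terminal below. If you would rather run it in the background, Docker's standard -d flag works here too (the --name is just an example):

    # Detached variant: runs in the background and prints the container id
    docker container run -d -p 4000:8081 --name my-node-app hello-world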

If you want to enter your container and attach a bash shell to it, you can run:

    # Enter the container
    docker exec -ti <container id> /bin/bash

In order to check whether the container is running, open another terminal and type:

    docker ps

You should see your running container, like this:

    CONTAINER ID      IMAGE         COMMAND                   CREATED
    <container id>    hello-world   "/bin/sh -c 'node in…"    11 seconds ago

    STATUS            PORTS                    NAMES
    Up 11 seconds     0.0.0.0:4000->8081/tcp   some-random-name

It means our container with id <container id>, created from the hello-world image, is up and running and listening on port 8081.
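When you are done, you can stop (and optionally remove) the container with the standard commands, using the id from docker ps:

    # Stop the running container, then remove it
    docker stop <container id>
    docker rm <container id>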

Now our small Node.js app is completely containerised. You can open http://localhost:4000/ in your browser and you should see something like this:

1.3 Containerised Node.js App
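Or, if you prefer the terminal, a quick check with curl (assuming it is installed on your host) should print the same response:

    curl http://localhost:4000/
    # Hello World!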

Voilà, you have containerised your first app.


Akanksha Sharma

@akanksha_9560

Javascript Dev , Hobbit of The Shire

Discussion

 

Oh, also you can get more bang for your buck...

if you use

FROM node:8-alpine

or

FROM gcr.io/distroless/nodejs

Because... Size matters 😜, and so does security 🔒

Image                      Size
node:8                     681MB
gcr.io/distroless/nodejs   76MB
node:8-alpine              69.7MB
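For reference, switching the base is a one-line change to the Dockerfile from the article; everything else stays the same (a sketch, assuming the app has no native dependencies that need extra build tools on alpine):

    # Dockerfile (alpine variant)
    FROM node:8-alpine
    WORKDIR /app
    COPY package.json /app
    RUN npm install
    COPY . /app
    EXPOSE 8081
    CMD node index.js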
 

You are right Derek, slimming your images is important, but then I think that is a vast agenda: having your multi-stage builds, trimming your code, among other things :)

 

Indeed. But hypothetically, if you could only do one thing, utilizing alpine or distroless is low-hanging fruit with a huge ROI.

Because even if you do a multi-stage build without it, you won't trim much in comparison.

Image                           Size
node:8                          681MB
node:8 with multi-stage build   678MB

👆🏽is a basic Hello World express app
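To illustrate the point, here is a rough multi-stage sketch for the same app. On its own it barely shrinks the image, because most of the weight is the node:8 base; combining it with a slimmer final stage (alpine or distroless), as above, is where the savings come from:

    # Stage 1: install dependencies on the full image
    FROM node:8 AS build
    WORKDIR /app
    COPY package.json /app
    RUN npm install --production
    COPY . /app

    # Stage 2: copy only the app into a slimmer base
    FROM node:8-alpine
    WORKDIR /app
    COPY --from=build /app /app
    EXPOSE 8081
    CMD node index.js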

Docker Slim: Hold my beer 😅

Take a look at docker-slim - dockersl.im/

🍺held! Very cool

Confirmed! They've successfully implemented the middle-out compression 😆

 

Earlier when I started on Docker tutorials online, I couldn't understand how different OSs make it possible to run an isolated environment without having a VM.

Only later did I understand that Docker is designed for Linux and uses kernel-level isolation features built into the Linux OS. When you install Docker on Windows or Mac, it runs inside a Linux OS that is itself running on a VM inside that Windows/Mac computer.

I would really like every Docker tutorial to include a line clearly saying that "a container is all about filesystem/resource isolation, which is a system-level feature built into Linux, and Docker is a tool that abstracts this feature."

 

So, if we want to use Docker in Windows or Mac, we will end up using VM technology at the end of the day. Is that correct?

 

Correct. From Wikipedia: "Docker on macOS uses a Linux virtual machine to run the containers. It is also possible to run those on Windows using Hyper-V or docker-machine."

Windows has two types of containers — Windows Server Containers and Hyper-V Isolation — in which Hyper-V is a VM. More details are here - docs.microsoft.com/en-us/virtualiz...

I understand that production servers are running on Linux machines.

--

Note: Microsoft will soon start shipping a full Linux kernel within Windows. This may change how Docker runs in Windows computers. Let's wait and see.

 

Still the same kind of article, without a proper solution for working with IDE features such as autocomplete.

node_modules is installed in the container, and you also need to install it locally, and try not to clobber it when you mount your source, if you want the best of the two worlds. (Currently working on it and writing an article describing the need and the issue every developer meets.)

Otherwise, nice article :)

 

You should check out Gitpod. It builds your image together with the project within it, deploys it in the cloud and provides a VS Code-like browser IDE with autocomplete and so on. Also, VS Code released remote extensions which are deployed in containers. Although I'm not sure how they get files from the host to the container OS; if they mount them, then you will get the same issues.

 

If I understand you right, VS Code just implemented an extension so you can use VS Code in the container environment. It needs the Insiders build. They just announced this yesterday.

 

In order to solve this problem, just develop locally on your container no?

 

That's not possible; how can I use an IDE that way and be productive? Using Vim or another text editor is not the solution at all :)

 
 

I'll try to answer with my bad English :)

It's a very good question. Docker creates a layer for each command such as COPY/ADD (and some others; you need to read the documentation). At build time, Docker looks for changes: if a change is detected in some layer, all the layers after it will be rebuilt.

For example, assume we have a Dockerfile like this:

 WORKDIR /app  
 COPY . /app  
 RUN npm install  

If we change the source code very frequently, Docker will execute npm install for every change. That's very bad and not efficient, because for each little change you reinstall all the Node.js packages (and if you are not using a volume or cache, it will take a loooong time).

In this case:

 WORKDIR /app  
 COPY package.json /app
 RUN npm install
 COPY . /app  

we will execute npm install only when package.json changes (some package added or removed, and so on).

 
 

Thank you for the detailed response! Your English is perfect btw.

 

Great explanation. I too had the same question.

 

Thank you very much for this write up! It's very easy to understand and I have been curious about docker because I see the term often.

In your example you have set up a node server. As of right now I don't need to do that because I'm focused on front end things. Do you see any reason to use docker for front end development? (I.e. simple webpages that don't require backend services)

Also, one thing that is unclear to me about docker containers is: where are the files (for example, if I make hello.txt in the container)? Are those files in the docker file folder? Can I access them if I'm not using the docker container? In a docker container, if I run cd ~, where does that take me? Virtual or real home?

 

Hey Dan,

Thank you for your feedback :)

While Docker is not a necessity for frontend development, it is advisable to use it. For example, even for simple webpages you need a server which serves them to consumers, and that entire setup should be dockerized (in my opinion) so that other devs do not have to put effort into setting up their local server. It just makes development easier. The next part is deployment:
while the config for that is a little more complex, in simple terms Docker would allow you to scale it as and when needed based on traffic.

For your second question: say you have a my-app folder, then your folder structure is going to be like this:
Docker folder structure

Then in your Dockerfile you can write:

    # Dockerfile
    FROM <base image that you want>
    COPY hello.txt <wherever you want the file inside the container, can be a folder or not>
    EXPOSE 8081

Also, for your last question: inside the container, the paths that you navigate point to virtual paths inside the container.
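A small sketch of what that means in practice (the /root path is just what you would typically see on the node:8 image, which runs as root):

    # Open a shell inside the running container
    docker exec -ti <container id> /bin/bash

    # ~ is the container's own home directory, not your host home
    cd ~ && pwd    # e.g. /root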

 

Wow thank you! I understand now about the COPY part in the docker file. I had to go back up and re-read your explanation. Makes sense now.

I'm certainly interested in trying these things out - but at the moment I think it will introduce a great many other things that I need to learn and I think I will wait for a bit. But your post certainly helped clear some things up in my head about what docker containers are all about.

So please correct me if I'm wrong, but I'm curious about the following:

  • locally you have a container set up for development. You have this docker file and etc.
  • locally anyone on your team can then easily set up the same environment because they can use the same docker file
  • server side you also have the same docker file but that container is constantly running to serve a website or whatever

I assume that you would also use version control for all of this, including the docker file. I also assume that the container running on the server has some kind of way to update files from version control when a commit is done on a certain branch. (I know that hooks exist for this sort of thing with GitHub but I don't have experience with this yet.)

If you have the time, I'd be interested to read more about this sort of stuff in another dev.to post. Diagrams are helpful for sure! There's a lot of moving parts for all of this technology and it's a challenge to see how things fit together.

Hey Dan,

Yes, using the Dockerfile you can build and share your image. There is a registry (a kind of version control) for Docker images, hub.docker.com/, where you can store and share your images.
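For example, pushing the image from the article to Docker Hub looks roughly like this (<your-username> is a placeholder; this assumes you already have a Docker Hub account):

    # Log in, tag the local image under your account, and push it
    docker login
    docker tag hello-world <your-username>/hello-world:1.0
    docker push <your-username>/hello-world:1.0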

I did not understand your last question, could you please elaborate?

 

Thanks for the post 👍🏽. Not only did I learn what Docker is, I also learned how to use Docker with Node.js 🙂

 

Thaaaaaaaaaank you! I was looking for a tutorial like this. <3

 

I do more frontend at work, and I just needed a quick intro to the concepts so I can grasp what the DevOps guys are talking about. 😅
Didn't want to sit through a 2-hour tutorial, so this is exactly what I needed.
Thanks for this short-and-sweet demo!

 

This is awesome. I absolutely love this platform because of such information. I'm going to complicate things a bit and try it with ASP.NET.

 

Microsoft has really nice base images that will do this. It's a bit tricky because the SDK you need to compile is separate from the runtime you need to run it, so they do a 2-pass build.

It's not bad, but I would definitely start with their examples: docs.microsoft.com/en-us/aspnet/co...

 
 

Hello Akanksha:

Great tutorial - now I have a better understanding of Docker. But when I run the following command, I get -> /bin/sh: 1: mode: not found?

ubuntu@ubuntu-VirtualBox:~/my-node-app$ sudo docker container run -p 4000:8081 hello-world
/bin/sh: 1: mode: not found

Thanks and much appreciated.
/Biju

 

Hi Biju,

It is hard to say what could be causing this error. Maybe you need to source /bin/sh. I may be wrong; I do not have a lot of experience with Ubuntu. Sorry :(

 

Nice post!

Here is my little contribution for front-end projects with Docker and a bundler (also valid, with a few mods, for backend/full-stack projects).

dev.to/joelbonetr/simplifying-dock...

Hope it helps :)

 
 

The first few paragraphs are wonderful; you have a knack for explaining things in simple language, keep it up. +1 from my side.

 

Hi, I'd like to translate this excellent tutorial into Chinese. The translated text will be published at nextfe.com.
Can you give me permission?

 

Yes sure, send me the link when it is published :)

 

Just finished translation: nextfe.com/docker-for-frontend-dev...

During translation, I noticed some possible typos:

It is a running instance of docker image. there can be many containers running from same docker image.

t -> T

containerise very node.js simple app

a very simple node.js app

 

Thanks for explaining the Dockerfile line by line.

 

I build my react apps with webpack and point nginx to the dist folder. Would there ever be a point in using docker for a setup like this?

 

Really nice article 🙌
but I think you should do a part 2 with docker-compose and volumes so backend devs can use it too. 👍

 

Thanks for the intuitive tutorial!

 

Is it actually effective? I just started with Docker, but it just slows down my development. Nice explanation btw :')

 

Actually, I have implemented the Docker approach earlier, but for front-end development I found it of no practical use. In front-end we are mostly dealing with the browser and styling, and Docker is more deployment-oriented than development-oriented. Keeping your platform OS separate from the base OS is a good practice, but if you depend on services, you can dockerize those services and consume them in the frontend development environment as locally available API endpoints; since containers are processes, it's better to use them like that.

You can dockerize your build system, probably attach Jenkins with Docker to create a build platform for your frontend application, but it's best to run that on a separate cloud system rather than your own machine. Even though deploying containers is super simple, the whale eats a lot if you don't know what to manage and how. Secondly, you will need to tell the devs to configure their machines for local development, because containers are not VMs. People working with Docker will know what I mean.

 

I had the same experience trying to isolate my dev environment locally with Docker and minikube. You have to install, configure and manage them, and spend your disk space, memory and CPU cycles on running it. It is better when these techs are just implementation details sitting in the cloud. For example, Gitpod lets you develop dev.to itself from the browser: dev.to/ben/spin-up-a-local-instanc... The setup is automated with a Docker image launched in the cloud, but you don't need to care about it.