Docker is a great containerization platform with tons of features out of the box. So, in this post, we are going to skip the traditional way of hosting an app with packages like pm2 (although we can still use pm2 inside Docker).
First of all, we will start by making a Dockerfile. A Dockerfile is a way to package your application. You can learn the basics of Docker from the link.
The content of the Dockerfile will be like this:
```dockerfile
FROM node:10

WORKDIR /usr/src/app

COPY package*.json ./
RUN npm install

COPY . .

EXPOSE 8080

CMD [ "node", "server.js" ]
```
This will tell the Docker engine to use the node:10 image and perform the steps. Though the file is self-explanatory, I will still do a little bit of explaining:
- First, it will pull the node:10 image from Docker Hub if it cannot find it on the machine
- Then it will use the directory /usr/src/app as the working directory for the project
- Thirdly, it will copy package.json and package-lock.json into the working directory and run npm install, which will in turn install all the required dependencies
After the dependencies are installed, it will copy all the files from the host machine into the container. Since we already have node_modules inside the container, we may want to skip copying it from the host. This can be done via a .dockerignore file. Think of .gitignore, but for Docker.
A sample .dockerignore file
```
# Logs
logs
*.log

# Runtime data
pids
*.pid
*.seed

# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov

# Coverage directory used by tools like istanbul
coverage

# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
.grunt

# node-waf configuration
.lock-wscript

dist
node_modules
server/*.spec.js
```
The EXPOSE instruction, followed by a port number (8080 in our case), declares the port the container listens on. Make sure to match this with the port used by the app.
The CMD instruction will execute the command passed to it, which is node server.js here. It can even be an npm script like npm start. This should be the command that spins up the server.
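If you go the npm start route, the corresponding script lives in package.json. A hypothetical minimal example:

```json
{
  "name": "node-web-app",
  "version": "1.0.0",
  "scripts": {
    "start": "node server.js"
  }
}
```

With this in place, CMD [ "npm", "start" ] and CMD [ "node", "server.js" ] are equivalent.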
Go to the directory that has your Dockerfile and run the following command to build the Docker image. The -t flag lets you tag your image so it's easier to find later using the docker images command:
```shell
docker build -t <your username>/node-web-app .
```
Running your image with -d runs the container in detached mode, leaving the container running in the background. The -p flag redirects a public port to a private port inside the container. Run the image you previously built:
```shell
docker run -p 49160:8080 -d <your username>/node-web-app
```
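Once the container is up, you can sanity-check it from the host. These are standard Docker CLI commands; the host port is the one from the run command above, and the container id comes from docker ps:

```shell
# List running containers to get the container id
docker ps

# The app should respond on the mapped host port
curl -i http://localhost:49160

# Tail the app's logs, then stop the container when done
docker logs <container id>
docker stop <container id>
```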
However, this approach doesn't reflect changes made to your code after the image is built, so for every change you have to perform the build and run steps again and again.
Luckily, Docker comes with something called volume mapping, which, instead of copying the files, maps the working directory in the container to the files on the host machine. So every time a file in your app changes, the change is automatically reflected inside the container as well, and you won't need to build the image again.
To use this approach, the Dockerfile becomes:
```dockerfile
FROM node:10

WORKDIR /usr/src/app

COPY package.json .
RUN npm i

COPY . .

EXPOSE 8080

CMD [ "node", "server.js" ]
```
Once you have modified the file, you can build the image as you did previously
To run the built image, though, there is a slight change:
```shell
docker run -p 49160:8080 -v $(pwd):/usr/src/app -d <your username>/node-web-app
```
pwd is the command to get the current directory on Linux, so make sure to use the run command while you are inside the app directory.