Emarsys Craftlab

Frontend Development with Docker simplified

Gábor Soós ・ 4 min read

Docker is a great tool that helps developers build, deploy, and run applications more efficiently in a standardized way. For frontend applications, we only need Docker during local development, because we deploy the built static files to a static hosting provider. In this case, can we live without a custom Docker image? Can we have the same development experience that we had without Docker? Yes, and it is easier than you think.


Assume an application where we only have to press start, and everything is running. This setup can be any application generated by the React, Vue, or Angular CLI. For demonstration purposes, I'll use my Vue Todo application.

During development, we will be doing the following steps:

  • install dependencies with npm install
  • start the application with npm start
  • modify a file and check the changes in the browser
  • use code-completion of modules in the editor
  • add a new dependency to package.json and install it

Custom Dockerfile

If you search the web for frontend development with Docker, you can find many articles using a custom Docker image. Let's have a look and see how it works.

Custom Dockerfile

The Dockerfile starts by defining the base image (Node.js 12.x) that we will build upon (FROM) and setting the working directory to the /app folder (WORKDIR). Every command starting with RUN or CMD will have this folder as its default working directory.

The next step is to copy the source files (COPY) and install the dependencies. We copy the package.json separately from the rest of the files. Why? Because Docker caches every step of the Dockerfile when building the image multiple times. When we don't modify anything and build the image again, it won't do anything, as the steps are cached. If we change a JavaScript file, Docker will rerun the commands from COPY . /app. When we modify the package.json file, Docker will rerun the commands from COPY package.json /app.

By default, applications running inside the container on a specific port are not available on the host machine. We have to make the port available (EXPOSE). Only after this can we type the URL in our browser (http://localhost:8900) and see the result.
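Putting the steps above together, the Dockerfile could look something like the following sketch. The node:12 base image and port 8900 come from the surrounding text and commands; the exact file in the original project may differ slightly:

```dockerfile
# Base image: Node.js 12.x
FROM node:12

# All subsequent RUN/CMD commands use /app as their working directory
WORKDIR /app

# Copy package.json first so the "npm install" layer stays cached
# until the dependencies actually change
COPY package.json /app
RUN npm install

# Copy the rest of the source files; editing any of them
# invalidates the cache only from this step onward
COPY . /app

# Make the dev server's port reachable from the host
EXPOSE 8900

CMD ["npm", "start"]
```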

To run this image, we have to build it and run the created container.

# Build the image: docker build -t <image-name> <path-to-build-context>
docker build -t client .
# Run the image: docker container run -p <host-port>:<container-port> <image-name>
docker container run -p 8900:8900 client


The above Docker image works but has multiple drawbacks:

  • Files generated inside the container are not visible from the host machine, only inside the container. This means we won't see the node_modules folder on our host machine, and because of this, we lose code completion in the editor. We also can't commit the generated package-lock.json to source control, because it is not available on the host machine either.

  • We have to stop, rebuild, and rerun the container on dependency and file changes. We lose live-reload.

Meet Docker Compose

Docker can build single images and run the built containers. Docker Compose goes a step further, as it can build and run multiple images at the same time. In this tutorial, we won't use its multi-image capability; we'll use it only to overcome the disadvantages of the previous example.

While we could reuse the previous Dockerfile with Docker Compose, we will use it in a way that skips writing a custom image altogether.

Docker Compose

Instead of defining the image with a sequence of commands, Docker Compose uses the YAML config file format. Under the services key, the image for the Vue application is named client. It is the equivalent of the naming in the docker build -t <image-name> command. The description starts the same way here: defining the base image (image) and setting the working directory (working_dir).

The key difference comes from the volumes property. By using it, the local folder is synchronized with the container. If we execute the npm install command in the container, the node_modules folder will also appear on the host machine: we get code completion and the lock file.

The application also starts in the container (command: sh -c "npm install && npm start"), and exposing the port to the host machine (ports) is still necessary for browser access.
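Based on the description above, the docker-compose.yml could be sketched like this. The client service name, node:12 image, and port 8900 mirror the earlier commands; treat it as an approximation of the original config rather than an exact copy:

```yaml
version: "3"
services:
  client:
    # Use the stock Node.js image directly, no custom Dockerfile needed
    image: node:12
    working_dir: /app
    # Synchronize the local folder with the container, so node_modules
    # and package-lock.json appear on the host machine too
    volumes:
      - .:/app
    # Install dependencies and start the dev server inside the container
    command: sh -c "npm install && npm start"
    # Expose the dev server's port to the host for browser access
    ports:
      - "8900:8900"
```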

To run this setup, a single command builds the image and starts the container.

# Build the image and start the container
docker-compose up

If you look at the two solutions, they are nearly identical. There is a strong correspondence between the commands in the Dockerfile and the configuration fields in the docker-compose.yml config file. The only difference is how they handle mounted files, and this is what solves our synchronization issue.

Dockerfile vs Docker Compose


When doing local development, it is important to have a fast feedback loop and code completion. If we go with the pure Docker solution, we lose both. We have to ask Docker's big brother, Docker Compose, for help with its folder synchronization. By migrating our setup to Docker Compose, we get back the speed and the code completion. I hope this trick helps you and saves a ton of development time.

Special thanks to iben for helping me with the setup.


So I'm having trouble getting mine to work. I just created a plain index.js with console.log('hello world'); my startup script runs nodemon index.js, which successfully runs the index file and monitors for changes. But when I change index.js in VS Code, it never updates in the container. It just says "watching for changes before restart". This works locally, BTW: if I change the index while running without Docker, it detects the change and updates. Is something wrong with my volumes? I set COMPOSE_CONVERT_WINDOWS_PATHS=1


I've tested the setup again with Vue CLI and it detects changes. Next week I'll be working on the Node.js workflow.


Can you show an example repo?


Sorry it took so long for me to get back to you, here is the repo github.com/VictorioBerra/docker-co...

You should just have to clone it, run docker-compose up, then edit index.js and notice that nodemon never sees the file change and does not restart the process.

In the meantime, I've put together a working Node.js Express app setup that rebuilds with Nodemon on file changes github.com/blacksonic/node-docker-...

I don't know what Nodemon watches by default, but to be sure, I've specified what to watch; maybe that is the missing piece in your code.


Good summary, thanks!

I wanted to try VS Code Server in Docker (docs). Does anybody have experience with it?
It sounds like it would solve the same issue that you solved with Docker Compose.


I've used it primarily for side projects and it's worked pretty well. VS Code does most of the heavy lifting for you, so the setup is pretty easy.


You can also achieve everything in the compose file through CLI arguments to docker. I wouldn't recommend using the CLI like that, though; the compose file keeps things repeatable.

I would recommend that people learning Docker learn how to use the CLI to achieve what Docker Compose does; it'll help you understand what's going on underneath.


Can you share the equivalent CLI command? I would share it in the article.

docker run -v `pwd`:/app -p 8900:8900 node:12 sh -c "npm install && npm start"

In general, even though it might sound rude, docs.docker.com/engine/reference/c... is pretty exhaustive and should be read first.

Nevertheless, your approach of going for a Docker Compose solution helps make it more portable for other users/developers of your code.

Thank you for covering for me! I only just saw the reply comment. In addition, you will need -w /app to change the working directory.

For completeness, for anyone else reading:

  • -v mounts your current path into /app in the container (pwd gets your current path)
  • -p maps the ports. I would also change this to -p localhost:8900:8900, just so it's only accessible from localhost
  • node:12 specifies the container image. Docker Hub hosts the image
  • sh -c "..." is the command to run on entry

Thanks for the article :-)

I am interested in the advantages of developing with Docker. Can anyone quickly name a few? One of them might be that you don't need Node installed (not that big a deal), and another is running it on another OS... I would really appreciate more 😉 What is the greatest advantage?


You can have the same environment as in production (if the Node.js installation doesn't bother you).


Nice article Gábor.
I would recommend using PM2 with watch mode enabled to rebuild and rerun the Node server every time you change some piece of code. This way you don't need to restart the container.


That one is a good point for Node projects 👍


Another thing you can do is create a volume for your node_modules, because in this setup, every time you start your container, you will have to download all the dependencies again.
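One way to sketch that idea (my own untested example, not from the original post) is to layer a named volume over node_modules, so the installed packages persist between container runs:

```yaml
services:
  client:
    image: node:12
    working_dir: /app
    volumes:
      - .:/app
      # The named volume shadows the host's node_modules folder,
      # keeping installed packages between container runs
      - node_modules:/app/node_modules
    command: sh -c "npm install && npm start"

# Declare the named volume so Compose manages its lifecycle
volumes:
  node_modules:
```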


The volume remains there, it only checks for updates...or am I missing something?


Docker noob here, I thought Docker Compose utilized Dockerfiles? So you can have a Docker Compose setup without a Dockerfile?


It can utilize Dockerfiles, but it's not necessary. You only need a Dockerfile when you want to customize the base image.