Introduction
In today’s fast-paced software development landscape, Docker is not just a tool for deployment but a transformative force in development workflows. While its role in streamlining production environments is well-known, Docker's impact on development is equally significant and often overlooked.
In this blog, we'll discover the hidden gems of using Docker for your everyday work and development. We'll go over the current problems in modern application development so that it will be clearer how Docker can solve these problems and make our lives less miserable. We will also look at practical examples of how to set up a Dockerized workflow in 2 different ways and compare them. By the end of this blog, you should be able to convince yourself or your manager why you should containerize your development workflow!
Let's dive into it!
Challenges in Traditional Development
Complex Setup and Configuration
At this point, it has become the norm that onboarding a new team member and installing and configuring the project can take a day or more, following a huge (and possibly poorly documented) README, with a lot of debugging and StackOverflow-ing along the way. I am not talking about simple Node.js apps that install with a single npm install; I am talking about huge applications and open-source projects that require you to install dozens of dependencies and tools just to run them. I still remember spending days on installing a particular open-source repo and still not getting the tests to run properly...
Inconsistent Environments, Dependency Conflicts
Each developer has their own environment, with their own software packages and dependency versions installed. This leads to more problems when installing and running the project, and thus to more debugging and wasted time.
Another major issue: what if you have a Node.js backend that runs on Node.js v16 and a frontend app that runs on Node.js v20? Which version do you install? Even with NVM (Node Version Manager), only one Node.js version is active in a shell at a time, so running the backend and frontend together to test them becomes a hassle.
Manual Testing Challenges
How would you go about testing the full project stack? You would need to install all the projects and run them together, and you are likely to hit one of the problems above.
Need to learn load balancing, test out a load balancer, or try scaling your application? You would need to run multiple instances of your application. How difficult that is depends on your runtime, but you will definitely have to assign different ports manually.
Want to learn SQL databases or Redis? Get ready for lengthy installation steps.
And at the end of all this, you can only hope that the new software and dependencies you installed didn't mess up your system or impact your daily usage. By the way, good luck uninstalling a project's dependencies.
Now to the solution!
How Docker can fix your life (Err, your development problems)
Eases Onboarding and Contribution to Complex Projects
What if, after cloning the repo, a single command, docker compose up --watch, just magically ran the project? No installations or configurations. So to onboard a new member, or to contribute to a huge new project, you run a single command and that's it! All changes made to the code will instantly be reflected in the container, allowing for a seamless and effortless development process.
Simplifies README Files
No need to spend hours documenting how to run the project, only to find out that you are the only one who understands what you wrote. That section of the README now becomes even shorter than this paragraph: "run docker compose up --watch and all your files will automatically sync."
(Well, it could still be important, but you get the point.)
Ensures Environment Consistency, Simplifies Dependency Management, and Offers Portability
With Docker, you can ensure that all developers are running the same environment, the same software versions, and the same dependencies, no matter what OS or configuration they have. Develop everywhere, run everywhere.
Provides Isolation and Handles Conflicting Dependencies Securely
Remember the problem of different Node.js versions? Now you can run both apps in separate containers, each with its own Node.js version, and they won't interfere with each other. Everything is isolated. Everyone is happy. Want to uninstall a project? Just docker compose down and that's it; no need to worry about uninstalling dependencies.
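As an illustrative sketch (the service names and Dockerfile locations here are hypothetical), a single Compose file can run both apps side by side, each pinned to its own Node.js version:
services:
  backend:
    build: ./backend   # its Dockerfile starts with e.g. FROM node:16-slim
    ports:
      - 3000:3000
  frontend:
    build: ./frontend  # its Dockerfile starts with e.g. FROM node:20-slim
    ports:
      - 5173:5173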
Enables Practical Learning and Experimentation
Want to learn a new technology or tool? Just pull a Docker image and run it. No need to install anything, no need to worry about messing up your system. Could be a database, a cache, a load balancer, a message broker, etc.
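For example, spinning up a throwaway Redis or Postgres to experiment with takes a single command each (the image tags below are just examples):
# Throwaway Redis on its default port; --rm removes the container on exit
docker run --rm -p 6379:6379 redis:7
# Throwaway Postgres; the data disappears together with the container
docker run --rm -p 5432:5432 -e POSTGRES_PASSWORD=secret postgres:16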
Learning frontend and want some ready-made backend to test your frontend with? No need to install a backend and figure out how to run it, just pull a Docker image and run it. Same for backend developers.
Want to learn system design and how to scale your application? Just run multiple instances of your application in separate containers and test them out, with a load balancer maybe. No need to worry about ports or configurations.
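With Docker Compose, for instance, scaling is one flag away (assuming the service does not pin a fixed host port, so each instance can be assigned its own):
# Run three instances of the backend service; put a load balancer
# in front of them to distribute traffic
docker compose up --scale backend=3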
Facilitates Consistent Testing Across Versions
Want to test something on an older version of your project? Just keep track of your Docker images and run the older version. No need to roll back your code or worry about conflicting dependencies.
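For example, if you tag the images you build, running an older version is a single command (the tag below is hypothetical):
# Run a previously built image without touching your current checkout
docker run --rm -p 3000:3000 my-nodejs-app:v1.4.0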
Want to test across different environments? Just run your project in different containers with different configurations.
Want to test your whole stack? Easy as well. If you have some Docker knowledge, you can do anything you want.
Let's get our hands dirty!
Setting up Dockerized Development Workflow
For this section, we will be working on a Node.js application, assuming it runs on Node.js v16.
The first step is creating the Dockerfile.
Dockerfiles allow us to describe the installation and configuration steps that are run to build the final usable image. This is the part where you install all the dependencies and tools.
# Dockerfile
FROM node:16.20.2-slim

WORKDIR /app

# Copy the manifests first so the dependency layer is cached
# until package.json or package-lock.json actually change
COPY package*.json ./

ENV PORT=3000
EXPOSE $PORT

RUN npm install --force

# Copy the rest of the project
COPY . .

ENTRYPOINT [ "npm", "run" ]
CMD [ "dev:docker" ]
dev:docker is a package.json script defined as nodemon server.js --legacy-watch.
A very simple Dockerfile: it first copies package.json and package-lock.json, then installs the packages, and then copies the rest of the project. Copying the manifests separately lets Docker cache the installed dependencies until they actually change.
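For reference, a minimal sketch of the relevant part of package.json (the rest of the file is omitted):
{
  "scripts": {
    "dev:docker": "nodemon server.js --legacy-watch"
  }
}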
Creating the Docker Compose file
Docker Compose files allow us to specify how the Docker image should be run and the specifications of the Docker container. Here we specify how changes will propagate from our local machine to the container.
There are two ways to set up the Dockerized development workflow:
Using Volumes
The old way: you mount your project directory into the container, so any changes you make in your project are reflected in the container.
Docker Compose file:
services:
  backend:
    container_name: my-nodejs-app
    image: my-nodejs-app:dev
    build: .
    ports:
      - 3000:3000
    volumes:
      - ./:/app
      - NOT_USED:/app/node_modules

volumes:
  NOT_USED:
Notice the volumes section: in the first line, we mount the current directory to the /app directory in the container. However, we want to exclude the node_modules directory from the mount, so we mount an unused named volume over the node_modules directory in the container. The end result is that any change in our project is now reflected in the container, except for node_modules.
In other runtimes, you exclude the equivalent of the node_modules directory for that runtime.
For this method to work, the command running inside the container must be a dev command; it should watch for changes and restart the server automatically. If you install a new dependency, you will need to manually rebuild the image and restart the container.
You run it via docker compose up.
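For example, after installing a new dependency, the manual rebuild typically looks like this:
# Rebuild the image and recreate the container in one go
docker compose up --build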
Using Develop Specification
The new way: we will use the new develop specification in Docker Compose, which was made exactly for this purpose.
Docker Compose file:
services:
  backend:
    container_name: my-nodejs-app
    image: my-nodejs-app:dev
    build: .
    ports:
      - 3000:3000
    develop:
      watch:
        - action: sync
          path: ./
          target: /app
          ignore:
            - node_modules
        - action: rebuild
          path: package.json
In develop.watch, we have two items that define what happens when we change a file in the project directory. In each item we define the action to take, the path to watch, and the target in the container. Actions can be sync, rebuild, or sync+restart. sync copies the changed file into the container, sync+restart copies the changed file and then restarts the container, and rebuild rebuilds the image and then recreates the container from it.
Our first item tells Docker to sync any changed file in the current directory on our machine to the target path in the container, excluding node_modules. I am using sync here because I am running a dev command inside the container, but I could have also used a production command with sync+restart.
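For illustration, that variant would look roughly like this (a sketch, assuming the container runs a plain production command such as node server.js, with no file watcher inside):
develop:
  watch:
    # sync+restart: copy the change into the container, then restart it
    - action: sync+restart
      path: ./
      target: /app
      ignore:
        - node_modules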
The second item tells Docker to rebuild the image and restart the container whenever we change the package.json file, for example when we install a new dependency.
You run it via docker compose watch or docker compose up --watch.
Learn more here: https://docs.docker.com/reference/compose-file/develop/
You can already see why the new method is superior in every way.
No longer do we need to worry about mounting, volumes, or manually rebuilding the image. Now we declare what we need and Docker does it for us. We did not need a workaround for excluding node_modules, we are not forced into using a dev command, and we can properly handle new dependencies without manual intervention.
With the Dockerfile and Docker Compose file made, you are ready to start your development! All changes you make will be reflected in your application.
Anyone cloning your project and running docker compose up --watch will immediately start the development process without any additional steps.
Conclusion
With that, we have seen the problems in traditional development, how Docker solves them, and the benefits Docker brings to everyday development work. We have also set up a Dockerized development workflow in two different ways and compared them.
I would especially like to ask open-source maintainers to consider Dockerizing their projects. Instead of pages of installation steps, you can provide the required Docker image and Docker Compose file and let contributors run the project with a single command. This will make your project more accessible and easier to contribute to.
I hope you enjoyed this blog and learned something new!
Top comments (4)
That's not actually all there is to it.
I have an entire alpine dev container with tmux, helix and a lot of other binaries I've built from source in other build steps to trim down on file bloat. It's 1.1 GB despite shipping with python, go, node, lsps, lots of additional utility programs, and docker.
It's possible to do other quality of life stuff like bind mounting folders from the host, or joining docker networks to talk to other containers. Ended up ditching vs code.
Interesting! It's very powerful indeed
amazing work
lovely