When Docker came out in 2013, the benefits being touted were pretty clear. "Full isolation from the host machine and other apps", "perfectly-reproducible environments", and the "works on my machine" chant finally a thing of the past. Considering that most of my troubleshooting time was spent getting code that worked perfectly in development to run properly in production, Docker seemed like a great antidote.
However, what held me back from adopting it for another five years or so was that it was just too difficult to understand. I don't like working with tech stacks I don't understand, because that's just another thing I'd have to troubleshoot down the road when something inevitably goes wrong. Sure, I had to do a little more work to configure the server where my code would ultimately live and run, and sometimes deal with some painful config issues, but at least I had encountered many of those issues before and knew how to fix them.
Today, Docker is so pervasive that it's actually become quite difficult to avoid. It seems like just about every project on GitHub has a Dockerfile and instructions in the readme on how to use it. Fortunately, it has, in my opinion, started to live up to its promise of actually making our lives as devs easier. I still haven't quite gone "all-in" with Docker, but there are areas where it has genuinely improved my happiness and productivity.
There are certain stacks that I find a huge pain to set up again and again, when switching computers for example. These tend to be things like relational databases (MySQL and Postgres) and any type of app that normally has a really crazy setup procedure (like self-hosted GitLab).
How many times have you run into issues setting up MySQL on your computer? Especially on Linux? I always seem to struggle with that one. Adding the right package repository in Ubuntu, getting through the CLI installation unscathed, and booting it up. All of the different configuration possibilities allow a lot of things to get messed up on that initial install and bootup.
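For contrast, the manual route on Ubuntu looks roughly like this (a sketch from memory; exact package names and prompts vary by Ubuntu release and MySQL version):

```shell
# Manual MySQL install on Ubuntu (roughly; details vary by release)
sudo apt-get update
sudo apt-get install mysql-server   # interactive prompts along the way
sudo mysql_secure_installation      # more prompts: root password, remote access, etc.
sudo systemctl start mysql          # and hope it boots cleanly
```

Every one of those steps is a place where a stale package repo or a bad answer to a prompt can leave you with a half-configured server.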
Or, I can just do this, and have a fully-working MySQL server ready to go!
docker run --name mysql-5 -p 3306:3306 -e MYSQL_ROOT_PASSWORD=my-secret-pw -d mysql:5
From there, I just use my favorite MySQL editor (Sequel Pro, HeidiSQL, etc.) and I can connect any service on my machine that needs MySQL, Docker or not, to use it as needed. And, I can boot up as many as I want (or as many as my hardware would allow) and just change the exposed port on those new instances (like 3307, 3308, etc.) so there isn't any conflict.
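For example, a second instance only needs a different container name and a different host-side port; the container's internal port stays 3306. (The name and password below are placeholders.)

```shell
# A second, independent MySQL 5 container, mapped to host port 3307
docker run --name mysql-5-second -p 3307:3306 -e MYSQL_ROOT_PASSWORD=my-secret-pw -d mysql:5
```

Anything on the host then connects to 127.0.0.1:3307 for this instance, while the first one keeps 3306.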
Also, I like to test emails on my local machine without relying on an external service to handle it. Instead, I can just throw in a quick SMTP-trapping service called Mailhog that allows me to actually see the output of my locally-tested email!
docker run --name my-mailhog -p 1025:1025 -p 8025:8025 -d mailhog/mailhog
This command is even simpler. Just provide a name of your choosing to identify the container, the ports you want exposed (in this case, 1025 for the SMTP port and 8025 for the web portal to see the trapped emails), and the name of the image on Docker Hub (mailhog/mailhog). This command will boot up the server and allow you to see your trapped emails. It's incredibly easy.
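To sanity-check the setup, curl can speak SMTP directly to the trap, and Mailhog also exposes a small JSON API on the web port. A rough sketch; the addresses and file name below are made up for illustration:

```shell
# Send a test message through Mailhog's SMTP port with curl
printf 'Subject: Test\n\nHello from Mailhog!\n' > msg.txt
curl --url smtp://localhost:1025 \
     --mail-from sender@example.test \
     --mail-rcpt recipient@example.test \
     --upload-file msg.txt

# List the trapped messages as JSON via Mailhog's API
curl http://localhost:8025/api/v2/messages
```

The message should also show up immediately in the web portal at localhost:8025.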
So, I could use Docker with my main app codebase too, but for now I just continue to run it with NodeJS directly, without worrying about Docker-izing it. Also, these examples aren't necessarily helpful for deploying to production (which Docker can certainly help with, given the right knowledge); they were just meant to demonstrate setting up a nice development environment with less fuss.
If you're finding it difficult to get into and understand Docker, or feeling left behind by the hype train, don't worry. You can make Docker work for you, without feeling like you need to ditch your VMs and everything not totally containerized. I have found a middle ground that is working very well for me at the moment. As time goes on, I'm sure I'll adopt more and more pieces as it makes sense. I hope you can do the same.