DEV Community

Are you using Docker for local development?

Mateusz Bełczowski on November 15, 2019

I've been using Docker and Docker-Compose for more than two years and I really can see a lot of benefits when it comes to CI/CD pipelines and deplo...
Rohan Sawant

I personally do this weird bit where I always make sure that development is 'possible' in Docker: I make sure that everything spins up with a single command with docker-compose.

Then I expose the DB Container through a port and connect to it via my dev environment, or my IDE.

This means one could completely depend on docker if they choose to and partially depend on it if they want to.
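
For reference, a minimal sketch of that kind of compose file (service names, image versions, and credentials here are made up):

```yaml
# Hypothetical docker-compose.yml: everything runs in Docker, but the DB
# port is also published so a host-side IDE or dev server can connect.
version: "3"
services:
  db:
    image: postgres:12
    environment:
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: dev
    ports:
      - "5432:5432"   # exposed to the host, e.g. for the IDE's DB client
  app:
    build: .
    depends_on:
      - db
    environment:
      DATABASE_URL: postgres://dev:dev@db:5432/dev
```

With this, docker-compose up runs the whole stack, while anyone who prefers a host-side workflow can still point their tools at localhost:5432.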

Joe Zack

I think this is the ultimate approach. You provide all the pieces so other devs can pick and choose the things they want to run locally. You don't have to work 100% in containers to get the benefits.

Mateusz Bełczowski

I like this approach, thanks for sharing :)

Sriram R

How do you do it? Can you link to some tutorials on how to have a good local dev workflow?

Patryk • Edited

For web projects, I do, regardless of how simple they are.

Pros I see:

  • Dev environment matches production environment closely (I run Linux on my machine, as well as on my servers, albeit Arch on the Desktop, Ubuntu Server in the cloud).
  • Keep every project isolated, and all dependencies outside my host (which could also be achieved to some extent with VirtualEnv, Pipenv, NVM, etc., of course).
  • Each DB service is also isolated. Using project environment variables means that I don't have to bother managing Postgres users and 10 different databases. Instead, I let docker spin up a postgres service for me, and it automagically configures itself.
  • I can start all the services I need at once, and only when I need them. No auto-starting postgres and wasting resources if I want to watch Netflix or play a game on my desktop. Sometimes I may have 4 projects running at once, sometimes one or none.
  • Tests aren't slow at all.

Workflow

I cd into my project directory, start tmux so I can split yakuake into 4-5 windows in the same tab (one for docker, one for front-end, one for back-end, one for git, usually), run docker-compose up, and use that for development. I (usually) have one django+rest-framework container, one postgres, one node container to work on the front-end and serve it, and one nginx to route calls to /api/ to DRF and the rest to Vue - roughly the stack sketched below. I start Django and Quasar (Vue framework) in dev mode, and they monitor changes and rebuild automatically. Then I open the whole project folder in VS Code.
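
A hypothetical compose file for a stack like this might look as follows (service names, ports, and commands are my assumptions, not the author's actual config):

```yaml
version: "3"
services:
  backend:                     # Django + REST framework, run in dev mode
    build: ./backend
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./backend:/app         # bind mount so the dev server sees edits
    depends_on:
      - db
  db:
    image: postgres:12
    environment:               # per-project env vars; Postgres configures
      POSTGRES_PASSWORD: dev   # itself from these on first run
  frontend:                    # Node container running the Quasar dev server
    build: ./frontend
    command: npx quasar dev
    volumes:
      - ./frontend:/app
  nginx:                       # routes /api/ to DRF and everything else to Vue
    image: nginx:alpine
    ports:
      - "8080:80"
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - backend
      - frontend
```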

Cons:

  • Occasionally, I have to restart the containers (mostly due to some exception I introduced in Django, or when making changes outside what Quasar watches, e.g. ESLint), but that is such a rare occurrence that even if it takes a minute, it's not a serious issue.
  • It does break auto-reloading, but even if it takes one second to hit F5, and I need to do it 20 times a day, the benefits far outweigh the downsides.

If you want to look at one of my dev environments in practice, I have an OSS one here - although it is missing deployment instructions.


Edit: I have fixed the auto-reloading issue in most of my projects; you just have to tell Node what address to use for auto-reloading, which in Quasar you can do in quasar.conf.js with something like devServer: { public: "http://localhost:<host port number>" }, using the port you defined in docker-compose.yml.

Mateusz Bełczowski

Thanks for the detailed answer! I'm also a Python developer and use PyCharm as my main IDE (Neovim for small edits).

PyCharm makes it easy to select a single test class or a single test to run, but under the hood it just calls "docker-compose up python manage.py test path.to.test".
Having to "docker-compose up" it slows the whole process down by at least a few seconds. That doesn't sound like a lot, but it can be inconvenient.

As an alternative, I could do what you suggest - have the project running with docker-compose up and execute a single test by calling "docker-compose exec python manage.py test path.to.test".
It runs much faster, but has the downside that I have to manually type the test name (or the path to it).
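
Concretely, that flow might look like this (assuming the Django service in docker-compose.yml is named web - adjust to your actual service name):

```sh
# Start the stack once and leave it running
docker-compose up -d

# Individual tests then run without the container startup overhead
docker-compose exec web python manage.py test path.to.test
```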

That's why I'm looking for something that combines both solutions when working with PyCharm.

Do you find your workflow with tmux + VS Code convenient? Do you just switch tabs between them, or keep them on separate monitors?

Patryk • Edited

I only have one monitor, but I do a ton of work in the shell, so I use yakuake. I just press F12 on my keyboard, and the terminal slides down from the top of the screen, covering 90% (can easily change it). Run my commands (e.g. git commit -m && git push), press F12 again to hide it, and go back to my browser where I can monitor my Gitlab pipeline. (Or back to VS Code, or whatever I was doing).

I also use KDE plasma with virtual desktops, so I have VS Code on Desktop 2, Chrome on Desktop 1... <ctrl+alt+left> or <ctrl+alt+right> switches to the previous/next desktop, so jumping from VS Code to Chrome is also really fast...

Never tried PyCharm properly, but VS Code is really simple to configure; with a few extensions it works great (linting my Python and JS code).

I do run the test commands manually, but if they are complex enough, I can write an alias or bash script. I usually run Jest in watch mode anyway, so I run the command once when I start working, and often just let it run for days.

Pytest also has at least one watcher, but I haven't tried it. Since my terminal is practically always running, I type the command once, then my flow is:

  • Press F12 to bring up yakuake, press the up arrow to repeat the last command (or a couple of times to re-run an earlier one), and press enter. Press F12 again to hide the terminal. Sometimes I wait for the tests to finish; sometimes I just go do my stuff and look at the results later.
Emmanuel Obogbaimhe • Edited

At my company we use it for local development because it just makes things easy. We work with distributed systems with several different components, so it's a headache to set up and build everything constantly.

So we containerize each component and just start and stop the containers as needed. Plus, it makes onboarding new developers less of a hassle.

Mateusz Bełczowski

That's one of the main benefits for me - it makes the onboarding process much easier.

Frank Font

Onboarding new developers was easy at my previous employer because of this. New hires were up and running - compiling anything and everything - in an hour instead of a week.

That Blair Guy

Amen on the ease of onboarding.

  1. Install Docker
  2. Clone the project from GitHub
  3. Run docker-compose up

And that's it, the entire tech stack is ready to go.

Brad

I'm a TypeScript full-stack developer and tried to learn how to use docker-compose + Docker locally, and just found a lot of pain and misery. I couldn't even see what I would gain besides pinning the Node version everything runs in and being able to "boot up" a database locally easily.

I tried to learn how to use both Docker and docker-compose from scratch and just had a tough time. Most tutorials I found go into a simple hello-world setup, or dive deep into shell scripts and commands for vastly more advanced use-cases.

I ended up giving up on using Docker locally.

Right now, beyond pinning a Node version (I use nvm), I don't see much benefit in using either technology, besides being able to set up a database locally.

I'm sure I'm missing the point, but then as a "dev" that wants as little ops as possible, Docker wasn't good enough for me :'(

That Blair Guy

Avoiding the ops is a major selling point for Docker on the desktop. Once the image (or at the very least, the Dockerfile) is created, there's no need to install anything on the local system.

I used to run VMs so I could install tools (e.g. database servers, web servers, etc.) without screwing up my primary system. With Docker, all the installation is done in a container.

Need to switch back and forth between multiple versions of node? Shut down the old container, and spin up another with the other version and you're ready to go. The time to switch is measurable in minutes and the host system is unaffected.
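
As a sketch of that switch, using the official Node images (the project paths and commands here are placeholders):

```sh
# Run the test suite against Node 10
docker run --rm -it -v "$PWD":/app -w /app node:10 npm test

# Same project against Node 12 - nothing installed or removed on the host
docker run --rm -it -v "$PWD":/app -w /app node:12 npm test
```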

There is a paradigm shift, because you have to figure out which files need to live outside the container and which don't, but it's much lighter-weight than VMs and faster than uninstalling/reinstalling every time you want to update part of your stack.

Ben Sinclair

My company uses an internal tool that wraps docker-compose to provide a bunch of extras, and we've moved everything across to that. Fundamentally, though, you can spin up each project using docker-compose if you want.

It's pretty good.

Because we're stuck using Macs, though, it's quite slow when there are a lot of files in a volume - maybe 10 times slower at anything I/O-bound than on a Linux machine.

Mateusz Bełczowski

Could you share some example "extras" that your internal tool offers? :)

Ben Sinclair

We wrap things in a set of proxies that allow us to use MailHog and local development domain names (nginx-proxy currently, but Traefik also works).

I made a post a while back about extending things:


That means we have one command, and it takes custom subcommands - some global, some project-specific - to do things like:
  • open a browser with you logged into the website as an admin
  • switch branches and rebuild the pattern library
  • migrate data from one environment to another

That sort of thing. Just the housekeeping stuff everyone has to do, but with consistent commands between different projects which might be running different languages or frameworks.

We might open-source our system at some point. That was always the intent, but nowadays there are other products that do the same sort of thing, so we wouldn't be adding anything to the dev community. We're just a little tied up in using our own system.

Kevin Ard • Edited

I exclusively use docker-compose for local dev. It slows nothing down - if anything, it streamlines things, because I don't have to maintain the server environments on my host; Docker takes care of that. All I have locally is nvm, go, and python - just because I use those for dogfooding and other general tooling.

I use a lot of bind mounts to keep my local project in sync, and that really takes the edge off. To help that along, I've set my local uid to 1000 and I have dev images that set the container uid to the same (translation: no permissions trouble 🦄)
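
A minimal sketch of that uid trick (the base image, user name, and paths are assumptions):

```dockerfile
FROM alpine:3.10
ARG UID=1000
# Create a container user whose uid matches the host user, so files
# written to bind mounts are owned by you rather than root.
RUN adduser -D -u ${UID} dev
USER dev
WORKDIR /home/dev/app
```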

DC lets me do other fun things to make life good. My favorite is stacking yaml for different tasks. Base yaml + dev yaml exposes extra ports or adds dev-only tooling services; + test yaml changes db data volumes to controlled alternates.
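
The stacking works by passing multiple -f files to docker-compose; later files extend or override earlier ones. Something like this, with the file names assumed:

```sh
# Base + dev: extra ports and dev-only tooling services
docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d

# Base + test: db data volumes swapped for controlled alternates
docker-compose -f docker-compose.yml -f docker-compose.test.yml up -d
```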

Using rsync -a I can mirror data volumes - basically the db equivalent of git stash ♥️

With the right mix of Ansible and Make, this is all no-effort. I can remember the days before docker, but I don't like to lol. Lots of things that are no-effort now were impractical or outright impossible then.

Oh, another bonus: I have a single global/privileged stack running that has Traefik and Portainer. All running project stacks are published through Traefik, and Portainer is just nice to have around. No port conflicts, and life is easy ♥️

Mateusz Bełczowski

I haven't heard about Portainer before - thanks for sharing :)

w

I have been full-blown Docker for local development (consultant, 3-8 person dev teams) and overall it works. Some things aren't the most straightforward, but being able to docker-compose up and have an entire dev env just work is worth it.
One thing I am starting to look into is Google's new Skaffold project, which takes a more Kubernetes-style approach but claims to make that technology more accessible to devs.

Ryan Manzer

I use docker and docker-compose for local development all the time.

I'm usually doing a web project, and I've got a few docker-compose.yml templates that outline the tech stacks I commonly use. That way I can easily spin up isolated environments with control over the versions of all stack components.

I find it useful but I might be consuming more resources than necessary.

Andrew Miller

We have a Jekyll-based thing and there are at least three of us who work with it. Rather than deal with the hassles of installing Ruby and gems etc., I've set up a very simple Docker Compose configuration that lets anyone develop (or review a PR) locally with a single command. It's been an absolute godsend. This is far from all the magic that Docker can offer for local development, but the elegance and simplicity of docker-compose up can't be beat.
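
Such a configuration can be tiny - roughly this, as a sketch using the official jekyll/jekyll image (the version tag is an assumption):

```yaml
version: "3"
services:
  site:
    image: jekyll/jekyll:4.0
    command: jekyll serve --watch --force_polling
    volumes:
      - .:/srv/jekyll          # site source, live-edited from the host
    ports:
      - "4000:4000"            # browse at http://localhost:4000
```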

Rakesh kv • Edited

We have Docker only for the dev and production builds. If something goes wrong in the dev build, we try to reproduce and fix it in our local build, then build in dev again.
Docker on the local machine is painful - not all devs are familiar with it. Using Docker locally can be overwhelming, and it's not necessary.

Rahul Sethuram

Yep! We have a really cool devops framework that one of our team members built; it consists of makefiles and uses docker-compose in dev and prod. Our project is fully open source: github.com/ConnextProject/indra!

Andrei Dascalu

I never understood people saying that the setup is more complicated. It's always the exact same setup you'd make locally, but you script it in a Dockerfile so that it can be shared across developers on the project and environment parity is ensured. Your container can inherit from production (or the other way around, depending on what you're trying to achieve).
For my PHP projects, I never had issues with real-time changes when using docker-compose. The setup is simple: base PHP image + extensions (and some optimizations) for prod; the prod image is inherited by a dev image (xdebug and whatnot). Since I work across several projects (and advise on some), I switch environments daily.
A simple "docker-compose up -d" and I have project A (with PHP 7.1, MySQL and whatever). 10 minutes later I may be on project B (with PHP 7.3, MariaDB, Rabbit and whatnot). Doing the same in a local environment? Are you kidding?
And for Golang, I don't build containers. I have containers for some dependencies, using local ports, but I build locally (still using Docker though: a basic Alpine image with Go, simply building for the Darwin architecture).
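
The prod-image-inherited-by-dev setup described above maps naturally onto a multi-stage Dockerfile; a sketch, with the PHP version and extensions chosen for illustration:

```dockerfile
# Prod: base PHP image + extensions (and some optimizations)
FROM php:7.3-fpm-alpine AS prod
RUN docker-php-ext-install pdo_mysql opcache

# Dev inherits prod and adds debugging tooling (xdebug and whatnot)
FROM prod AS dev
RUN apk add --no-cache $PHPIZE_DEPS \
 && pecl install xdebug \
 && docker-php-ext-enable xdebug
```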

Nguyen Kim Son

In my team we started by going "full" Docker: the whole local environment could be run from one big docker-compose file. The host code repository is mounted in Docker so we can use the IDE comfortably.

A year or two later we decided to switch to a somewhat "hybrid" setup, with only the complicated-to-set-up components running inside Docker. The main reason is performance and battery: file syncing between macOS and Docker has always been a weak point, and Docker consumes battery like crazy (compared to the same code running on the host). Even if we usually have the power plugged in, that's not great for the environment generally 😕.

Mateusz Bełczowski

Yes, I feel that pain, so I'm also thinking of using the hybrid approach. I want to "docker-compose up" things like databases, queues, and services that don't require frequent code changes, and leave the rest on the host for simplicity :)

Kartika Prasad

I think the biggest advantage of tools like Docker is that infrastructure changes are passed down to the dev in what appears to be a seamless way. Say someone on your team upgraded the Java version on the server you were using: you wouldn't notice, but if you were using removed functionality, some of your server code would no longer work. Docker surfaces that breakage locally, which saves you deploying code to higher envs and finding the issues there.

Daniel Hoek

The issues you list are interesting: it sounds like while using Docker you're making a change, rebuilding the container, then rerunning it and running the tests, which would take ages for sure!

I use it as a tool to standardise the "environment" to develop in, so it has the same OS as the prod build, same dependencies, same install script, same build script, etc., but once the container is up and running I work inside it, changing code and hot-reloading without any slowdown compared to simple local development. It's not going to be an exact replica of a full build, but it's close enough; I just have the CI/CD take care of a proper build/test/deploy cycle.

CaptDragon

I typically have a dev docker-compose YAML. I mount a local volume to get live edits without having to rebuild the image from the Dockerfile every time. It works great and beats having to install and configure my local machine x number of ways.
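
A sketch of such a dev YAML, with placeholder names - the bind mount means source edits reach the running container with no image rebuild:

```yaml
# Hypothetical docker-compose.dev.yml
version: "3"
services:
  app:
    build: .
    volumes:
      - ./src:/usr/src/app/src   # live edits: host changes appear in the container
    ports:
      - "3000:3000"
```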

Mateusz Bełczowski • Edited

What editor/IDE are you using?

Michael J. Ryan

Personally, I like it... I can have a software stack running in the background while I work on one piece locally... Spinning databases and various configurations up and down is much faster.

Mateusz Bełczowski

What about things like the module/service/app that you're actively developing? I agree, using it for databases, queues, etc. is very useful :)

Niklas

I use Docker to set up quick databases for development and testing. I mostly use docker-compose.yml files to set them up properly, so I don't have to search for the right docker command with all the flags when I come back to development a couple of weeks later. Just start the docker-compose file and here we go.

Tell me if I'm wrong, but isn't it possible to mount a directory into the container that also updates inside the container when changes are made to it outside, on the local machine?

Mateusz Bełczowski

Yes, it's totally possible, but not always as easy as it could be (at least in my opinion) :)
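
For the record, the mount itself is a single flag - for example, with a throwaway container (image and paths are arbitrary):

```sh
# The host's current directory appears as /work inside the container;
# edits on either side are immediately visible to the other.
docker run --rm -it -v "$PWD":/work -w /work alpine sh
```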

PatricNox

I use docker for the following reason and that only:

Development on a cloned production server.

Same libraries, php extensions, OS behavior, Database server, versions, etc.

I want to eliminate the possibility of a server difference ever being the cause of a bug.

daniel anggrianto

If you are a single developer on a single project (or ones with a similar setup), it will be more work to set up Docker. I have been using Docker heavily as a QA engineer, since I need to pull different services to work with, but on some of my personal projects I don't really use it.

Frank Font

Docker on Linux developer desktop? It's a terrific experience and not slowed at all.

On Mac? Yeah, the overhead kicks in hard. Depending on the tooling, it might still be worth it.

Windows? Unless you're using VS Code's remote container integration, it's possibly not worth the trouble.

If you are on Linux or Mac and like command line control of docker, check out github.com/bah-insignia/zcmd

Rémy 🤖 • Edited

Every time I want to try Docker I'm reminded how much more complicated it is in comparison to a Python virtual environment/npm install.

CaptDragon

It all seemed complicated for me at first as well, then it became easy as I "got it" and the value was realized.

Rémy 🤖

It's just a wrapper around cgroups; I had the idea of Docker before it existed. But my point is that you can't really use it for local development, since the files are owned by root (or else you need complicated scripts), and you can't really use it in production, because on its own it's just a process launcher: what you really need is Kubernetes, and Kubernetes is a fucking complexity hole. It also makes the IDE setup very complicated and adds a lot of moving parts.

I have neat Ansible scripts that do everything for me, I work with simple VMs and all is nice.

If I ever use Docker it'll be to specify my runtime requirements to a PaaS, but for local development there is really no point.

Although that's simply my experience, what value did you find?

Mateusz Bełczowski

Yes, sometimes it's just not worth the effort (at least for me)

Gergely Polonkai

Vagrant can solve your problems (and can use Docker under the hood).

I've been using Vagrant for a while, because

  • with it I can easily run only part of the environment
  • the work tree files can be mounted in all Vagrant boxes (VMs or containers), so I can change things instantly
  • with some app-settings magic I can use either a box for the database, or just the DB running directly on my machine.

And the list can go on. It has some overhead if you use it only for development, though. We use the same Vagrant config for integration tests, too.

Brijesh Srivastava

Using volumes in Docker makes life easier and enables features like devtoolbox.

Atila Augusto 🛡️

I tried using Docker, but my disk space always ended up full, so I stopped.
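
For what it's worth, Docker ships cleanup commands that reclaim that space:

```sh
# Remove stopped containers, unused networks, dangling images, build cache
docker system prune

# More aggressive: also remove all unused images and volumes (destructive!)
docker system prune -a --volumes
```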