We all need data.
Especially when running code against local environments that, in production, would be talking to a back-end database. While building some local Dockerized applications (Tomcat and MySQL), I wanted my UI to show some data.
This was partly to confirm everything was up and running, since our main page queries the database to display some information on the initial page: data that lives in the back end, or in this case the MySQL container. I thought about adding startup scripts, or maybe setting up a MySQL database that was already loaded, but then I found out about Docker's docker-entrypoint-initdb.d and that solved my problem easily.
COPY deployment/docker/mysql/very-small-db.sql /docker-entrypoint-initdb.d/
This copies in my database dump, which I kept small (just my schema and maybe a couple of days' worth of data), and MySQL initializes the database with it when the container starts for the first time. This way I can fiddle with the dataset as I want, rebuild the container, and be on my way.
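For context, a minimal Dockerfile around that `COPY` line might look roughly like this. The image tag is an assumption, not the article's actual setup; the key behavior is that the official mysql image runs any `*.sql`, `*.sql.gz`, or `*.sh` files found in `/docker-entrypoint-initdb.d` the first time it initializes an empty data directory:

```dockerfile
# Sketch only: mysql:8.0 is an assumed tag, not necessarily the one used here.
FROM mysql:8.0

# Files in this directory are executed (in alphabetical order) by the
# image's entrypoint on first startup, after the server is initialized.
COPY deployment/docker/mysql/very-small-db.sql /docker-entrypoint-initdb.d/
```

Note that the init scripts only run when the data directory is empty; if the container reuses an existing volume, the dump is not re-imported, so rebuilding with a fresh volume is what picks up changes to the dataset.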
The great part about using tools is suddenly discovering a new feature, or a new way to use them.
Top comments (5)
For larger datasets, container launch will be really slow though. What we did was use a multi-stage Docker image: the first stage imports the SQL files and launches the database, which loads them (slow); the second stage then copies the data directory into a fresh MySQL image.
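A hedged sketch of that multi-stage idea, under two assumptions about the official mysql image: its entrypoint script runs the init scripts against a temporary server before `exec`-ing the real one, and `/var/lib/mysql` is declared as a `VOLUME`, so baked-in data must live at a different path. All names, tags, and the password below are illustrative:

```dockerfile
# Stage 1: import the dump at build time.
FROM mysql:8.0 AS loader
ENV MYSQL_ROOT_PASSWORD=local-only-password
COPY big-dump.sql /docker-entrypoint-initdb.d/

# Patch the entrypoint so it initializes the database and runs the init
# scripts, but exits instead of exec-ing the long-running server.
RUN sed -i 's/exec "$@"/echo "initialization done"/' /usr/local/bin/docker-entrypoint.sh

# Use a non-VOLUME datadir so the imported data survives in the image layer.
RUN /usr/local/bin/docker-entrypoint.sh mysqld --datadir=/initialized-db

# Stage 2: start from a clean image and copy in the pre-loaded data.
FROM mysql:8.0
COPY --from=loader /initialized-db /var/lib/mysql
```

The slow import then happens once, at `docker build` time, and every `docker run` of the final image starts with the data already in place.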
I haven't done a large set; the full database of what I could build is around 2 GB, which isn't large but is too much to move around in a repo. I can imagine how long that would take in Docker; with MySQL Workbench it takes 20 minutes to complete. I'll have to keep your method in mind if I ever have to do a full restore.
It should help with some custom automated functional tests as well.
I wonder which other database images have this feature.
This is sort of what I use it for, but it's also good for at least some exploratory testing when checking builds.
Wow, this is so close to what I just posted. I'll be reading all the comments, now and in the future.