Ashiqur Rahman
Setting up the database - Dockerizing Django for deploying anywhere !

When it comes to setting up the database for your Django app, there are quite a number of options. For starters, you need to choose which database you are going to use. Django comes with sqlite3 by default, and for some very small to mid scale applications sqlite3 might just do it. So, I tried to integrate sqlite3 with docker first. But sqlite3 isn't a real database server, is it? :3 I could not find a decent way to do it, which is why (among many other reasons) I leave out sqlite3 when I'm thinking about deployment. My preferred options are either MySQL or PostgreSQL. In this tutorial, I will demonstrate setting up both of them.
The second thing you have to consider is choosing between a managed database service (like AWS RDS) and deploying the database on your own. There are pros and cons to each option. If you want someone else to take care of most things (like configuring, maintaining, and securing) for you, just go with RDS. On the other hand, if you want more control and flexibility, you can set up your database yourself, say inside an EC2 instance. If you want to go with RDS, setting it up is really easy. Just Check This Out. In this tutorial though, I wanted to explore something different, so I will show you how to set up the database on your own and, of course, dockerize stuff to make things nicer as always 0:)

Updating Django App Dockerfile

Basically, we need to install some database client libraries in our debian container.
For,

  • Postgres: libpq-dev (only needed if you build psycopg2 from source; the psycopg2-binary wheel bundles its own libpq)
  • MySQL: default-libmysqlclient-dev

So, we need to update this line in our Django App Dockerfile like so:


RUN apt-get install -y --no-install-recommends gcc libc-dev python3-dev [ default-libmysqlclient-dev or libpq-dev ]



So our Dockerfile looks something like:



FROM python:3.8-slim-buster

ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PATH="/scripts:${PATH}"

RUN pip install --upgrade pip
COPY ./requirements.txt /requirements.txt
# packages required for setting up WSGI
RUN apt-get update
RUN apt-get install -y --no-install-recommends gcc libc-dev python3-dev default-libmysqlclient-dev

RUN pip install -r /requirements.txt

RUN mkdir /app
COPY ./src /app
WORKDIR /app
COPY ./scripts /scripts

RUN chmod +x /scripts/*

# folder to serve media files by nginx
RUN mkdir -p /vol/web/media
# folder to serve static files by nginx
RUN mkdir -p /vol/web/static

# always good to run our source code as a user other than root
RUN useradd user
RUN chown -R user:user /vol
# chmod 755 means full access for the owner and read/execute access for everyone else
RUN chmod -R 755 /vol/web
RUN chown -R user:user /app
RUN chmod -R 755 /app
# switch to our user
USER user

CMD ["entrypoint.sh"]


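The Dockerfile ends with CMD ["entrypoint.sh"] (found via the /scripts entry on PATH). For reference, here is a minimal sketch of what such an entrypoint could look like; the wait-for-db loop, the uwsgi options and the module name are my own assumptions, not taken from the earlier tutorial, so adjust them to your setup:

```shell
#!/bin/sh
set -e

# depends_on only waits for the db *container* to start, not for the
# database inside it to accept connections -- so poll the port first.
# (host 'db' and port 3306 are assumptions; use 5432 for postgres)
until python -c "import socket; socket.create_connection(('db', 3306), timeout=2)" 2>/dev/null; do
    echo "waiting for the database..."
    sleep 1
done

# apply migrations and collect static files into the nginx volume
python manage.py migrate
python manage.py collectstatic --noinput

# hand off to the WSGI server (assumed module name)
uwsgi --socket :8000 --master --enable-threads --module app.wsgi
```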

Updating docker-compose

Now, we need to create a new service for our db in our docker-compose. Like we did for our static data, we need to persist our database beyond the life of our container, so we will use a volume for the db as well. Let's call it production_db_volume. We also need to choose the docker image for our service. In this tutorial, I chose a tag that is relatively recent but also has long term support.
You can find the available image tags on Docker Hub.

So, our docker-compose becomes something like:



version: '3.7'

services:

  db:
    image: mysql:5.7
    ports:
      - "3306:3306"
    restart: always
    volumes:
      - production_db_volume:/var/lib/mysql
    env_file:
      - .live.env

  # or
  #  db:
  #    image: postgres:12.5
  #    ports:
  #      - "5432:5432"
  #    restart: always
  #    volumes:
  #      - production_db_volume:/var/lib/postgresql/data/
  #    env_file:
  #      - .live.env


  app:
    build:
      context: .
    ports:
      - "8000:8000"
    volumes:
      - production_static_data:/vol/web
    restart: always
    env_file:
      - .live.env
    depends_on:
      - db

  proxy:
    build:
      context: ./proxy
    volumes:
      - production_static_data:/vol/static
    restart: always
    ports:
      - "80:80"
    depends_on:
      - app

volumes:
  production_static_data:
  production_db_volume:




Notice that we have pointed our db service at the .live.env file. Our db containers need some environment variables to initialize properly; we provide them inside .live.env.

MySql Environment Variables



MYSQL_DATABASE=blah_blah_bleh
MYSQL_ROOT_PASSWORD=blah_blah_bleh
MYSQL_USER=blah_blah_bleh
MYSQL_PASSWORD=blah_blah_bleh



Postgres Environment Variables



POSTGRES_USER=blah_blah_bleh
POSTGRES_PASSWORD=blah_blah_bleh
POSTGRES_DB=blah_blah_bleh



Notice that we are re-using the same .live.env file we used for our 'app' service. This is because we will need the db environment variables in our django app's settings.py as well.

Updating settings.py



# for mysql
DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': os.environ.get("MYSQL_DATABASE"),
            'USER': os.environ.get("MYSQL_USER"),
            'PASSWORD': os.environ.get("MYSQL_PASSWORD"),
            'HOST': 'db',  # docker-compose service name 'db' resolves to host name 'db'
            'PORT': 3306
        }
    }
# or, for postgres
DATABASES = {
    "default": {
        "ENGINE": 'django.db.backends.postgresql',
        "NAME": os.environ.get("POSTGRES_DB"),
        "USER": os.environ.get("POSTGRES_USER"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD"),
        "HOST": 'db',
        "PORT": 5432,
    }
}



You get the idea!
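If you want a single settings.py that works with either engine, you can build the config from the environment. A small sketch — the DB_ENGINE switch variable here is my own invention, not something Django or this tutorial defines:

```python
import os

def database_config():
    """Build Django's DATABASES['default'] dict from environment
    variables, choosing mysql or postgres via DB_ENGINE."""
    if os.environ.get("DB_ENGINE", "postgres") == "mysql":
        return {
            "ENGINE": "django.db.backends.mysql",
            "NAME": os.environ.get("MYSQL_DATABASE"),
            "USER": os.environ.get("MYSQL_USER"),
            "PASSWORD": os.environ.get("MYSQL_PASSWORD"),
            "HOST": "db",  # docker-compose service name resolves as hostname
            "PORT": 3306,
        }
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB"),
        "USER": os.environ.get("POSTGRES_USER"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD"),
        "HOST": "db",
        "PORT": 5432,
    }

DATABASES = {"default": database_config()}
```

You would then add DB_ENGINE=mysql (or postgres) to .live.env alongside the credentials.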

Finally, we need to update our django app's requirements.txt.

Updating requirements.txt



# for postgres
psycopg2-binary==2.8.5 



or



# for mysql
mysqlclient==1.4.6
django-mysql==3.8.1



Now, bring down the existing containers and their volumes (note: -v also deletes the database volume, so only do this if you are okay with losing that data):
docker-compose down -v
Then a simple docker-compose up -d --build should set up your db container and run the migrations (since we put the manage.py migrate command in our docker entrypoint in the earlier tutorial).
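To sanity-check that the database came up and the migrations ran, you can tail the logs and open a client shell inside the db container. These commands assume the service names and env variables from above:

```shell
# rebuild and start everything in the background
docker-compose up -d --build

# watch the app container run its migrations on startup
docker-compose logs -f app

# open a mysql shell inside the db container
# (for postgres: docker-compose exec db psql -U "$POSTGRES_USER" "$POSTGRES_DB")
docker-compose exec db mysql -u root -p
```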
And that's it for the database setup. In the next tutorial we will work on securing our deployment with SSL, certbot and other django security checks.
