In this step-by-step article, you will learn how to set up a local development environment for Django with a PostgreSQL service using Docker.
What is Docker?
Docker is a way to create an isolated environment for your software. With the growing number of variables around a piece of software, such as operating systems, versions, and dependencies, there is a need for virtualization so that the software can run on multiple systems without any problem. This is where Docker comes in: Docker virtualizes the application layer of the OS so that every application can run in its own environment. That virtual environment is referred to as a Docker container. With Docker we only need a configuration file (a Dockerfile) and, voila, you can launch your software on any system; the packaged result it describes is called a Docker image.
Why Docker?
But why do I need Docker to create a virtual environment? Python already has venv and pipenv. The answer is that these tools can only create environments for Python packages, not for non-Python software such as PostgreSQL. In this tutorial, we will still use pipenv to track our Python dependencies.
Docker: Installation
Docker is available for Windows, Mac and Linux; you can download it from the official site. After downloading and installing the setup file, confirm your installation by running docker --version
in your terminal/command prompt, which will output something like this:
Docker version 20.10.12, build e91ed57
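Since we will also use Docker Compose later in this article, it is worth checking that it is available too. Depending on how Docker was installed, Compose ships either as the standalone docker-compose binary or as the docker compose plugin, so one of the following should work:
docker-compose --version
docker compose version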
Setting up Python Project
After installing Docker, create a virtual environment for your Python packages using pipenv, which we will use to track and package our Python dependencies. Install pipenv by running the command below:
pip install pipenv
Create a new directory anywhere on your PC, set up your Python project there, and open a command prompt in that folder. In the example below, we have created a folder named test
and moved our command line into it.
Run the following command to create a virtual environment and install django
using pipenv:
pipenv install django
You should see a Pipfile
and a Pipfile.lock file in the directory. To enter this virtual environment, simply run pipenv shell
. Your prompt should then change: the (test-xxxxxxx)
prefix shows that you are in the virtual environment.
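The exact prefix depends on your folder name and a hash that pipenv generates, so yours will differ, but it will look roughly like this (the suffix here is made up):
(test-AbCd1234) D:\WORK\PY\TEST>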
Now run django-admin startproject newproj .
to create a new Django project in this environment. Note the dot (.) at the end of the command, which tells Django to create the project in the current directory rather than in a nested folder.
Now run python manage.py startapp newapp
to create a new Django app in this project. In order for this application to be discoverable in your project, you must add it to the INSTALLED_APPS setting in your newproj/settings.py file.
...
# Application definition
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'newapp',
]
...
Before jumping into Docker, make sure that everything is working by running python manage.py runserver
and going to http://127.0.0.1:8000/.
If the Django welcome page opens up, your installation is working.
This is what our project looks like at this point:
D:\WORK\PY\TEST
│ db.sqlite3
│ manage.py
│ Pipfile
│ Pipfile.lock
│
├───newapp
│ │ admin.py
│ │ apps.py
│ │ models.py
│ │ tests.py
│ │ views.py
│ │ __init__.py
│ │
│ └───migrations
│ __init__.py
│
└───newproj
asgi.py
settings.py
urls.py
wsgi.py
__init__.py
At this point, you're ready to set up your Docker image, but before that, make sure you're NOT in the pipenv environment. Run exit
to exit the environment.
Set Up Docker
What is a Docker Image:
A Docker image is like a blueprint: it tells Docker how to build an application, where its dependencies are located, and which commands to run. The build instructions live in a Dockerfile. Create a file named "Dockerfile" in the same directory where your Pipfile and Pipfile.lock are located and add the following to it:
FROM python:3.9
ENV PYTHONDONTWRITEBYTECODE 1
WORKDIR /project
COPY Pipfile Pipfile.lock /project/
RUN pip install pipenv && pipenv install --system
COPY . /project/
This is our Dockerfile. Let's go through it line by line and see what each instruction means.
FROM python:3.9
> The FROM instruction pulls the base image from Docker Hub. Since we are creating a Django project, we need a base Python image, as Django is built on top of Python. (If you were running something like Jenkins, which is written in Java, you would start from openjdk instead.) Replace 3.9 with another tag to use a different Python version.
ENV PYTHONDONTWRITEBYTECODE 1
> An environment variable that tells Python not to write bytecode cache files (.pyc), which are usually stored in the __pycache__
folder.
WORKDIR /project
> WORKDIR sets the working directory inside the container; this is where the project files will live.
COPY Pipfile Pipfile.lock /project/
> The project's dependencies are listed in the Pipfile and pinned in Pipfile.lock, so we copy both into the working directory (/project/ in this case).
RUN pip install pipenv && pipenv install --system
> Installs pipenv
and then uses it to install all dependencies inside the container; the --system flag installs them into the container's Python instead of creating yet another virtual environment.
COPY . /project/
> Copies the rest of the source code into the working directory /project/.
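One optional line you will often see next to PYTHONDONTWRITEBYTECODE (it is not part of the Dockerfile above, so add it only if you want it) disables Python's output buffering so that print output and Django logs show up immediately in the container logs:
ENV PYTHONUNBUFFERED 1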
Now it is time to build our Docker image from the Dockerfile. In the terminal, run the following command:
docker build .
Here's what Docker does when you run the above command:
- Creates a Linux environment
- Pulls the Python 3.9 base image from Docker Hub
- Installs all the dependencies into that environment
- Copies all the code from your directory into the container's filesystem
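If you want to give the image a recognizable name, you can pass a tag with -t; the name django-docker below is just an example and nothing later in this tutorial depends on it. Listing your local images afterwards confirms that the build succeeded.
docker build -t django-docker .
docker image ls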
To check this, VS Code provides a very useful Docker extension that lets you inspect the Linux directory structure of a running Docker container.
You have now created your Docker image. Now we need a docker-compose.yml
file to control how the container is started.
Make a new file named docker-compose.yml
and write the following:
version: '3.8'
services:
  web:
    build: .
    command: python /project/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/project
    ports:
      - 8000:8000
Let's analyze the above file line by line. The first line specifies the Compose file format version we want to use.
Next, we define the services we want to run. In this case there is only one, the web service. build: . tells Compose to build the image from the Dockerfile in the current directory (.), and command starts the Django development server inside the container. volumes is Docker's data-persistence mechanism; here it keeps the container's /project directory synchronized with the local directory. ports maps port 8000 of the container to port 8000 on your machine.
That is the docker-compose.yml file done. Now run the docker-compose up
command to start the service.
Go to http://127.0.0.1:8000/. If the page loads, congratulations, you just started your first Docker container.
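You can also confirm from the terminal that the service is running:
docker-compose ps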
Using the VS Code Docker extension, you can now inspect your project files inside the running Linux container.
Adding PostgreSQL
Django comes with a built-in SQLite database, which is not recommended for production applications. For production you need a more powerful database such as PostgreSQL, and that is exactly what we will set up in this part.
To use PostgreSQL, you need to:
- Install psycopg2 so that Python can talk to PostgreSQL.
- Create a PostgreSQL service in Docker with the required environment variables.
- Use volumes to persist data even if the service is stopped.
- Edit the settings.py file in your Django project to change the DATABASES setting.
So, let's begin. To install psycopg2, first stop the Docker container using the docker-compose down
command.
To install psycopg2, the PostgreSQL adapter for Python, run pipenv install psycopg2
. We install the adapter with pipenv rather than pip so that it is recorded in Pipfile.lock, which is what Docker uses to install all dependencies inside the container.
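If this install fails on your machine (building psycopg2 from source requires the PostgreSQL client headers), the pre-built psycopg2-binary package is a commonly used drop-in alternative for local development; this is only an option, the rest of the tutorial does not depend on it.
pipenv install psycopg2-binary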
After the installation is complete, you need to create a service for PostgreSQL in the docker-compose.yml
file. Open the file and type:
version: '3.8'
services:
  web:
    build: .
    command: python /project/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/project
    ports:
      - 8000:8000
    depends_on:
      - db
  db:
    image: postgres
    environment:
      POSTGRES_PASSWORD: postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data/
volumes:
  postgres_data:
Here is what we have done in the above file:
- We created a new service called db, which pulls the official PostgreSQL image from Docker Hub.
- We set the POSTGRES_PASSWORD environment variable, which we will reuse in our settings.py file.
- The volumes entry inside the db service is the location where PostgreSQL stores its data.
- The top-level volumes key (outside the db service indentation) tells Docker to persist the postgres_data
volume locally even when we stop our Docker container.
- We also changed the web service by adding depends_on
, which tells Docker to start the db service before the web service.
Now the only thing left to do is to tell our Django project that we have a PostgreSQL db service and that we want to use it instead of the SQLite database.
Open the newproj/settings.py file and make the following changes to the DATABASES setting (around line 77):
....
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
....
First we change ENGINE to 'django.db.backends.postgresql'. NAME and USER are both postgres, the defaults for the official postgres image (if you want different values, set POSTGRES_DB and POSTGRES_USER in docker-compose.yml to match). PASSWORD is the password we defined in docker-compose.yml
, HOST is the name of our database service, db, and PORT is 5432, PostgreSQL's default port.
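Hard-coding credentials is fine for a local setup, but a common refinement is to read them from environment variables instead. Here is a minimal sketch; the DB_* variable names are made up for this example and would only exist if you add them to an environment section of the web service in docker-compose.yml:
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        # Fall back to the same values we hard-coded above
        'NAME': os.environ.get('DB_NAME', 'postgres'),
        'USER': os.environ.get('DB_USER', 'postgres'),
        'PASSWORD': os.environ.get('DB_PASSWORD', 'postgres'),
        'HOST': os.environ.get('DB_HOST', 'db'),
        'PORT': os.environ.get('DB_PORT', '5432'),
    }
}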
All done. Now let's start our Docker containers using the following command:
docker-compose up -d --build
Note that we use the -d flag, which stands for detached mode; it frees up the terminal so we can run further commands in it. The --build flag is used because we changed our Pipfile, so we are telling Docker to rebuild the image before starting the containers.
After running the above command, go to http://127.0.0.1:8000/. If the page opens, your db and web services are working.
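If the page does not load, the service logs are usually the first place to look:
docker-compose logs db
docker-compose logs web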
Executing Commands in Docker
You may have already noticed that you can no longer run commands like migrate and createsuperuser from your local command line; they fail with an error along the lines of "Could not resolve hostname 'db' to address: unknown host".
That is because the db hostname only resolves inside the Docker network, so all database-related commands must be run inside the web container.
Migrate Database
In the same terminal where you ran the previous command to start the containers, run the following:
docker-compose exec web python manage.py migrate
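Later on, whenever you add or change models in newapp, you will create the migration files the same way, inside the container:
docker-compose exec web python manage.py makemigrations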
Create Super User
To test that our database is working correctly, let's create a superuser and log in to the Django admin from localhost:
docker-compose exec web python manage.py createsuperuser
Type a username, email and password, then go to the Django admin at http://127.0.0.1:8000/admin/ to check that the admin site is working.
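If you want to double-check that the data really landed in PostgreSQL rather than SQLite, you can run psql inside the db container and list the tables; the auth_* and django_* tables should be there.
docker-compose exec db psql -U postgres -c "\dt"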
If the admin page opens up, then congratulations, you have successfully created a Django/PostgreSQL development environment using Docker.
Conclusion
Docker is far more vast than this; if you want to learn about Docker in detail, I would recommend the following resources.
GitHub Code Repository for the above article is available here.
If you liked the article then also consider checking me out on Twitter where I post stuff like this in under 280 characters daily.
If you are facing any problems following this article, please feel free to ask in the comments or on Twitter.