I have watched a lot of tutorials about configuring Celery with Django, but I was always running into errors.
Today I am going to show you an easy way to add Celery and Redis to your Django project.
This tutorial is based on one of my projects available on GitHub, MyFridge.
About Celery and Redis
Celery
Celery is a distributed task queue system that allows you to run time-consuming tasks asynchronously. With Celery, you can decouple long-running or resource-intensive processes from your main application, making your application more responsive and efficient. Celery supports various message brokers like Redis, RabbitMQ, and more, but in this article, we'll focus on using Celery with Redis.
Redis
Redis is an open-source, in-memory data structure store that can be used as a message broker for Celery. It excels at handling high-throughput, low-latency tasks and provides persistent data storage. In the context of Celery, Redis acts as a message broker, passing messages between the main application and the worker processes that execute tasks in the background.
Final project structure (this structure includes other content; we will focus on only a few elements)
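Here is a simplified sketch of just the parts this tutorial touches, assuming the project is called 'myfridge' (everything else in the repository is omitted):
.
├── config/
│   └── django/
│       └── Dockerfile
├── myfridge/              # your Django project directory
│   ├── manage.py
│   └── myfridge/
│       ├── __init__.py
│       ├── celery.py
│       └── settings.py
├── data/
│   └── db/                # PostgreSQL data (created by Docker)
├── .env
├── docker-compose.yml
└── requirements.txt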
As you can see, my Django project is called 'myfridge', but yours will probably have a different name, so you need to change every 'myfridge' to your project name inside these files:
- celery.py
- Dockerfile
- docker-compose.yml
Installation and configuration
- Install packages
First, you need to install the celery and redis packages:
pip install celery redis
Add these packages to requirements.txt:
pip freeze > requirements.txt
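After freezing, requirements.txt should contain pinned entries for both packages. The exact versions will depend on when you install them, but it will look something like this:
celery==5.3.6
redis==5.0.1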
- Create a celery.py file inside your Django project (the same place where settings.py is)
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myfridge.settings") # Change 'myfridge' to your project name
app = Celery("myfridge") # Change 'myfridge' to your project name
app.config_from_object(settings, namespace="CELERY")
app.autodiscover_tasks()
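Optionally, you can add a small debug task at the bottom of celery.py (a common pattern from the Celery documentation) to verify later that the worker picks up tasks:
@app.task(bind=True)
def debug_task(self):
    # Prints the incoming task request; handy for checking that the worker is wired up
    print(f"Request: {self.request!r}")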
- Configure __init__.py. In the same directory where you created celery.py there is a file called '__init__.py'. Paste this code into it so that the Celery app is loaded whenever Django starts:
__init__.py
from .celery import app as celery_app
__all__ = ("celery_app",)
- Edit settings.py. Inside the settings.py file you need to configure the Celery broker, the result backend, and the serialization settings. For me, it looks like this:
settings.py
CELERY_BROKER_URL = "redis://redis:6379/0"
CELERY_RESULT_BACKEND = "redis://redis:6379/0"
CELERY_ACCEPT_CONTENT = ["application/json"]
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
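The hostname 'redis' in these URLs refers to the Redis service we will define in docker-compose.yml below. If you ever run the project without Docker, with Redis installed locally, you would point at localhost instead:
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_RESULT_BACKEND = "redis://localhost:6379/0"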
- Docker configuration. Now, to run Celery, Redis, and Django together, we will use Docker.
First of all, in your main directory create a 'config' directory, inside it create a directory called 'django', and inside that create a file called Dockerfile (note that the 'D' in 'Dockerfile' must be uppercase).
Dockerfile
FROM python:3.11
ENV PYTHONUNBUFFERED 1
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY ./myfridge /app/ # Change 'myfridge' to your project name
Then, inside the main directory (where the config directory and your Django project are), create a file called docker-compose.yml.
In my project it's called docker-compose.dev.yml, but you can simply name it docker-compose.yml.
docker-compose.yml
version: "3"

services:
  db:
    hostname: db
    image: postgres:15
    container_name: myfridge-db-dev
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=myfridge
      - POSTGRES_USER=myfridge
      - POSTGRES_PASSWORD=myfridge123
    ports:
      - "5432:5432"

  web:
    hostname: web
    build:
      context: .
      dockerfile: ./config/django/Dockerfile
    container_name: myfridge-web-dev # Change 'myfridge' to your project name
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./myfridge:/app # Change 'myfridge' to your project name
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis
      - celery
    env_file:
      - .env

  redis:
    image: redis:latest
    container_name: myfridge-redis-dev # Change 'myfridge' to your project name
    ports:
      - "6379:6379"

  celery:
    build:
      context: .
      dockerfile: ./config/django/Dockerfile
    container_name: myfridge-celery-dev # Change 'myfridge' to your project name
    command: celery -A myfridge.celery worker -l info # Change 'myfridge' to your project name
    volumes:
      - ./myfridge:/app/ # Change 'myfridge' to your project name
    depends_on:
      - db
      - redis
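The web service reads its environment variables from the .env file referenced above. Its exact contents depend on what your settings.py reads from the environment; as an illustration only (these variable names are an assumption, match them to your own settings), it could look like this:
# Illustrative values only; use your own names and secrets
SECRET_KEY=change-me
DEBUG=True
DB_NAME=myfridge
DB_USER=myfridge
DB_PASSWORD=myfridge123
DB_HOST=db
DB_PORT=5432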
And that's all for the configuration. Note that you need to change every 'myfridge' to your project name.
To run a Django project with Docker you also need a PostgreSQL database.
Check this tutorial: Docker With Django And PostgreSQL Tutorial
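Once everything is in place, you can build and start the whole stack from the main directory and follow the Celery worker's logs to confirm it connected to Redis (on older installations use docker-compose instead of docker compose):
docker compose up --build

# in another terminal, follow the worker's logs
docker compose logs -f celery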
Example 1
- Inside your Django application create a file called tasks.py (Celery's autodiscover_tasks() looks for a module named tasks.py in each installed app)
tasks.py
from celery import shared_task


@shared_task()
def add(x, y):
    # Runs in the Celery worker and returns the sum
    return x + y
As you can see, to turn a function into a task you only need to add the @shared_task() decorator to it.
- Call this task inside your view
views.py
from django.http import HttpResponse

from .tasks import add


def add_view(request):
    # .delay() queues the task and returns immediately with an AsyncResult
    result = add.delay(4, 6)
    return HttpResponse(result)  # the response shows the task id, not the sum
To run the 'add' function as a background task, call it with the .delay() method instead of calling it directly.
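Because we configured a result backend, you can also inspect the task after queuing it. A minimal sketch:
result = add.delay(4, 6)

print(result.id)      # the task id
print(result.status)  # e.g. PENDING, SUCCESS, FAILURE

# Blocks until the worker finishes and returns the value (10 here);
# only call .get() when you really need to wait for the result
print(result.get(timeout=10))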
Example 2
- Inside your Django application create a file called tasks.py (if you followed Example 1, you already have it)
- Create a function that does some work, for example sending an email
send_email.py
def send_email(email, subject, message):
    print(f"Sending email to {email}")
Of course, this function doesn't really send anything; it's just a placeholder.
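If you want the placeholder to actually send an email, you could use Django's built-in email API instead. This is just a sketch and assumes the EMAIL_* settings in settings.py are configured:
from django.core.mail import send_mail


def send_email(email, subject, message):
    # Uses the email backend configured in settings.py;
    # passing None as from_email falls back to DEFAULT_FROM_EMAIL
    send_mail(subject, message, None, [email])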
- Then, in tasks.py:
tasks.py
from celery import shared_task

from .send_email import send_email


@shared_task()
def send_email_task(email, subject, message):
    send_email(email, subject, message)
As before, we add the @shared_task() decorator above the function, and inside the task we simply call our send_email function.
- Running a task. You can use this task inside your forms, views, management commands, etc. In this example, I am going to use it inside a registration form. You don't need to understand the whole form; the interesting part is the send_email method at the end.
forms.py
from django import forms
from django.contrib.auth.forms import AuthenticationForm

from .models import CustomUser
from .tasks import send_email_task


class CustomUserRegistration(forms.ModelForm):
    username = forms.CharField(widget=forms.TextInput(attrs={"class": "form-control"}))
    email = forms.EmailField(widget=forms.EmailInput(attrs={"class": "form-control"}))
    password = forms.CharField(
        widget=forms.PasswordInput(attrs={"class": "form-control"})
    )
    password_repeat = forms.CharField(
        widget=forms.PasswordInput(attrs={"class": "form-control"})
    )

    class Meta:
        model = CustomUser
        fields = ("username", "email", "password", "password_repeat")

    def clean_email(self):
        email = self.cleaned_data.get("email")
        if CustomUser.objects.filter(email=email).exists():
            raise forms.ValidationError("Email already exists")
        return email

    def clean_password_repeat(self):
        password = self.cleaned_data.get("password")
        password_repeat = self.cleaned_data.get("password_repeat")
        if password != password_repeat:
            raise forms.ValidationError("Passwords don't match")
        return password_repeat

    def send_email(self, message):
        # Queues the Celery task instead of sending the email synchronously
        send_email_task.delay(
            self.cleaned_data.get("email"),
            subject="Activate your account",
            message=message,
        )
Take a look at the last method, send_email.
Inside it I call our task with the .delay() method, which runs it as a Celery task instead of executing it synchronously.
views.py
from django.contrib.sites.shortcuts import get_current_site
from django.template.loader import render_to_string
from django.urls import reverse_lazy
from django.utils.encoding import force_bytes
from django.utils.http import urlsafe_base64_encode
from django.views.generic import FormView

from .forms import CustomUserRegistration
# account_activation_token is the project's email activation token generator (defined elsewhere in the app)


class RegisterUserView(FormView):
    form_class = CustomUserRegistration
    template_name = "register.html"
    success_url = reverse_lazy("users:success_register")

    def form_valid(self, form):
        user = form.save(commit=False)
        user.is_active = False
        user.set_password(form.cleaned_data["password"])
        user.save()
        form.send_email(
            message=render_to_string(
                "acc_activate_email.html",
                {
                    "user": user,
                    "domain": get_current_site(self.request).domain,
                    "uid": urlsafe_base64_encode(force_bytes(user.pk)),
                    "token": account_activation_token.make_token(user),
                },
            )
        )
        return super().form_valid(form)
Inside RegisterUserView I use this form, so after saving the user I can call its send_email method to queue the activation email.