Bearz
Python Background Tasks

Introduction

A few months ago I deployed a Discord bot that sends a welcome GIF whenever a new user joins, but there is a problem with it...

I'm saving each image on my server in case I want to use it for something else later, but if I add my bot to more servers, it will eventually become hard to delete all those files by myself, so let's fix that.

Doing some research I found Celery, a great tool for running background jobs that also lets us schedule tasks.
Celery talks to a message broker, which stores the serialized tasks until a Celery worker picks them up and executes them; this time we are going to use Redis as the broker.
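To make the "serialize" part concrete, here is a toy sketch of what storing a task call in the broker amounts to. This uses plain JSON and is not Celery's actual message format, just the idea behind it:

```python
import json

# A queued task is just data: the function's dotted name plus its arguments,
# turned into a string the broker (Redis) can hold until a worker reads it.
message = json.dumps({"task": "main.task_example", "args": [], "kwargs": {}})

# The worker side: decode the message and look up which function to run
decoded = json.loads(message)
print(decoded["task"])  # main.task_example
```

Redis only ever sees the string; the worker rebuilds the call from it.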

Redis is an in-memory key-value store; think of it as the browser's LocalStorage, but running on your computer (server).

Prerequisites

  • For this article we will use Redis via its Docker image, so make sure you have Docker installed.

Let's start!

Now that we have Docker installed, let's download the Redis image and run it in a container with this command:

sudo docker run -p 6379:6379 --name redis-cont -d redis

You can use sudo docker ps -a to check all containers running on your machine; make sure there is one called redis-cont.
check docker process

Installing python dependencies

You can use a virtual environment and install the following dependencies:

celery==5.2.7
Flask==2.1.3
flower==1.0.0
redis==4.3.3

Celery and Flask

This is an example of how to call functions in the background with Flask.
What I'm going to do in the background task is delete all the files in the "./data" directory, which contains some images.

Project tree

# main.py
from flask import Flask
from celery import Celery
import os
import time

# Adding the required settings to connect our Flask app with Celery and Redis
app = Flask(__name__)
app.config["CELERY_BROKER_URL"] = "redis://localhost:6379/0"
# Redis database 0 holds the queued task messages
app.config["result_backend"] = "redis://localhost:6379/1"
# Redis database 1 stores the result objects we return from the background tasks


celery = Celery(app.name, broker=app.config["CELERY_BROKER_URL"])
celery.conf.update(
    result_backend=app.config["result_backend"]
)

@celery.task()
def task_example():
    # Sleeping the function for a while
    time.sleep(25)
    print("--------> Deleting all files in ./data directory <---------")
    for _file in os.listdir('./data'):
        os.remove(f"./data/{_file}")
    # This object will be stored in redis as the **result_backend** we passed above
    return {
        "state": "Done",
        "developer": "Bearz",
        "status": 200
    }

@app.route('/', methods=['POST', 'GET'])
def main():
    # delay() sends the function to a Celery worker to be executed
    # and returns an AsyncResult we can use to track the task
    task = task_example.delay()
    return f"GG BRO, task id: {task.id}"


if __name__=='__main__':
    app.run(debug=True, port=5000)

Great, now we can start the Flask application by running:
python main.py

In order for Celery to start doing its job, we need to execute this command in a separate terminal:

celery -A main.celery worker --loglevel=info

This will start a worker that handles all the background tasks and runs them at the right time.

There is something to note in the command above: we pass main.celery because our file is named main.py; if your file is named app.py you have to pass app.celery instead.

Great! Now we have 2 terminals, one running our Flask app and the other running a Celery worker. Let's test that everything is working fine: visit the URL http://localhost:5000/

Our delay() method will be triggered and the task function will be sent to the worker.
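The deletion loop inside task_example can also be tried on its own, without Flask or Celery running. Here is a stand-alone sketch; clear_data_dir is a hypothetical helper name (not part of the post's code), and the demo uses a throwaway temp directory instead of a real ./data:

```python
import tempfile
from pathlib import Path

def clear_data_dir(path):
    """Remove every regular file directly inside `path`; return the names removed."""
    removed = []
    for entry in sorted(Path(path).iterdir()):
        if entry.is_file():
            entry.unlink()
            removed.append(entry.name)
    return removed

# Demo against a throwaway directory instead of a real ./data
demo = tempfile.mkdtemp()
for name in ("a.gif", "b.gif"):
    (Path(demo) / name).write_bytes(b"fake image data")
print(clear_data_dir(demo))  # ['a.gif', 'b.gif']
```

Using pathlib and skipping subdirectories makes the cleanup a bit safer than a bare os.remove loop.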

Monitoring tasks

Cool, we have created our first background task, but how do we monitor it?

We can use Flower, a web app that connects to our Celery instance to keep track of every single task.

pip install flower==1.0.0

And start it

celery -A main.celery flower --loglevel=info --port=9999

Now if you go to localhost:9999 you can see a pretty GUI; open the Tasks tab to view every task and the state each one is currently in.
Monitoring tasks with Flower

Scheduling tasks

There is still one thing to do: how do we schedule a task to be executed every hour/minute/second?

Celery Beat allows us to do that; we can even create scheduled tasks without using Flask at all.

We have to add some more things to our main.py file to schedule tasks:

# main.py
# ...
CELERY_BEAT_SCHEDULE = {
    'do-this-every-5-minutes': {
        'task': 'main.task_example',
        'schedule': 300.0,
    },
    'do-this-every-1-minute': {
        'task': 'main.task_per_minute',
        'schedule': 60.0,
    },
}

celery = Celery(app.name, broker=app.config["CELERY_BROKER_URL"])
celery.conf.update(
    result_backend=app.config["result_backend"],
    beat_schedule=CELERY_BEAT_SCHEDULE
)

@celery.task()
def task_per_minute():
    print("--> This message should be printed every minute <--")
    return {
        "task_returned": True,
        "developer": "Bearz"
    }

@celery.task()
def task_example():
    time.sleep(25)
    print("--------> Deleting all files in ./data directory <---------")
    for _file in os.listdir('./data'):
        os.remove(f"./data/{_file}")
    return {
        "state": "Done",
        "owner": "Elmer&Hachy"
    }


OK, we pass a new parameter to celery.conf.update, beat_schedule, a dictionary with all the tasks we want to schedule.
Every key in that dictionary must map to another dictionary with at least 2 important keys:

  • task: Refers to the function we want to execute
  • schedule: How often the task should be called, in seconds
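Since schedule is plain seconds, a small stdlib trick (just timedelta, nothing Celery-specific) keeps the intervals readable instead of writing raw numbers:

```python
from datetime import timedelta

# Spell the intervals out with timedelta instead of magic numbers
SCHEDULES = {
    'do-this-every-5-minutes': timedelta(minutes=5).total_seconds(),
    'do-this-every-1-minute': timedelta(minutes=1).total_seconds(),
}
print(SCHEDULES['do-this-every-5-minutes'])  # 300.0
```

These values drop straight into the 'schedule' slots of the beat dictionary above.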

Since Celery Beat is another Celery service, we have to open a separate terminal to start it:

celery -A main.celery beat

And the tasks will be called at the intervals you configured.

Celery Beat will only kick off tasks at regular intervals, and the Celery Worker will process them.
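That beat-plus-worker split can be mimicked in miniature with the stdlib alone. This toy sketch (a queue.Queue standing in for the Redis broker) shows the division of labor, not how Celery actually works internally:

```python
import queue
import threading
import time

tasks = queue.Queue()   # stands in for the Redis broker
results = []

def beat(n):
    # like Celery Beat: enqueue a task message on a fixed interval
    for _ in range(n):
        tasks.put("task_example")
        time.sleep(0.01)

def worker():
    # like the Celery worker: pull task messages off the queue and run them
    while True:
        name = tasks.get()
        if name is None:  # shutdown signal
            break
        results.append(f"ran {name}")

t = threading.Thread(target=worker)
t.start()
beat(3)
tasks.put(None)  # tell the worker to stop
t.join()
print(results)
```

Beat only produces messages; the worker is the only one that executes anything, which is why both processes must be running.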

Take a look at the Flower service started earlier.
Flower service

Amazing, we now know how to create background tasks for specific purposes.

That would be all for today. Thanks for reading, see you in the next post, and have a nice day ;)

Here is the Git repo:
https://github.com/AlonsoCrag/python-celery

Note
Redis Commander, installable through npm, is a web GUI for inspecting Redis itself (for example, the task results stored in the result backend), which complements Flower's task monitoring.
