
How to Use Google Cloud Run Jobs for Background Tasks

A few years ago, Google announced Cloud Run Jobs. After months of testing and polishing, they announced last year that the feature has reached general availability (GA). But what is a Cloud Run Job, and how does it differ from a Cloud Run service?

Cloud Run services are essentially web applications that are always 'listening' for incoming requests, commonly known as a 'call' in this context. This means they are designed to respond whenever they receive a certain type of request from another service or a user. For example, if you have a Cloud Run service set up to process image data, it will remain idle until it receives an image to process. Once it receives this 'call,' it will carry out its programmed function, like image resizing, and then return to its idle state, ready for the next request.
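
To make that contrast concrete, here is a minimal sketch of such a service (using Flask; the route, handler, and response are just placeholders, not part of the original example):

import os

from flask import Flask

app = Flask(__name__)


@app.route('/resize', methods=['POST'])
def resize():
    # The service sits idle until a request ("call") arrives,
    # does its work, and then goes back to waiting.
    return 'image processed', 200


if __name__ == '__main__':
    # Cloud Run tells the container which port to listen on via PORT.
    app.run(host='0.0.0.0', port=int(os.getenv('PORT', '8080')))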

Cloud Run Jobs change the rules. They don't need to be web applications, and, even better, they don't listen for requests at all: a job runs its task to completion and then exits.

So, why would you need a Cloud Run Job? There are many reasons; here is one example:

Imagine you have a web application where users upload an MP4 video file and you want to convert it into different video formats. If you try to do this inside a Cloud Run service, you'll hit a problem, because you can't spread the work across parallel workers. With Cloud Run Jobs, you can.

From your Cloud Run service you can trigger a job with any number of tasks (for example, 100), and each task will take one part of the video and work on it.

But how does each task know which part is its own? There are two environment variables: CLOUD_RUN_TASK_INDEX, which tells each task its index within the execution (starting at 0), and CLOUD_RUN_TASK_COUNT, which always holds the total number of tasks.
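
Purely as an illustration of how those two values could be used to split the work, here is a rough sketch (the video length and chunking logic are hypothetical, not a real transcoding pipeline):

import os


def process_chunk():
    # Each task reads its own index and the total number of tasks,
    # then works only on its slice of the video.
    task_index = int(os.getenv('CLOUD_RUN_TASK_INDEX', '0'))
    task_count = int(os.getenv('CLOUD_RUN_TASK_COUNT', '1'))

    total_seconds = 3600  # hypothetical length of the uploaded video
    chunk = total_seconds // task_count
    start = task_index * chunk
    # The last task picks up any remainder.
    end = total_seconds if task_index == task_count - 1 else start + chunk

    print(f'Task {task_index}/{task_count} converts seconds {start}-{end}')


if __name__ == '__main__':
    process_chunk()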

So, suppose we start with a minimal program that simply prints those two variables:

import os


def main():
    # Cloud Run Jobs injects these environment variables into every task.
    task_index = os.getenv('CLOUD_RUN_TASK_INDEX')
    task_count = os.getenv('CLOUD_RUN_TASK_COUNT')
    print(f'CLOUD_RUN_TASK_INDEX: {task_index}')
    print(f'CLOUD_RUN_TASK_COUNT: {task_count}')


if __name__ == '__main__':
    main()

And the following Dockerfile (note that the CMD assumes the Python file above is saved as app.py):

# Use an official Python runtime as a parent image
FROM python:3.8-slim-buster
# Set the working directory in the container to /app
WORKDIR /app
# Add the current directory contents into the container at /app
ADD . /app
# Install any needed packages specified in requirements.txt
# RUN pip install --no-cache-dir -r requirements.txt
# Make port 80 available to the world outside this container
EXPOSE 80
# Run app.py when the container launches
CMD ["python", "app.py"]

If we execute:

gcloud builds submit --tag gcr.io/${PROJECT_ID}/my-cloud-run-app

where PROJECT_ID is your Google Cloud project ID. Now, if we go to Cloud Run in the console and click on "Create Job," we can start to play.

First, we choose the image we've just built, give the job a name, choose a region, and, at this point, specify how many tasks we want to run in parallel. In my case, it's going to be 10.

We're not going to change any other options, but as you can see, there are many of them, similar to Cloud Run services. There's some information worth knowing: there's a maximum of 1,000 tasks running in parallel, and when Cloud Run Jobs was announced, the documentation stated a maximum of 200 parallel tasks per CPU with 2 GB of memory. So, if you want to run 1,000 tasks in parallel, you are going to need at least 5 CPUs and 10 GB of memory.

Now, when we click on "Create," we end up with a job that can be executed in two ways: we can either create a scheduler trigger that specifies when it runs, or we can execute the job manually, which is what we are going to do.
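
If you prefer the command line to the console, roughly the same setup can be sketched with gcloud; the job name and region below are just example values, so check them against your own project:

gcloud run jobs create my-cloud-run-job \
    --image gcr.io/${PROJECT_ID}/my-cloud-run-app \
    --tasks 10 \
    --region europe-west1

gcloud run jobs execute my-cloud-run-job --region europe-west1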

After a few seconds, the job appears and starts executing. When all tasks are complete, we can view the logs. There we find the two variables we used earlier: CLOUD_RUN_TASK_INDEX ranges from 0 to 9 across the ten tasks, and CLOUD_RUN_TASK_COUNT is always 10.
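
With our little program, the log lines should look roughly like this (one pair per task, not necessarily in order):

CLOUD_RUN_TASK_INDEX: 0
CLOUD_RUN_TASK_COUNT: 10
CLOUD_RUN_TASK_INDEX: 1
CLOUD_RUN_TASK_COUNT: 10
...
CLOUD_RUN_TASK_INDEX: 9
CLOUD_RUN_TASK_COUNT: 10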

I can find many uses for Cloud Run Jobs and I hope that you can too. See you next time!


Here's the same article in video form for your convenience:

