In the previous articles, we saw how to create a container and upload it to an AWS repository. If you haven’t read those articles yet, please follow the links below:
Run a Python code on AWS Batch Part — 1: Creation of Python Script and Docker Container: https://medium.com/codex/run-a-python-code-on-aws-batch-part-1-creation-of-python-script-and-docker-container-1b01dc89eaed
Run a Python code on AWS Batch Part — 2: Uploading Data to ECR and Creation of Computing Environment: https://medium.com/codex/run-a-python-code-on-aws-batch-part-2-uploading-data-to-ecr-and-creation-of-computing-c5dab12cd3eb
In this session, we are going to run the Python code:
Create a Job Queue
Create a Job Definition
Create a Job
Run the Job
- Create a Job Queue
First, click on Job queues in the AWS Batch dashboard and then click on “Create”.
I am going to enter “test-queue-batch-v1” as the Job queue name; you can choose any name you like. I am setting the priority to 1000, because job queues with a higher integer priority are given preference on the compute environments they share.
Next, select the compute environment the container will run in. In my case, I named the compute environment “demo-batch-python-v1”. Click its radio button and make sure it turns blue.
Then click “Create”. The queue’s status should read “VALID”, and a green banner will appear at the top of the page.
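If you prefer to script these console steps, the same job queue can be created with the boto3 Batch API. This is a minimal sketch using the queue and compute environment names from this walkthrough; the actual API call is left commented out so the parameters can be reviewed without AWS credentials.

```python
# Sketch only: creating the job queue from this section with boto3.
# Names match the walkthrough; the API call is commented out so the
# payload can be inspected without an AWS account.
create_queue_params = {
    "jobQueueName": "test-queue-batch-v1",
    "state": "ENABLED",
    "priority": 1000,  # higher integer values are scheduled first
    "computeEnvironmentOrder": [
        {"order": 1, "computeEnvironment": "demo-batch-python-v1"},
    ],
}

# import boto3
# batch = boto3.client("batch")
# batch.create_job_queue(**create_queue_params)
```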
- Create a Job Definition
In the navigation bar on the left-hand side of the AWS Batch dashboard, click Job definitions, then click the “Create” button, just as we did for job queues.
After clicking “Create”, a form opens in which we fill in the job definition’s parameters. I am using the name “test-job-def-v1” and setting the execution timeout to about 1 hour; if the code runs longer than that, AWS Batch terminates the job.
For platform compatibility, select EC2 (or Fargate, if you are familiar with it).
Next, provide the location where you stored your Docker container. As you can see in the image below, I stored my container inside Amazon ECR; copy the public ECR URI (highlighted in green in the image) and paste it into the Image field.
After setting up all the job configurations, click the “Create” button.
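The job definition form above maps to boto3’s `register_job_definition` call. This is a minimal sketch: the image URI is a placeholder you would replace with your own ECR URI, and the vCPU/memory values are illustrative assumptions, not settings from this walkthrough.

```python
# Sketch only: the job definition form expressed as a boto3
# register_job_definition payload. The image URI is a placeholder;
# vCPU/memory values are illustrative assumptions.
register_params = {
    "jobDefinitionName": "test-job-def-v1",
    "type": "container",
    "platformCapabilities": ["EC2"],
    "timeout": {"attemptDurationSeconds": 3600},  # ~1 hour, as set above
    "containerProperties": {
        "image": "<account-id>.dkr.ecr.<region>.amazonaws.com/<repo>:latest",
        "resourceRequirements": [
            {"type": "VCPU", "value": "1"},
            {"type": "MEMORY", "value": "2048"},  # MiB
        ],
    },
}

# import boto3
# boto3.client("batch").register_job_definition(**register_params)
```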
Our next task is to run that image, i.e. the Docker container, on AWS Batch by creating a job.
Now click on **Jobs** in the left-hand navigation bar, then click **Submit new job**.
After clicking on Submit new job, we are redirected to a form page where we fill in the details step by step.
In the **Name** field, we can enter a job name of our own; we have already created the job definition and job queue.
Once we select the **job definition**, the remaining fields are filled in automatically by **AWS**.
The **job configuration** is likewise filled in automatically; we just need to verify it.
After that, click Submit. Then go back to the dashboard, where we can see that our job has been submitted.
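Submitting the job can also be done programmatically. A minimal sketch with boto3, reusing the queue and job definition names from the previous sections; the job name matches the one used later in this article.

```python
# Sketch only: submitting the job with boto3, using the queue and
# job definition created earlier in this series.
submit_params = {
    "jobName": "my-job-v1",
    "jobQueue": "test-queue-batch-v1",
    "jobDefinition": "test-job-def-v1",
}

# import boto3
# response = boto3.client("batch").submit_job(**submit_params)
# job_id = response["jobId"]
```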
The columns in the images above show the status of our AWS Batch jobs, i.e. the stages a Batch job passes through.
The stages range from Submitted -> Runnable -> Starting -> Running -> Succeeded/Failed.
The job moves through these stages automatically, and we can observe the outcome in the Succeeded/Failed columns.
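If you are scripting the workflow, you can poll for these stages with boto3’s `describe_jobs` instead of watching the dashboard. A minimal sketch; the client is passed in as a parameter so the loop can be exercised without AWS credentials.

```python
import time

# Terminal states from the stage sequence described above.
TERMINAL_STATES = {"SUCCEEDED", "FAILED"}

def wait_for_job(batch_client, job_id, poll_seconds=30):
    """Poll describe_jobs until the job reaches a terminal state,
    then return that state ("SUCCEEDED" or "FAILED")."""
    while True:
        job = batch_client.describe_jobs(jobs=[job_id])["jobs"][0]
        if job["status"] in TERMINAL_STATES:
            return job["status"]
        time.sleep(poll_seconds)

# Usage (requires AWS credentials):
# import boto3
# status = wait_for_job(boto3.client("batch"), job_id)
```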
In my scenario, the job executed properly, and I can see the result by clicking on the number **“2”**. I had already run the same job once before, which is why it shows 2 instead of 1; if your job fails, the count appears in the Failed column instead.
In the image above, I click on 2 and I am redirected to the list of succeeded jobs.
I am going to click on **“my-job-v1”**, because that is the job we just ran.
This takes us to the **Job information** page. Below the **Log stream name** there is a link; click on it to see the output. That link takes us to **AWS CloudWatch**.
What is AWS CloudWatch? In my words, it is the service used to view the output (logs) of services like AWS Batch, AWS Lambda, etc.
You can get more detail by clicking on **AWS CloudWatch**.
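The log output shown in CloudWatch can also be fetched programmatically. AWS Batch writes container logs to the `/aws/batch/job` log group by default, and the stream name is the one shown on the Job information page. A minimal sketch, with the client passed in as a parameter so it can be tested without AWS credentials:

```python
def fetch_job_output(logs_client, log_stream_name):
    """Return the job's log lines from the default AWS Batch log group."""
    response = logs_client.get_log_events(
        logGroupName="/aws/batch/job",  # AWS Batch's default log group
        logStreamName=log_stream_name,
        startFromHead=True,  # read from the oldest event forward
    )
    return [event["message"] for event in response["events"]]

# Usage (requires AWS credentials):
# import boto3
# lines = fetch_job_output(boto3.client("logs"), "<your-log-stream-name>")
```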
In the image above we can see the output of the job.
And with that, we have successfully run a job on AWS Batch.