Shoeb Ahmed

Run a Python code on AWS Batch Part β€” 2: Uploading Data to ECR and Creation of Computing Environment.


In the previous article, I showed how to create and run a simple Python script, and how to write a simple Dockerfile and build a Docker image that runs on a local system.

AWS Batch + Python

In this article, we move to the AWS Batch side. To run a job on AWS Batch we need a Docker image and a compute environment, which takes three steps:

  1. Create a repository in "Amazon Elastic Container Registry".

  2. Push the Docker image to the repository.

  3. Create a "Compute environment" in AWS Batch.

To run a Docker container on AWS Batch, its image needs to be stored on the AWS platform. We are going to use **"Amazon Elastic Container Registry" (ECR)**, where we can store our Docker images. Search for **Amazon Elastic Container Registry** in the AWS console and open the ECR dashboard.

Click "Get Started". This takes us to the "Create repository" page.

I am going to use a public repository because we are in the learning phase.

Give the repository a name, which is a compulsory field.

I am selecting all **"Content types"**; you can choose your own content types.

Then I click **"Create repository"**. After that, the page redirects to the repository list, where we can see our new repository.
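The same repository can also be created from the AWS CLI. A minimal sketch, assuming the example repository name `python-batch-demo` (ECR Public is hosted in `us-east-1`):

```shell
# Create a public ECR repository from the CLI
# (equivalent to the "Create repository" button in the console).
# "python-batch-demo" is an example name.
aws ecr-public create-repository \
    --repository-name python-batch-demo \
    --region us-east-1
```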

Now we click on our repository name, and then click "View push commands".

After clicking that button, we will see a list of push commands; we need to follow all of them in order.

So, I am going to execute the commands one by one in the command prompt.

Make sure your current working directory is the one where you saved your Python script and Dockerfile.

First, copy and run the first command, which logs Docker in to the registry.

We should get "Login Succeeded". If you are not getting it, try configuring the AWS CLI again.
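Reconfiguring the AWS CLI means re-entering your stored credentials, which `aws configure` does interactively:

```shell
# Interactively (re)store credentials and defaults under ~/.aws/
aws configure
# It prompts for:
#   AWS Access Key ID
#   AWS Secret Access Key
#   Default region name (e.g. us-east-1)
#   Default output format (e.g. json)
```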

Now I run the second command, which builds the Docker image.

The third command tags the image with the repository URI.

The last command pushes your image to the repository.
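For a public repository, the four push commands follow this pattern. The registry alias `a1b2c3d4` and repository name `python-batch-demo` below are placeholders; copy the exact commands from your own "View push commands" dialog:

```shell
# 1. Log Docker in to the public ECR registry (ECR Public always uses us-east-1)
aws ecr-public get-login-password --region us-east-1 | \
    docker login --username AWS --password-stdin public.ecr.aws

# 2. Build the image from the Dockerfile in the current directory
docker build -t python-batch-demo .

# 3. Tag the local image with the repository URI
docker tag python-batch-demo:latest public.ecr.aws/a1b2c3d4/python-batch-demo:latest

# 4. Push the tagged image to the repository
docker push public.ecr.aws/a1b2c3d4/python-batch-demo:latest
```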

In the repository we will now see an image with the tag "latest", as shown in the screenshot above.

Now we will move to AWS Batch. First, we will create a "Compute environment".

Click "Compute environments" on the left-hand side of the AWS Batch dashboard.

Click "Create" on the right-hand side of the Compute environments page.

Fill in the details given below:

In the Instance configuration, I am using Spot instances.

Instance Configuration β€” 1

Instance Configuration β€” 2

For **Networking**, I am using the same VPC ID that I used for AWS Redshift; the link is here: https://medium.com/codex/aws-redshift-connects-with-python-part-1-setup-a-redshift-connection-with-python-b9f6a1fa49f0

After that, click "Create compute environment". We will then see the status of the compute environment, which should be VALID, as shown in the image.
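If you prefer the CLI, a managed Spot compute environment can be created with something like the sketch below. The environment name, subnet, security group, instance role, and Spot Fleet role ARN are all placeholders; substitute the values from your own account and VPC:

```shell
# Create a managed Spot compute environment (placeholder names and IDs).
aws batch create-compute-environment \
    --compute-environment-name python-batch-ce \
    --type MANAGED \
    --state ENABLED \
    --compute-resources '{
        "type": "SPOT",
        "minvCpus": 0,
        "maxvCpus": 4,
        "instanceTypes": ["optimal"],
        "subnets": ["subnet-xxxxxxxx"],
        "securityGroupIds": ["sg-xxxxxxxx"],
        "instanceRole": "ecsInstanceRole",
        "spotIamFleetRole": "arn:aws:iam::123456789012:role/AmazonEC2SpotFleetRole"
    }'
```

Setting `"minvCpus": 0` lets the environment scale down to nothing when no jobs are queued, which keeps Spot costs low while learning.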

We will cover the creation of the Job Queue and Job Definition in the next article, Part 3.
