Rajdip Bhattacharya

Automating Python Deployments with GitHub Actions, AWS ECR, and AWS Lambda

Greetings everyone!

Welcome to the second part of this series! As promised in the previous post, I'll be setting up a deployment pipeline for our Python code using GitHub Actions. And here we are!

A brief revisit

For those who haven't read the previous blog, or would just like to brush up, here is what we have built so far:

  • A Python codebase for a Lambda function that extracts a field called name from the Lambda event and returns a greeting.
  • A Dockerfile for containerizing our code.
  • AWS ECR for hosting our image.
  • AWS Lambda for hosting our function.

Additionally, we used Terraform as our IaC tool to deploy the AWS infrastructure.
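As a quick reminder, the handler from the previous part can be sketched roughly like this (the exact file and function names from part one are assumed, as is the greeting text):

```python
# lambda_function.py -- a minimal sketch of the handler described above;
# names and the greeting format are illustrative, not copied from part one.

def lambda_handler(event, context):
    # Extract the "name" field from the Lambda event, with a fallback
    # in case the caller omits it.
    name = event.get("name", "stranger")
    return {
        "statusCode": 200,
        "body": f"Hello, {name}!",
    }
```

This is the code the pipeline below will package into a Docker image and ship to Lambda.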


This part covers a single but critical aspect: creating the build-and-deploy pipeline for our Python code. Here's what we are going to do:

  • Create a GitHub repository
  • Write the code for our pipeline
  • Configure the secrets in our repository
  • Push the code, sit back, and watch the magic in action

You can find the code for this blog here.

So, let's get started!

Creating a GitHub repository

I shouldn't have to say this, but you do need a GitHub account to continue. Once you have one, go here.

GitHub create repository

I'll be using the above configurations to get started.

Next, you should have a brand-new repository ready at your disposal!

GitHub repository

Building our pipeline

This is perhaps the most critical part. To get started, we need to create a workflow file in our project under the .github/workflows/ folder (any file name works; deploy.yml is a typical choice).

The folder naming convention is necessary because GitHub treats every file under the workflows folder as a pipeline.

Next, we create the pipeline.

```yaml
on:
  push:
    branches: [master]

name: Lambda ECR Deployment

jobs:
  deploy:
    if: github.ref == 'refs/heads/master'
    name: Deploy to ECR and update Lambda
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: actions/checkout@v2

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.ACCESS_KEY }}
          aws-secret-access-key: ${{ secrets.SECRET_KEY }}
          aws-region: ap-south-1

      - name: Login to Amazon ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v1

      - name: Build, tag, and push the image to Amazon ECR
        id: build-image
        run: |
          # Build a Docker image and push it to ECR
          echo "Registry: ${{ vars.ECR_REGISTRY }}"
          aws ecr get-login-password --region ap-south-1 | docker login --username AWS --password-stdin ${{ vars.ECR_REGISTRY }}
          docker build -t ${{ vars.ECR_REGISTRY }}/${{ vars.ECR_REPOSITORY }}:latest .
          echo "Pushing image to ECR..."
          docker push ${{ vars.ECR_REGISTRY }}/${{ vars.ECR_REPOSITORY }}:latest
          echo "image=${{ vars.ECR_REGISTRY }}/${{ vars.ECR_REPOSITORY }}:latest" >> $GITHUB_OUTPUT

      - name: Update AWS Lambda function
        id: update-lambda
        run: |
          aws lambda update-function-code --function-name ${{ vars.LAMBDA_FUNCTION_NAME }} --image-uri ${{ vars.ECR_REGISTRY }}/${{ vars.ECR_REPOSITORY }}:latest
```

I'll explain what this code does step by step. I recommend copying the code into an IDE so you can follow along with the description, as I will refer to line numbers.

  1. Lines 1-3 specify which branch this pipeline triggers on. Right now, the pipeline runs only when we push to the master branch.
  2. Line 5 sets the name of the pipeline.
  3. Lines 7-42 define the jobs that will run.
  4. Line 8 defines the only job in this pipeline.
  5. Line 9 ensures the job runs only when code is pushed to the master branch.
  6. Line 13 lists the steps in this job.
  7. Line 15 uses the community checkout action to check out the repository inside the pipeline whenever a new push is made.
  8. Lines 18-26 log into our AWS account and Amazon ECR using the credentials provided in lines 20-22.
  9. Lines 31-37 build, tag, and push the Docker image to the ECR repository.
  10. Line 42 updates the Lambda function's code to deploy the most recent image.
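A note on the $GITHUB_OUTPUT line in the build step: GitHub Actions reads plain key=value lines appended to the file named by the GITHUB_OUTPUT environment variable and exposes them as step outputs (steps.build-image.outputs.image); the older ::set-output syntax is deprecated. A small Python sketch of that mechanism (the registry value is a made-up example):

```python
import os
import tempfile

def write_output(path, key, value):
    # Mimics `echo "key=value" >> $GITHUB_OUTPUT` in a run step.
    with open(path, "a") as f:
        f.write(f"{key}={value}\n")

def parse_outputs(path):
    # Mimics how the runner collects step outputs: one key=value per line.
    outputs = {}
    with open(path) as f:
        for line in f:
            if "=" in line:
                key, _, value = line.strip().partition("=")
                outputs[key] = value
    return outputs

# Hypothetical registry/repository values, for illustration only.
with tempfile.NamedTemporaryFile(mode="w", delete=False) as tmp:
    path = tmp.name
write_output(path, "image", "123456789012.dkr.ecr.ap-south-1.amazonaws.com/my-repo:latest")
print(parse_outputs(path)["image"])
os.remove(path)
```

Later steps in the same job could then reference that value instead of rebuilding the image URI by hand.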

Configuring our GitHub repository

Note that in our workflow file, we used some special syntax such as ${{ vars.ECR_REGISTRY }} and ${{ secrets.ACCESS_KEY }}. These are GitHub Actions variables and secrets. Using this approach gives us two benefits:

  • We don't need to change our code when a configuration value changes.
  • Our secrets stay safe because they are never stored in the code.
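The same principle applies inside application code: read configuration from the environment rather than hard-coding it, mirroring how GitHub Actions injects secrets and variables into a workflow run. A minimal sketch (the variable name here is just an example, not from the blog's codebase):

```python
import os

def get_region(default="ap-south-1"):
    # Prefer the environment-provided value; fall back to a default.
    # AWS_REGION is an illustrative variable name for this sketch.
    return os.environ.get("AWS_REGION", default)

print(get_region())
```

If the region ever changes, only the environment configuration needs updating, not the code.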

To add your own secrets and variables, head over to your repository and go to Settings->Secrets and variables->Actions. Here, we find the following:

GitHub actions

Next, add the secrets (ACCESS_KEY, SECRET_KEY) in the Secrets tab and the rest (ECR_REGISTRY, ECR_REPOSITORY, LAMBDA_FUNCTION_NAME) in the Variables tab, like this:

GitHub Action Secrets

GitHub Action Variables

There are some limitations on using variables, though; you can read about them here.


Now that we have all the bits in their places, we are ready to jump into action.

To do this,

  • Initialize a git repository in your project
  • Make sure you refactor out or remove the access and secret keys from your Terraform config (only if you have been following along since the last blog).
  • Push the code.

After a successful push, you can head over to your repository and click on the Actions tab. Here, you will find your pipelines.

GitHub Action pipelines

Pipeline details

Step details

Now, whenever you make a change to your Python code and push it to the repository, your changes will be automatically deployed to AWS Lambda!


With that, we come to the end of this blog. I hope you enjoyed reading it and didn't actually fall asleep. I hope to bring more DevOps-related blogs like this frequently, so I'd encourage you nice readers to ping me with ideas you would like to see in action! Until next time, happy hacking!
