John Enad

Day 41: Using Python in an AWS Lambda Function

Eager to share some of the insights I've gained into AWS Lambda and Python, I've put together a straightforward example to demonstrate the technology. I've crafted a so-called "nano-level" project - by this, I mean an extremely simple project with no consequential practical application in the real world. However, it serves as an easy-to-understand example of these technologies.

This Python tutorial assumes basic hands-on familiarity with AWS, specifically the AWS Simple Storage Service (S3) and AWS Lambda. I won't go into intricate detail about those services; instead, I'll give a high-level explanation of how all the pieces work together to solve a simple problem.

The goal of our nano-level project is to set up a process around an S3 bucket that accepts only .json files, and only from a specific user. Each JSON file must contain the following keys: 'action', 'value1', 'value2', and 'created_by'.

Here are four example files:

add-num.json: {"action": "ADD", "value1": 5, "value2": 2, "created_by": "user1"}
sub-num.json: {"action": "SUB", "value1": 5, "value2": 2, "created_by": "user1"}
mult-num.json: {"action": "MULT", "value1": 5, "value2": 2, "created_by": "user1"}
div-num.json: {"action": "DIV", "value1": 5, "value2": 2, "created_by": "user1"}

Here, 'action' contains the code of the operation to perform, 'value1' and 'value2' represent the first and second values respectively, and 'created_by' is the username of the person who requested the operation.

As soon as any of these .json files are uploaded, it automatically triggers the execution of an AWS Lambda function. This function locates the .json file in the S3 bucket, retrieves its JSON content, and depending on the request, performs an operation on the values and computes the result.
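To make the next steps concrete: the event AWS hands the Lambda function identifies which object changed in which bucket. A trimmed sketch of that payload is below; the real event carries many more fields (region, timestamps, requester, and so on), but these are the only parts our handler will read:

{
    "Records": [
        {
            "eventSource": "aws:s3",
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "[BUCKET NAME HERE]"},
                "object": {"key": "add-num.json"}
            }
        }
    ]
}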

1) First, create the 4 .json files mentioned above somewhere in a local folder.
2) In the AWS Console, navigate to AWS Lambda and Create a Function with a name of your choice. Make sure to select Python 3.10, 3.9, or 3.8 as the Runtime.

In lambda_function.py, enter the following code:

import json
import urllib.parse

import boto3

s3_client = boto3.client('s3')

def lambda_handler(event, context):
    # The S3 event notification carries the bucket name and object key.
    bucket = event['Records'][0]['s3']['bucket']['name']
    # Object keys arrive URL-encoded in the event (e.g. spaces become '+'),
    # so decode the key before using it.
    file_name = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])

    try:
        # Fetch the uploaded object and parse its body as JSON.
        content_object = s3_client.get_object(Bucket=bucket, Key=file_name)
        file_content = content_object.get('Body').read().decode('utf-8')
        json_content = json.loads(file_content)

        # Dispatch on the requested operation.
        action = json_content["action"]
        if action == "ADD":
            print("ADD operation")
            result = int(json_content["value1"]) + int(json_content["value2"])
        elif action == "SUB":
            print("SUB operation")
            result = int(json_content["value1"]) - int(json_content["value2"])
        elif action == "MULT":
            print("MULT operation")
            result = int(json_content["value1"]) * int(json_content["value2"])
        elif action == "DIV":
            print("DIV operation")
            result = int(json_content["value1"]) / int(json_content["value2"])
        else:
            print("INVALID OPERATION")
            result = None

        # Attach the computed result and log the full record.
        json_content["result"] = result
        print(json_content)

        return json_content
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(file_name, bucket))
        raise e

The Python code above is invoked by events from an AWS S3 bucket and performs an arithmetic operation based on the contents of the JSON file that was uploaded. import json is for parsing JSON data. boto3 is the AWS SDK for Python; it allows Python developers to write code that uses Amazon Web Services such as S3 and Lambda.

Initializes s3_client to allow the code to interact with Amazon S3.
s3_client = boto3.client('s3')

Gets the file from the S3 bucket
content_object = s3_client.get_object(Bucket=bucket, Key=file_name)

Gets the Body portion of the content_object
file_content = content_object.get('Body').read().decode('utf-8')

Loads the JSON from the file content
json_content = json.loads(file_content)
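
You can exercise this parse-and-compute path locally, without S3, by feeding json.loads the contents of one of the sample files. A quick sketch:

import json

# Contents of add-num.json from step 1.
file_content = '{"action": "ADD", "value1": 5, "value2": 2, "created_by": "user1"}'
json_content = json.loads(file_content)

result = int(json_content["value1"]) + int(json_content["value2"])
print(result)  # 7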

Note: Change the execution role to create a new role from AWS Policy templates and select Amazon S3 object read-only permissions.
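For reference, the core permission the function needs from that template is s3:GetObject on the bucket's objects. A minimal sketch (the bucket name is a placeholder, and the role also needs the usual CloudWatch Logs permissions that Lambda's basic execution role provides):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::[BUCKET NAME HERE]/*"
        }
    ]
}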

3) Create an S3 bucket with a name of your choice and add a bucket policy that restricts access to a single user (yourself) and only allows files with a .json extension. An example follows (the first statement lets that user upload .json keys; the second denies anyone uploading a key that is not .json):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Statement1",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::[ACCOUNT NUMBER HERE]:user/[USER NAME HERE]"
            },
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::[BUCKET NAME HERE]/*.json"
        },
        {
            "Sid": "Statement2",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "NotResource": "arn:aws:s3:::[BUCKET NAME HERE]/*.json"
        }
    ]
}

4) Edit your S3 bucket and create an Event Notification (under Properties) to trigger the Lambda you created when an object is created.
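If you'd rather script this step than click through the console, the same notification can be set with boto3. A sketch, assuming placeholder bucket and function names; note that the console also silently grants S3 permission to invoke the function, which you would otherwise have to add yourself with the Lambda add_permission API:

import boto3

s3 = boto3.client('s3')

s3.put_bucket_notification_configuration(
    Bucket='[BUCKET NAME HERE]',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': 'arn:aws:lambda:[REGION]:[ACCOUNT NUMBER HERE]:function:[FUNCTION NAME HERE]',
                # Fire only when objects are created, and only for .json keys.
                'Events': ['s3:ObjectCreated:*'],
                'Filter': {
                    'Key': {
                        'FilterRules': [
                            {'Name': 'suffix', 'Value': '.json'}
                        ]
                    }
                },
            }
        ]
    },
)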


Now, when you try to upload any of the .json files that you created in step 1, the Python code will get executed.
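You can upload from the console or from code; a minimal boto3 sketch (the bucket name is again a placeholder):

import boto3

s3 = boto3.client('s3')

# Upload one of the sample files created in step 1.
s3.upload_file('add-num.json', '[BUCKET NAME HERE]', 'add-num.json')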


From CloudWatch, you can look at the logs for each invocation.

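Based on the handler's print statements, uploading add-num.json should produce log lines roughly like these (timestamps and request IDs omitted):

ADD operation
{'action': 'ADD', 'value1': 5, 'value2': 2, 'created_by': 'user1', 'result': 7}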

Remember that this project serves as a trivial example of how Python can be used in AWS Lambda. Although it may not have a practical real-world application, what matters is understanding the mechanisms behind it.
