Fernando Pena

Originally published at blog.pena.rocks

Step by Step - Creating an AWS Lambda Function from scratch using Amazon S3 triggers.

In this scenario, as an example, we will create a very simple AWS Lambda function in Python that runs every time a new file is inserted into an S3 folder.


AWS Lambda Introduction

This example aims to demonstrate only the creation of the function and a basic understanding of the trigger that invokes it, in a deliberately superficial way. In future posts I will go deeper and cover specific scenarios.

For this we will follow a simple architecture: a file uploaded to the inbox folder of the bucket fires an S3 trigger, which invokes the Lambda function, which in turn writes a log file to the log folder.

I'm starting here exactly at the point of creating the function, so I'm skipping steps like creating and setting up your AWS account if you don't already have one. And I won't go into detail about S3, so as not to make this post too long.

About S3, there is a video on my YouTube channel where I explain a little more about the service and also walk step by step through creating a bucket.

Video about Amazon S3 (Simple Storage Service):
[Step-by-step] Amazon S3 - How does it work? And how to create your bucket


Let's go step by step:

1st - Let's start by creating and organizing your Bucket:

Create a bucket in S3, or use an existing one. For this demonstration I will create a specific one.

Create two folders inside your bucket, named "inbox" and "log". (Keep the names lowercase: S3 keys are case-sensitive, and both the trigger prefix and the function code below use "inbox/" and "log/".)

Your bucket will then contain the two folders, inbox/ and log/.

IMPORTANT: We are reading from one folder and writing to another so that the trigger does not put the function into an "infinite loop", which would generate thousands of invocations and, with them, a high cost. Good practice in a production environment would be to make this transition between different buckets, but here we are simplifying just to illustrate.
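
If you prefer to script this setup, here is a minimal boto3 sketch that creates the bucket and the two folder placeholders. The bucket name is a hypothetical placeholder (bucket names are globally unique), and "folders" in S3 are just zero-byte objects whose keys end in "/":

import boto3

s3 = boto3.client('s3')
bucket = 'my-lambda-demo-bucket'  # hypothetical name; replace with your own

# create the bucket (outside us-east-1, also pass
# CreateBucketConfiguration={'LocationConstraint': '<your-region>'})
s3.create_bucket(Bucket=bucket)

# 'folders' in S3 are just zero-byte objects whose keys end in '/'
s3.put_object(Bucket=bucket, Key='inbox/')
s3.put_object(Bucket=bucket, Key='log/')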


2nd - Before creating the function, let's create an "Execution Role" that allows the manipulation of objects inside this bucket.

Go to IAM in the console or search in the search bar:

On the left-hand side, go to "Roles" and then click "Create Role".

Select "AWS service" and under "Common use cases" select Lambda.

And click Next: Permissions.

In the filter bar, look for "S3" and select the policy "AmazonS3FullAccess"; then look for "logs" and select the policy "CloudWatchLogsFullAccess".

Click "Next: tags", and then "Next: Review".

Provide the name for your new role, change the description if you want, check the policies, and then click "Create Role".

IMPORTANT: We are creating a very broad role, which is not good security practice, but in this case we are simplifying just to illustrate the execution of the function using S3.
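
For reference, the same role can also be created with boto3. A minimal sketch, using the role name from this tutorial and the two managed policies selected above:

import json

import boto3

iam = boto3.client('iam')

# trust policy that lets the Lambda service assume this role
trust_policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Effect': 'Allow',
        'Principal': {'Service': 'lambda.amazonaws.com'},
        'Action': 'sts:AssumeRole',
    }],
}

iam.create_role(
    RoleName='LambdaAccessS3Medium',
    AssumeRolePolicyDocument=json.dumps(trust_policy),
    Description='Execution role for the S3-triggered Lambda demo',
)

# attach the same managed policies selected in the console
for arn in (
    'arn:aws:iam::aws:policy/AmazonS3FullAccess',
    'arn:aws:iam::aws:policy/CloudWatchLogsFullAccess',
):
    iam.attach_role_policy(RoleName='LambdaAccessS3Medium', PolicyArn=arn)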


3rd - Now let's actually create your function in AWS Lambda:

In your AWS console go to the "Lambda" service under Compute, or search for "Lambda" in the search bar.

Click on "Create a function".

In the form for creating your function, select the option "Author from scratch" and fill in the following parameters:

  • Function name: LogMyInboxFiles (or any name of your choice)
  • Runtime: Python 3.8 (in this example we will use Python)

Next under "Permissions", click on "Change default execution role" to expand the options.

Then select the "Use an existing role" option, and in the "Existing role" field, select the role that we created earlier in this tutorial. In this case, "LambdaAccessS3Medium".

Then click "Create Function". Your function will be created in a few seconds and then you will access the edit screen of your function.
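As an aside, the console is not the only way: the same function can be created with boto3 once the code is zipped. A sketch, assuming a function.zip containing lambda_function.py and the ARN of the role created earlier (the account id below is a placeholder):

import boto3

lambda_client = boto3.client('lambda')

with open('function.zip', 'rb') as f:  # zip containing lambda_function.py
    zipped_code = f.read()

lambda_client.create_function(
    FunctionName='LogMyInboxFiles',
    Runtime='python3.8',
    Role='arn:aws:iam::123456789012:role/LambdaAccessS3Medium',  # placeholder ARN
    Handler='lambda_function.lambda_handler',
    Code={'ZipFile': zipped_code},
)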

On the next screen, go to the part where you have the initial function code. This is exactly where you write code for your chosen runtime. Replace the current code with the one below:

import os
import urllib.parse

import boto3

s3_client = boto3.client('s3')

def lambda_handler(event, context):

    # get the bucket and object key from the S3 event record
    data_rec = event['Records'][0]['s3']
    bucket = data_rec['bucket']['name']
    # object keys arrive URL-encoded (e.g. spaces become "+")
    key = urllib.parse.unquote_plus(data_rec['object']['key'])

    # build the destination key inside the "log" folder
    new_key = "log/%s.txt" % os.path.basename(key)

    # write a new object whose body is the full key of the uploaded file
    s3_client.put_object(
        Body=key,
        Key=new_key,
        Bucket=bucket,
    )

    return "OK"

Click "Deploy".


4th - Let's now configure the S3 trigger to run our function.

Click on "+ Add trigger", and in the selection box choose "S3".

Fill out the screen according to the parameters below:
Bucket: the name of your bucket
Event type: keep the default, "All object create events"
Prefix: inbox/ (important to include the "/")
Suffix: optional; you can use it to "filter" files here, for example by extension.

And check the box "I acknowledge that using...".

I will repeat the important warning here again. CAUTION.

IMPORTANT: We are reading from one folder and writing to another so that the trigger does not put the function into an "infinite loop", which would generate thousands of invocations and, with them, a high cost. Good practice in a production environment would be to make this transition between different buckets, but here we are simplifying just to illustrate.

Then click "Add" to add your trigger to the function.
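
The trigger can also be set up with boto3. A sketch, assuming the placeholder bucket name and function ARN are replaced with yours; note that S3 needs permission to invoke the function, which the console's "Add trigger" flow grants automatically but which we add explicitly here:

import boto3

s3 = boto3.client('s3')
lambda_client = boto3.client('lambda')

bucket = 'my-lambda-demo-bucket'  # placeholder
function_arn = 'arn:aws:lambda:us-east-1:123456789012:function:LogMyInboxFiles'  # placeholder

# allow S3 to invoke the function (the console does this for you)
lambda_client.add_permission(
    FunctionName='LogMyInboxFiles',
    StatementId='s3-invoke',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::%s' % bucket,
)

# fire on any object-created event whose key starts with 'inbox/'
s3.put_bucket_notification_configuration(
    Bucket=bucket,
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': function_arn,
            'Events': ['s3:ObjectCreated:*'],
            'Filter': {'Key': {'FilterRules': [
                {'Name': 'prefix', 'Value': 'inbox/'},
            ]}},
        }]
    },
)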


5th - All done, time to test.... now cross your fingers and hope it works! 😄

Go to your bucket and upload a simple file to the "inbox" folder.
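You can also run this test from code. A quick sketch that uploads a test file to inbox/ and, after a short pause, lists what the function wrote to log/ (the bucket name is the placeholder used above):

import time

import boto3

s3 = boto3.client('s3')
bucket = 'my-lambda-demo-bucket'  # your bucket name

# drop a test file into the inbox folder to fire the trigger
s3.put_object(Bucket=bucket, Key='inbox/hello.txt', Body=b'hello')

time.sleep(5)  # give the function a moment to run

# list what the function wrote to the log folder
for obj in s3.list_objects_v2(Bucket=bucket, Prefix='log/').get('Contents', []):
    print(obj['Key'])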

Then go to your "log" folder and check that a new .txt file, named after the file you uploaded, has been created.

If it hasn't appeared yet, wait and hit "Refresh" on the folder. Once it is created, open the file and see that we have recorded the full name and path of the file that was placed in the inbox. If the file still has not been created, revisit the role you created and verify that the permissions are in place.

If you open the file, you will see the full path (key) of the object that was uploaded to the inbox folder.

IMPORTANT: After running your tests, remember to delete all the resources created in the tests, to avoid any unnecessary charges.


Conclusion

In this post I introduced the AWS Lambda service, admittedly in a superficial way, by creating a function with an Amazon S3 trigger that fires every time we put a file in a folder. This shows a bit of the service's potential and how simple it is to configure and code.

In upcoming posts and videos I will bring more advanced content, including how to create a function in AWS Lambda and access it externally using AWS API Gateway and AWS Cognito.

So stay tuned both here and on my YouTube channel and Instagram to follow the news.

Subscribe to my YouTube channel:
YouTube: Pena Rocks

Follow me on social networks:
Instagram: https://www.instagram.com/pena.rocks/
Twitter: https://twitter.com/nandopena
LinkedIn: https://www.linkedin.com/in/nandopena/

More Information:
https://www.pena.rocks/
