To read more articles like this, visit my blog
Today, we will see how to design an architecture where we upload a file to AWS S3, and after the file is uploaded successfully, a Lambda function is triggered.
That Lambda will download the file and perform some operation on it. Some possible use cases are:
- Generating a thumbnail version of a full-sized image
- Reading data from an Excel file
- And many more
The final version of the code can be found on GitHub:
GitHub - Mohammad-Faisal/aws-sam-lambda-trigger-s3-upload
Initialize Your Project
We are going to build this project with AWS SAM, using a TypeScript boilerplate as the starting point.
You can clone the repository below to get started.
git clone https://github.com/Mohammad-Faisal/aws-sam-typescript-boilerplate
However, this is not mandatory, because the code I show here can be used in any Node.js Lambda, even as a standalone Lambda function.
Step 1: Create the S3 utility functions
First, we need some utility functions to download files from S3. These are plain JavaScript functions that accept parameters such as the bucket and fileKey and download the file.
We also have a utility function to upload a file.
import aws from 'aws-sdk';
import fs from 'fs';

const s3 = new aws.S3();

export class S3Utils {
  static downloadFileFromS3 = function (bucket: string, fileKey: string, filePath: string) {
    console.log('downloading', bucket, fileKey, filePath);
    return new Promise((resolve, reject) => {
      const file = fs.createWriteStream(filePath);
      const stream = s3
        .getObject({
          Bucket: bucket,
          Key: fileKey
        })
        .createReadStream();
      stream.on('error', reject);
      file.on('error', reject);
      file.on('finish', () => {
        console.log('downloaded', bucket, fileKey);
        resolve(filePath);
      });
      stream.pipe(file);
    });
  };

  static uploadFileToS3 = function (
    bucket: string,
    fileKey: string,
    filePath: string,
    contentType: string
  ) {
    console.log('uploading', bucket, fileKey, filePath);
    return s3
      .upload({
        Bucket: bucket,
        Key: fileKey,
        Body: fs.createReadStream(filePath),
        ACL: 'private',
        ContentType: contentType
      })
      .promise();
  };

  static cleanDownloadedFile = async (filePath: string) => {
    // fs.unlink is callback-based, so awaiting it directly does nothing;
    // use the promise-based API so the await actually waits for deletion
    await fs.promises.unlink(filePath);
    console.log('temporary file deleted');
  };
}
And finally, another function to delete a file from the local machine.
Step 2: Create the Lambda handler
Next, we need the actual Lambda handler under the src folder.
In this Lambda, the event object will be an S3CreateEvent, because we want this function to be triggered when a new file is uploaded to a particular S3 bucket.
Note: This function handles .xlsx and .csv files. If you want to support other file types, add them to the supportedFormats array.
import { S3CreateEvent, Context } from 'aws-lambda';
import path from 'path';
import os from 'os';
import { S3Utils } from '../utils/s3-utils';

const supportedFormats = ['csv', 'xlsx'];

function extractS3Info(event: S3CreateEvent) {
  const eventRecord = event.Records && event.Records[0];
  const bucket = eventRecord.s3.bucket.name;
  // S3 event keys are URL-encoded (spaces arrive as '+'), so decode before use
  const key = decodeURIComponent(eventRecord.s3.object.key.replace(/\+/g, ' '));
  return { bucket, key };
}

export const handler = async (event: S3CreateEvent, context: Context) => {
  try {
    const s3Info = extractS3Info(event);
    const id = context.awsRequestId;
    const extension = path.extname(s3Info.key).toLowerCase();
    const tempFile = path.join(os.tmpdir(), id + extension);
    const extensionWithoutDot = extension.slice(1);
    console.log('processing', s3Info.bucket, ':', s3Info.key, 'using', tempFile);
    if (!supportedFormats.includes(extensionWithoutDot)) {
      throw new Error(`unsupported file type ${extension}`);
    }
    await S3Utils.downloadFileFromS3(s3Info.bucket, s3Info.key, tempFile);
    // do whatever you want to do with the file
    // const contentType = `image/${extensionWithoutDot}`;
    // await S3Utils.uploadFileToS3(OUTPUT_BUCKET, s3Info.key, tempFile, contentType);
    await S3Utils.cleanDownloadedFile(tempFile);
  } catch (err) {
    console.error(err);
  }
};
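To make the "do whatever you want to do with the file" step concrete, here is a minimal sketch of one possible operation: counting the data rows in a downloaded CSV file. The countCsvRows helper is hypothetical, not part of the repository, and the synchronous read is only suitable for files that fit comfortably in memory.

```typescript
import fs from 'fs';

// Hypothetical helper: count non-empty data rows in a CSV file,
// excluding the header line. Reads the whole file into memory,
// so this is only a sketch for small files.
export function countCsvRows(filePath: string): number {
  const contents = fs.readFileSync(filePath, 'utf-8');
  const lines = contents.split('\n').filter((line) => line.trim().length > 0);
  // subtract the header row; an empty file has zero data rows
  return Math.max(lines.length - 1, 0);
}
```

Inside the handler, you would call it between the download and the cleanup, e.g. `const rows = countCsvRows(tempFile);`.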
Step 3: Update the template
The final piece of the puzzle is to update the template.yaml file. Here we add three things:
- An S3 bucket where we will upload files.
- A Lambda function that will be triggered when a new file is uploaded to the bucket. Notice the Events property, where we specify that the event will be s3:ObjectCreated. We also link the bucket here.
- A policy that allows the Lambda to read the contents of the S3 bucket. We attach the policy to the function's role. (A role is created with each function, so the LambdaThatWillReactToFileUpload function will have a role named LambdaThatWillReactToFileUploadRole.)
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: AWS SAM demo Lambda react to file uploaded to s3

Globals:
  Function:
    Runtime: nodejs14.x
    Timeout: 30

Resources:
  S3BucketToRespond:
    Type: AWS::S3::Bucket
    Properties:
      # bucket names must be lowercase and globally unique
      BucketName: 'dummy-bucket'

  LambdaThatWillReactToFileUpload:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/s3-file-upload-reaction
      Handler: app.handler
      Events:
        FileUploadedToS3:
          Type: S3
          Properties:
            Bucket: !Ref S3BucketToRespond
            Events: s3:ObjectCreated:*

  ReadS3BucketPolicy:
    Type: AWS::IAM::Policy
    Properties:
      PolicyName: ReadS3BucketPolicy
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Action:
              - s3:GetObject
            Resource:
              - !Sub '${S3BucketToRespond.Arn}/*'
      Roles:
        - !Ref LambdaThatWillReactToFileUploadRole
We add the policy as a separate resource, instead of inlining it in the function definition, to avoid a circular dependency between the bucket and the function. And that's it.
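For context, the naive approach that runs into that circular dependency looks roughly like this sketch (S3ReadPolicy is one of SAM's built-in policy templates):

```yaml
# Inside the function definition -- this creates a cycle:
# the function references the bucket (for the read policy),
# while the bucket references the function (for the upload notification).
LambdaThatWillReactToFileUpload:
  Type: AWS::Serverless::Function
  Properties:
    Policies:
      - S3ReadPolicy:
          BucketName: !Ref S3BucketToRespond
```

Defining the policy as a standalone AWS::IAM::Policy resource breaks that cycle.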
To deploy your application, you first need to configure your environment. You can find the details here.
Then run the following command to deploy:
sam deploy --guided
Test it
To test whether it works, go to the AWS S3 console, upload a file, and check the logs.
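Alternatively, you can trigger the Lambda from the command line with the AWS CLI. This is a sketch: dummy-bucket is the placeholder bucket name from the template, so replace it with the name you actually deployed with.

```shell
# Create a small sample CSV and upload it to trigger the Lambda.
# Replace dummy-bucket with your actual bucket name.
printf 'name,score\nalice,10\n' > sample.csv
aws s3 cp sample.csv s3://dummy-bucket/sample.csv || echo "upload failed (check credentials and bucket name)"
```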
To check the logs from your local machine, run:
sam logs -n LambdaThatWillReactToFileUpload --stack-name sam-lambda-trigger-s3-file-upload --tail
And you will see the logs there.
That's it for today. I hope you enjoyed this article. Have a great day! :D
Have something to say? Get in touch with me via LinkedIn or Personal Website