Marton Veto

File upload with AWS Lambda and S3 in Node

If you have a Lambda function in Node and want to upload files into an S3 bucket, you have countless options to choose from. In this article, I'll present a solution which uses no web application frameworks (like Express) and uploads a file into S3 through a Lambda function. The HTTP body is sent as multipart/form-data.
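For illustration, this is roughly how a client could send such a request. This is a sketch assuming Node 18+ with its global fetch, FormData and Blob; the endpoint URL, field name and file name are placeholders, not part of the original post:

// Hypothetical client sketch: sends photo.jpg as multipart/form-data
// to a placeholder API Gateway endpoint (Node 18+ globals assumed).
const { readFile } = require("node:fs/promises");

const sendUpload = async () => {
  const data = await readFile("photo.jpg");

  const body = new FormData();
  body.append("file", new Blob([data], { type: "image/jpeg" }), "photo.jpg");

  // fetch sets the multipart/form-data boundary header automatically
  const response = await fetch("https://example.execute-api.aws/dev/upload", {
    method: "POST",
    body
  });
  console.log(response.status);
};

sendUpload().catch(console.error);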

The code

For parsing the multipart/form-data request I use the lambda-multipart package. It can parse both text fields and file content, and this is how I use it:

const Multipart = require("lambda-multipart");

// Wraps the event-based parser in a Promise so it can be awaited
const parseMultipartFormData = async event => {
  return new Promise((resolve, reject) => {
    const parser = new Multipart(event);

    parser.on("finish", result => {
      resolve({ fields: result.fields, files: result.files });
    });

    parser.on("error", error => {
      return reject(error);
    });
  });
};

In the files list I will have a list of Buffer objects.

This is how I call it and loop through all the files to upload them:

  const { fields, files } = await parseMultipartFormData(event);

  await Promise.all(
    files.map(async file => {
      await uploadFileIntoS3(file);
    })
  );

And finally, uploading a file into S3:

const AWS = require("aws-sdk");
const { v4: uuidv4 } = require("uuid"); // uuid 7+; older versions: require("uuid/v4")

const s3 = new AWS.S3();

const uploadFileIntoS3 = async file => {
  const ext = getFileExtension(file);
  const options = {
    Bucket: process.env.file_s3_bucket_name,
    Key: `${uuidv4()}.${ext}`,
    Body: file
  };

  try {
    await s3.upload(options).promise();
  } catch (err) {
    console.error(err);
    throw err;
  }
};

I use the uuid library to get a unique identifier that I'll use as the file name. Be aware that if your files are Buffer objects, you can pass them to the upload method of the S3 SDK, but you cannot pass them to the putObject method! In the catch block you should add some meaningful error handling; I just logged the error and re-threw it so that I can see it on the caller side.
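For context, this is a minimal sketch of how a handler could wire these helpers together and surface errors to the caller; the response shape and status codes are my own assumptions, not taken from the original source:

// Hypothetical handler wiring parseMultipartFormData and uploadFileIntoS3
// together; the response format and status codes are assumptions.
module.exports.handler = async event => {
  try {
    const { files } = await parseMultipartFormData(event);
    await Promise.all(files.map(file => uploadFileIntoS3(file)));

    return {
      statusCode: 200,
      body: JSON.stringify({ uploaded: files.length })
    };
  } catch (err) {
    console.error(err);
    return {
      statusCode: 500,
      body: JSON.stringify({ message: "Upload failed" })
    };
  }
};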

You can add some file verification to check the MIME type and size of the files. ⚠️ But watch out: Lambda currently has multiple limitations. One of them is that it only accepts HTTP requests smaller than 6 MB, so if you want to upload files bigger than this limit, you cannot use this solution.
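A minimal sketch of such a check could look like the snippet below; the allowed types, the size limit, and the assumption that each parsed file exposes its part headers and a byte length are mine, not part of the original code:

// Hypothetical validation run before uploading; the allow-list, the 5 MB
// limit and the file shape (headers + length) are assumptions.
const ALLOWED_TYPES = ["image/jpeg", "image/png"];
const MAX_SIZE_BYTES = 5 * 1024 * 1024;

const validateFile = file => {
  const contentType = file.headers && file.headers["content-type"];
  if (!ALLOWED_TYPES.includes(contentType)) {
    throw new Error(`Unsupported content type "${contentType}".`);
  }
  if (file.length > MAX_SIZE_BYTES) {
    throw new Error(`File too large: ${file.length} bytes.`);
  }
};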

Don't forget to create an IAM role (and attach it to the Lambda function) that allows putting objects into the S3 bucket.
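If you deploy with the Serverless Framework (as this post does), the permission could be granted with a serverless.yml fragment along these lines; the bucket name, the environment variable wiring and the resource definition are a sketch of one possible setup, not the author's actual configuration:

# Sketch of a serverless.yml fragment; bucket name and wiring are assumptions.
# (older Serverless Framework versions used provider.iamRoleStatements instead)
provider:
  name: aws
  environment:
    file_s3_bucket_name: my-upload-bucket
  iam:
    role:
      statements:
        - Effect: Allow
          Action:
            - s3:PutObject
          Resource: arn:aws:s3:::my-upload-bucket/*

resources:
  Resources:
    UploadBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: my-upload-bucket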

This is how I'm getting the extension of the file:

// Maps the part's content type to a file extension; only JPEG is handled here
const getFileExtension = file => {
  const headers = file["headers"];
  if (headers == null) {
    throw new Error(`Missing "headers" from request`);
  }

  const contentType = headers["content-type"];
  if (contentType === "image/jpeg") {
    return "jpg";
  }

  throw new Error(`Unsupported content type "${contentType}".`);
};

And that's basically it. You can find the full source code here. I'm using the Serverless Framework to deploy the Lambda function and to create the S3 bucket.

Top comments (2)

SibiAkkash

Was looking for a solution for a long time, finally found this! Thanks 👍👍👍

Eklas

Thank you.

Now, using the above, how can I add a checksum to the upload method?