Francisco Luna 🌙
Uploading Images to AWS S3 with Next.js and React Dropzone — A Complete Guide

I’ve been learning how to set up AWS S3 with Next.js at work for a few weeks. At first, it felt a bit overwhelming with different approaches and configurations to consider. After a while, the task got easier and I integrated S3 into my tech stack.

Adding S3 can boost your project's functionality and performance, especially when you need to work with images and files in your application. That's why, in this guide, I'll take you step by step through setting up S3 with Next.js. We'll create a small application where users can add images from their devices through a dropzone component and upload them to S3.

What is AWS S3?

Amazon S3, or Simple Storage Service, is one of the core services of Amazon Web Services. It gives you a highly scalable, reliable and secure object storage service in the cloud.

S3 offers developers and businesses a highly reliable and cost-effective solution for storing and retrieving unlimited amounts of data, accessible anytime and from anywhere on the web.

And here's an interesting tidbit: the platform you're reading this on, Dev.to, uses Amazon S3 to store all the blog images! It's a perfect real-world example of how S3 is seamlessly integrated into the digital experiences we use every day.

Requirements

Before diving into this guide, it’s essential to have a foundational understanding of the following technologies:

1. Next.js: Familiarize yourself with the basics of Next.js, a React framework that enables server-side rendering and static site generation.

2. React: Ensure you have a good grasp of React, as Next.js is built on top of it.

3. AWS IAM User: Set up an AWS Identity and Access Management (IAM) user with the necessary permissions to access and manage your S3 bucket.

4. S3 Bucket: Create and configure an Amazon S3 bucket to store your files. Make sure to note down the access key and secret key for your IAM user, and configure the bucket’s permissions and CORS settings as needed. Save the bucket name as well.
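As a reference, a minimal CORS configuration for the bucket might look like the following. This is only a starting point for local development (the origin is an assumption); replace it with your deployed domain, and tighten the allowed methods to what your app actually uses.

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "PUT", "POST", "DELETE"],
    "AllowedOrigins": ["http://localhost:3000"],
    "ExposeHeaders": []
  }
]
```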

What you'll be building

In this tutorial you'll build a simple application that allows users to upload images to S3 through a dropzone component. The images will be retrieved programmatically from an S3 bucket, and users can also delete each image.

Simple application made using Next.js, Server Actions, AWS S3 and React-Dropzone

This will teach you:

  1. How to use Next.js Server Actions: Learn how to leverage one of Next.js's latest features and best practices to handle form submissions seamlessly.

  2. How to use the AWS S3 SDK for Node.js: You'll learn how to upload, retrieve, and delete images programmatically.

  3. Integrate React Dropzone into your application: You're going to add React Dropzone with validations, type safety, hooks and constants to use best practices and ensure a nice user experience.

Installing and Configuring Next.js

To install Next.js, you can use the following command: npx create-next-app@latest.

Learn more on the official docs.

Setting up dependencies

Once you have installed Next.js, you'll need to install React Dropzone and the AWS S3 SDK for Node.js.

To install these dependencies, you can use the following command: npm install react-dropzone @aws-sdk/client-s3 @aws-sdk/s3-request-presigner (the last package provides getSignedUrl, which we'll use later to retrieve images securely)

Learn more about the AWS S3 SDK here

Learn more about React Dropzone here

Creating the Server Actions

Before building the user interface, focus on creating the business logic through server actions. Server Actions run on the server (conceptually similar to serverless functions such as AWS Lambda) and you can use them for form submissions and data mutations.

For this guide, you'll need to create two server actions: one to upload an image to S3 and another to delete an image.

Create a new folder called actions inside your src folder and start creating your first server action there:

Server Action to Upload an Image to a Bucket

This server action is used to upload images to an S3 bucket. It takes two arguments: formData and payload.

formData: Contains the image data sent from the frontend.
payload: An object with two fields, bucket and key.

Payload Fields

bucket: The name of your S3 bucket.
key: A unique identifier for each image in the bucket.

Every image in S3 needs a unique key to ensure that each file is stored separately.

// /src/actions/s3.ts
"use server";

import {
  S3Client,
  PutObjectCommand,
  type PutObjectCommandOutput,
} from "@aws-sdk/client-s3";
import { revalidatePath } from "next/cache";

type UploadImageToS3Payload = {
  bucket: string;
  key: string;
};

export async function uploadImageToS3(
  formData: FormData,
  payload: UploadImageToS3Payload
): Promise<PutObjectCommandOutput[]> {
  const s3 = new S3Client({
    region: process.env.AWS_REGION,
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY as string,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY as string,
    },
  });

  const { bucket, key } = payload;

  try {
    const files = formData.getAll("file") as File[];

    const response = await Promise.all(
      files.map(async (file) => {
        // Convert the file to a Buffer that the SDK can upload
        const arrayBuffer = await file.arrayBuffer();
        const buffer = Buffer.from(arrayBuffer);

        const fileUploadParams = {
          Bucket: bucket,
          // Append the file name so each file gets its own unique key;
          // otherwise multiple files in one upload would overwrite each other
          Key: `${key}-${file.name}`,
          Body: buffer,
          ContentType: file.type,
        };

        const imageParam = new PutObjectCommand(fileUploadParams);
        // Return the result so Promise.all collects each upload's response
        return s3.send(imageParam);
      })
    );

    revalidatePath("/");
    return response;

  } catch (error) {
    console.error("Error uploading image to S3:", error);
    throw new Error("Failed to upload image to S3.");
  }
}

What's Going On Here?

Let's break down what's happening in the code step by step.

1. Initializing the S3 client

First, we create an instance of the S3 client using the following code. Remember to use environment variables to keep your credentials safe:

const s3 = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY as string,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY as string,
  },
});

Here you're passing the AWS region and the necessary credentials (access key and secret access key) to the S3Client class. These credentials are typically stored in environment variables for security reasons. The S3Client is the object that allows you to interact with Amazon S3, including uploading files.

2. Extracting Payload and Files Information

Next, you'll need to extract the bucket and key from the payload, which are essential for specifying where and how each file will be stored in S3:

const { bucket, key } = payload;

Then, we gather all the files from the formData object:

const files = formData.getAll("file") as File[];

The formData.getAll("file") method retrieves all files associated with the "file" field from the form data submitted by the frontend. This array of files will be processed and uploaded to S3.

3. Processing and Uploading Each File

For each file, we perform the following steps:

const response = await Promise.all(
  files.map(async (file) => {
    const arrayBuffer = await file.arrayBuffer();
    const buffer = Buffer.from(arrayBuffer);

    const fileUploadParams = {
      Bucket: bucket,
      // Append the file name so each file gets its own unique key
      Key: `${key}-${file.name}`,
      Body: buffer,
      ContentType: file.type,
    };

    const imageParam = new PutObjectCommand(fileUploadParams);
    return s3.send(imageParam);
  })
);

Convert File to Buffer:

We start by converting the file into an ArrayBuffer, which represents the file's binary data. Then, we create a Buffer from this ArrayBuffer. This Buffer is what we'll actually send to S3 as the file's content.

Prepare Upload Parameters:

We define fileUploadParams, an object that specifies the details of the upload, including the S3 bucket name (Bucket), the object's unique key (Key), the file content (Body), and the file type (ContentType). If you upload several files in one call, make sure each one gets its own distinct Key.

Upload the File:

The PutObjectCommand is used to create a command for uploading the file to S3.

Finally, we execute this command using s3.send(imageParam), which uploads the file to the specified bucket with the given key.

By using Promise.all, we ensure that all files are processed and uploaded in parallel, improving the efficiency of the operation.

4. Revalidating the Path

Once the files are successfully uploaded, we call revalidatePath("/") to refresh the content or invalidate the cache for the root path:

revalidatePath("/");

This is useful if your application needs to update the displayed content or reflect the changes made by the file uploads.

5. Error Handling

If anything goes wrong during the upload process, we catch the error, log it to the console, and throw a new error to ensure the issue is properly handled:

catch (error) {
  console.error("Error uploading image to S3:", error);
  throw new Error("Failed to upload image to S3.");
}
This helps in diagnosing issues and ensures that the application can gracefully handle failures.

Server Action to Delete an Image from S3

This server action, deleteImageFromS3, deletes an image stored in an Amazon S3 bucket. It accepts an object containing the Bucket name and the Key of the image to be deleted.

// /src/actions/s3.ts (same file as above; also import DeleteObjectCommand
// and DeleteObjectCommandOutput from "@aws-sdk/client-s3")
type S3ParamsPayload = {
  Key?: string;
  Bucket?: string;
};

export async function deleteImageFromS3({
  Bucket,
  Key,
}: S3ParamsPayload): Promise<DeleteObjectCommandOutput> {
  const s3 = new S3Client({
    region: process.env.AWS_REGION,
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY!,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
    },
  });

  const command = new DeleteObjectCommand({ Bucket, Key });

  try {
    const res = await s3.send(command);
    revalidatePath("/");

    return res;

  } catch (error) {
    console.error("Error deleting image from S3:", error);
    throw new Error("Failed to delete image from S3.");
  }
}

What's going on here?

Let's break down what's happening in this server action:

1. Type Definition

First, we define the S3ParamsPayload type:

type S3ParamsPayload = {
  Key?: string;
  Bucket?: string;
};

Key is the unique identifier (name or path) of the image within the S3 bucket, and Bucket is the name of the S3 bucket where the image is stored.

2. Initializing the S3 client

We create the S3 client again, which is necessary to interact with AWS S3.

// Provide the respective region and credentials from your AWS account 
const s3 = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});

3. Creating the Delete Command

The DeleteObjectCommand is an AWS SDK command that specifies the exact file to delete from the specified bucket.

const command = new DeleteObjectCommand({ Bucket, Key });

4. Executing the Delete Command

This line sends the DeleteObjectCommand to AWS S3, which processes the request and deletes the specified image from the bucket.

const res = await s3.send(command);

5. Revalidating the Path

Invalidate the cache for the root path of the application after deleting an image so the changes are reflected immediately.

revalidatePath("/");

6. Error Handling
Add basic error handling to manage any issues that might arise during the deletion process.

catch (error) {
  console.error("Error deleting image from S3:", error);
  throw new Error("Failed to delete image from S3.");
}

Retrieving the Images

Let's create a new folder inside src called services. This folder will be used to perform async operations with external services such as AWS S3. Create a new file called s3.ts and add the following function to get signed URLs from a bucket. This is how we can retrieve images from AWS S3 securely.

Signed URLs are temporary, secure URLs that allow access to objects in an S3 bucket for a limited period of time. By signing the URLs, you control who can access your S3 objects and for how long, without making the bucket or objects publicly accessible.

// src/services/s3.ts
import {
  S3Client,
  GetObjectCommand,
  ListObjectsV2Command,
  type ListObjectsV2CommandOutput,
} from "@aws-sdk/client-s3";
// getSignedUrl comes from a separate package: @aws-sdk/s3-request-presigner
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// The same payload type used by the server actions (or import it from there)
type S3ParamsPayload = {
  Key?: string;
  Bucket?: string;
};

export async function getAllObjectsSignedUrls({
  Bucket,
}: S3ParamsPayload): Promise<{ key: string; url: string }[]> {
  const s3 = new S3Client({
    region: process.env.AWS_REGION,
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY!,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
    },
  });

  try {
    // List all objects in the bucket
    const listCommand = new ListObjectsV2Command({ Bucket });
    const listObjectsOutput: ListObjectsV2CommandOutput = await s3.send(
      listCommand
    );

    const signedUrls = await Promise.all(
      (listObjectsOutput.Contents || []).map(async (object) => {
        if (object.Key) {
          const getObjectCommand = new GetObjectCommand({ Bucket, Key: object.Key });
          const url = await getSignedUrl(s3, getObjectCommand, { expiresIn: 3600 });
          return { key: object.Key, url };
        }
        return null;
      })
    );

    // Filter out any null results (in case an object didn't have a Key for some reason)
    return signedUrls.filter((item): item is { key: string; url: string } => item !== null);
  } catch (error) {
    console.error("Error retrieving objects from S3:", error);
    throw new Error("Failed to retrieve objects from S3.");
  }
}

What's going on here?

1. Initialize the S3 client

As with the server actions, you need to initialize the S3 client once again:

// Provide the respective region and credentials from your AWS account
const s3 = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});

2. List Objects in the S3 Bucket

We use the ListObjectsV2Command to retrieve all objects in the specified bucket:

const listCommand = new ListObjectsV2Command({ Bucket });
const listObjectsOutput: ListObjectsV2CommandOutput = await s3.send(listCommand);

This command will list all the objects stored in the S3 bucket, returning metadata about each object, including the key and last modified date.
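One caveat worth knowing: ListObjectsV2 returns at most 1,000 objects per call, so larger buckets require following NextContinuationToken across several calls. Here's a minimal sketch of that loop. The fetchPage callback is an assumption, not part of the original code; in practice it would wrap s3.send(new ListObjectsV2Command({ Bucket, ContinuationToken: token })).

```typescript
// Shape of the fields we need from a ListObjectsV2 response page
type ListPage = {
  Contents?: { Key?: string }[];
  NextContinuationToken?: string;
};

// Collect every key by following the continuation token until it's absent
async function listAllKeys(
  fetchPage: (token?: string) => Promise<ListPage>
): Promise<string[]> {
  const keys: string[] = [];
  let token: string | undefined;
  do {
    const page = await fetchPage(token);
    for (const obj of page.Contents ?? []) {
      if (obj.Key) keys.push(obj.Key);
    }
    token = page.NextContinuationToken;
  } while (token);
  return keys;
}
```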

3. Generate Signed URLs

Once we have the list of objects, the next step is to generate signed URLs for each object:

const signedUrls = await Promise.all(
  (listObjectsOutput.Contents || []).map(async (object) => {
    if (object.Key) {
      const getObjectCommand = new GetObjectCommand({ Bucket, Key: object.Key });
      const url = await getSignedUrl(s3, getObjectCommand, { expiresIn: 3600 });
      return { key: object.Key, url };
    }
    return null;
  })
);

We've generated temporary URLs that grant secure access to the objects stored in S3. In this case, we use the getSignedUrl function to generate URLs that expire after 3600 seconds (1 hour).

We also use Promise.all to handle all the asynchronous URL generation tasks concurrently, which improves performance by reducing the total time needed to process all the objects.

4. Filter Out Null Values

In some cases, an object might not have a key, or the operation might fail for some objects. To ensure that the final result only contains valid entries, we filter out any null values:

return signedUrls.filter((item): item is { key: string; url: string } => item !== null);
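If the `item is { key: string; url: string }` syntax is new to you, it's a TypeScript type predicate: it tells the compiler that anything passing the filter has that narrower type. A quick standalone illustration:

```typescript
// Without the predicate, filter() would still produce (string | null)[];
// the `x is string` annotation narrows the result to string[].
const mixed: (string | null)[] = ["a.png", null, "b.png"];
const onlyStrings: string[] = mixed.filter((x): x is string => x !== null);
console.log(onlyStrings); // → [ 'a.png', 'b.png' ]
```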

5. Error Handling

As with any operation that involves external services, there’s a chance something could go wrong. To handle potential issues gracefully, the function includes error handling:

catch (error) {
  console.error("Error retrieving objects from S3:", error);
  throw new Error("Failed to retrieve objects from S3.");
}

Creating the Dropzone Component

The Dropzone component allows users to drag and drop images for upload. We set it up to handle a maximum of 3 images, each up to 2MB in size. The react-dropzone library is used to create the drag-and-drop area and manage the upload process.

Here's the code for the Dropzone component:

"use client";

import { FileError, FileRejection, useDropzone } from "react-dropzone";
import React, { useState } from "react";
import { uploadImageToS3 } from "@/actions/s3";

const MAX_FILE_SIZE_MB = 2;
const MAX_FILE_SIZE_BYTES = MAX_FILE_SIZE_MB * 1024 * 1024;
const MAX_IMAGE_COUNT = 3;

const Dropzone = () => {
  const [uploading, setUploading] = useState<boolean>(false);

  const typeValidator = (file: File): FileError | null => {
    if (file.size > MAX_FILE_SIZE_BYTES) {
      return {
        code: "size-too-large",
        message: `Image file is larger than ${MAX_FILE_SIZE_MB}MB.`,
      };
    }
    return null;
  };

  const onDrop = async (
    acceptedFiles: File[],
    rejectedFiles: FileRejection[]
  ) => {
    if (rejectedFiles.length > 0) {
      alert(
        `Some files were rejected (unsupported type, larger than ${MAX_FILE_SIZE_MB}MB, or more than ${MAX_IMAGE_COUNT} files). Please try again.`
      );
      return;
    }

    setUploading(true);

    try {
      const formData = new FormData();
      acceptedFiles.forEach((file) => formData.append("file", file));

      await uploadImageToS3(formData, {
        bucket: process.env.NEXT_PUBLIC_BUCKET!, 
        key: `${Date.now()}`, 
      });

      alert("Files uploaded successfully!");

    } catch (error) {
      console.error("Error uploading image to S3:", error);
      alert("Failed to upload image to S3. Please try again.");
    } finally {
      setUploading(false);
    }
  };

  const { getRootProps, getInputProps, isDragActive } = useDropzone({
    onDrop,
    validator: typeValidator,
    accept: {
      "image/jpeg": [],
      "image/png": [],
      "image/webp": [],
      "image/jpg": [],
    },
    maxSize: MAX_FILE_SIZE_BYTES,
    maxFiles: MAX_IMAGE_COUNT,
  });

  return (
    <div>
      <div
        {...getRootProps()}
        className="border-2 border-dashed border-slate-200 rounded-lg hover:bg-slate-100/50 cursor-pointer duration-200 p-8 text-center"
      >
        <input {...getInputProps()} />
        {isDragActive ? (
          <p>Drop the files here ...</p>
        ) : (
          <p className="text-slate-400">
            {`Drag and drop some files here, or click to select files (up to ${MAX_IMAGE_COUNT} images, max ${MAX_FILE_SIZE_MB}MB each)`}
          </p>
        )}
      </div>

      {uploading && <p>Uploading...</p>}
    </div>
  );
};

export default Dropzone;

Explanation:

Validation: We validate file types and sizes. If a file is too large, an error message is displayed.

Handling Uploads: Files are uploaded using uploadImageToS3, a function that sends files to the S3 bucket via a server action we set up earlier.

Feedback: We provide feedback to the user, like showing an "Uploading..." message while files are being uploaded.

The Image Container Component

The ImageContainer component displays each uploaded image along with a "Delete" button. This button allows users to delete images from the S3 bucket.

Here's the code:

"use client";

import Image from "next/image";
import { deleteImageFromS3 } from "@/actions/s3";

type Props = {
  keyProp: string;
  url: string;
};

export function ImageContainer({ keyProp, url }: Props) {
  return (
    <div className="relative h-48 w-full">
      {/* `layout` and `objectFit` are deprecated in newer Next.js versions;
          the `fill` prop plus an object-cover class replaces them */}
      <Image
        src={url}
        alt={keyProp}
        fill
        className="rounded-lg object-cover"
      />
      <button
        onClick={() =>
          deleteImageFromS3({
            Bucket: process.env.NEXT_PUBLIC_BUCKET!,
            Key: keyProp

          })
        }
        className="absolute bg-red-500 rounded-full left-2 top-2 p-1 px-4 text-sm text-white font-medium hover:bg-red-600 duration-200"
      >
        Delete
      </button>
    </div>
  );
}

Explanation:

Image Display: Each image is displayed using the next/image component, which optimizes image loading in Next.js.

Delete Functionality: The "Delete" button triggers deleteImageFromS3, a function that removes the image from the S3 bucket.
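One extra step worth noting: next/image only loads remote images from hosts you've allow-listed. Since the signed URLs point at your S3 endpoint, you'll likely need an entry in next.config.js. The hostname below is a placeholder (an assumption); substitute your bucket and region.

```javascript
// next.config.js (a sketch; replace the placeholder hostname with your
// bucket's actual endpoint, e.g. <bucket>.s3.<region>.amazonaws.com)
/** @type {import('next').NextConfig} */
const nextConfig = {
  images: {
    remotePatterns: [
      {
        protocol: "https",
        hostname: "your-bucket-name.s3.us-east-1.amazonaws.com",
      },
    ],
  },
};

module.exports = nextConfig;
```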

The Final Page: page.tsx

Finally, the page.tsx file ties everything together. It displays the Dropzone component and the images that have been uploaded.

Here's how it looks:

import Dropzone from "../components/Dropzone";
import { getAllObjectsSignedUrls } from "@/services/s3";
import { ImageContainer } from "../components/ImageContainer";

export default async function Home() {  
  const imageUrls = await getAllObjectsSignedUrls({
    Bucket: process.env.NEXT_PUBLIC_BUCKET!,
  });

  return (
    <main className="max-w-[800px] m-auto p-8">
      <Dropzone />
      <div className="grid grid-cols-3 gap-4 mt-8">
        {imageUrls.map((image) => (
          <ImageContainer key={image.key} keyProp={image.key} url={image.url} />
        ))}
      </div>
    </main>
  );
}

Explanation:

Dropzone Component: This is where users upload images.

Displaying Images: After uploading, images are fetched from S3 and displayed using the ImageContainer component.

Fetching Images: We use getAllObjectsSignedUrls to get the URLs of all images stored in the S3 bucket. This function was defined in the services file earlier.

How to Improve the Code

As this blog post was created for educational purposes and I was relatively new to AWS when it was written, in a real-world application you should implement the following improvements:

Singleton S3 client: Instead of instantiating a new S3 client for each operation, create a singleton instance of the client. This avoids unnecessary overhead and improves performance.
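A minimal sketch of that idea: a `once` helper that memoizes a factory so the client is constructed a single time and shared. The getS3Client name in the comment is hypothetical, not from the article.

```typescript
// Memoize a factory so it runs exactly once; later calls return the same
// instance. Hypothetical usage for the S3 client:
//   export const getS3Client = once(() => new S3Client({ region: "..." }));
function once<T>(factory: () => T): () => T {
  let created = false;
  let value: T | undefined;
  return () => {
    if (!created) {
      value = factory();
      created = true;
    }
    return value as T;
  };
}
```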

More robust TypeScript types: Enhancing the TypeScript types will help prevent potential errors and provide a better developer experience. You can define types for common operations like file upload and metadata handling, ensuring better code quality and maintainability.

Security Enhancements


IAM Roles & Policies: Ensure that the S3 bucket's permissions are correctly set up using IAM roles, allowing only authorized users and services to interact with it.

Server-Side Encryption: Consider enabling server-side encryption to ensure your files are encrypted at rest.

Presigned URLs: If your app allows file uploads or downloads from the client, use presigned URLs to provide secure temporary access.

Image Compression: Consider using a library like Sharp to compress the users' images and save storage and resources.

Conclusion

Uploading images to AWS S3 with Next.js and React Dropzone offers an efficient and scalable solution for handling media uploads in web applications. This guide covered the basic steps to integrate S3 into your project, including file uploads, error handling, and using Next.js Server Actions.

By implementing the recommended improvements, such as creating a singleton S3 client, leveraging stronger TypeScript types, and enhancing security, you can ensure your application is both performant and secure in production environments.

Thanks for reading, and I wish you the best in your AWS and development journey!😊
