
Ankur Sheel

Originally published at ankursheel.com

How to Upload Multiple Files to AWS S3 using Terraform

Problem

I want to upload multiple files from a specific folder to an AWS S3 bucket.

Assumptions

The S3 bucket name is test.

The directory structure is as follows.

documents
|- file_1
|- subdirectory1
| |- file_1_1
| |- file_1_2
| |- subdirectory2
| | |- file_1_2_1

We want to end up with the following S3 objects.

  • s3://test/file_1
  • s3://test/subdirectory1/file_1_1
  • s3://test/subdirectory1/file_1_2
  • s3://test/subdirectory1/subdirectory2/file_1_2_1
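
The solution below uses Terraform's fileset function to enumerate these files. If you want to check the file-to-key mapping before applying, you can evaluate the same call in terraform console (the output shown is illustrative; newer Terraform versions wrap it in toset([...])):

```
> fileset("./documents/", "**")
[
  "file_1",
  "subdirectory1/file_1_1",
  "subdirectory1/file_1_2",
  "subdirectory1/subdirectory2/file_1_2_1",
]
```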

Solution

resource "aws_s3_bucket_object" "test" {
  for_each = fileset("./documents/", "**")
  bucket   = "test"
  key      = each.value
  source   = "./documents/${each.value}"
  etag     = filemd5("./documents/${each.value}")
}
  • Line 1: Create an S3 bucket object resource.
  • Line 2: Use the for_each argument to iterate over the files returned by the fileset function. for_each identifies each instance of the resource by its S3 path, making it easy to add and remove files. The fileset function enumerates the files under a given path; the ** pattern makes the search recursive.
  • Line 3: The name of the bucket to put the files in.
  • Line 4: The object’s key (its name once it’s in the bucket). In the example above, it is the same as the relative file path.
  • Line 5: The path to the file to be uploaded.
  • Line 6: Triggers an update only when the file changes: the ETag of each object is an MD5 hash of its contents.
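
As a side note, in version 4.0+ of the AWS provider, aws_s3_bucket_object is deprecated in favour of aws_s3_object, which takes the same arguments. One common extension is to also set each object's content type, since S3 otherwise serves everything as binary/octet-stream. The sketch below is illustrative: the mime_types map is an assumption you would extend for your own file types.

```
# Assumption for illustration: a small map of file extensions to MIME types.
locals {
  mime_types = {
    ".html" = "text/html"
    ".css"  = "text/css"
    ".js"   = "application/javascript"
    ".json" = "application/json"
  }
}

# aws_s3_object replaces the deprecated aws_s3_bucket_object (AWS provider v4+).
resource "aws_s3_object" "test" {
  for_each = fileset("./documents/", "**")
  bucket   = "test"
  key      = each.value
  source   = "./documents/${each.value}"
  etag     = filemd5("./documents/${each.value}")

  # Look up the content type from the file extension. try() guards files
  # with no extension; the null default lets S3 fall back to its own default.
  content_type = lookup(local.mime_types, try(regex("\\.[^.]*$", each.value), ""), null)
}
```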
