DEV Community


How Our Users Skip Our Servers and Download Securely Right From Amazon S3

Jade Ohlhauser ・ 5 min read

Our .NET web application allows users to attach files to things. We use the Amazon Simple Storage Service to store these files. We want users to be able to download a file directly from Amazon instead of passing it through our servers, but we also let our users control which files they can access. Just because a user can see a file today, doesn’t mean they can tomorrow. The solution is pre-signed URLs that we can send to the user’s browser to give them temporary read access to download the file directly from Amazon.

Screenshot

We only use pre-signed URLs for file download. File uploads go through our application so we can get metadata, record the user’s action, and generate a thumbnail if it’s an image.

Diagram
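For context, the upload side can be as simple as a single PutObject call from the same SDK. This is a minimal sketch, not the actual application code; the bucket name and key are placeholders, and the metadata/thumbnail steps mentioned above are elided:

```csharp
using Amazon.S3;
using Amazon.S3.Model;

static class UploadSketch
{
    // Minimal sketch of the upload path: the file passes through our server,
    // which lets us record metadata before handing the bytes to S3.
    // The bucket name and key below are placeholders, not from the post.
    public static void Upload(IAmazonS3 s3Client, string localPath)
    {
        var request = new PutObjectRequest
        {
            BucketName = "my-bucket",    // placeholder bucket
            Key = "attachments/123",     // index-style key; the real file name stays in our DB
            FilePath = localPath
        };

        s3Client.PutObject(request);     // synchronous PutObject from AWSSDK.S3
    }
}
```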

Packages

In Visual Studio go to Tools > NuGet Package Manager > Manage NuGet Packages for Solution… Add the Amazon packages AWSSDK.Core and AWSSDK.S3 to your project.
AWS Packages in VS

Settings

Amazon will need some settings for your account. We use a Web.config; your application may use an App.config file. The AWS SDK picks these values up automatically as long as they are present with the correct names; you don't have to add code to read them.

web.config <appSettings>
<add key="AWSAccessKey" value="xxxxxxxx" />
<add key="AWSSecretKey" value="xxxxxxxx" />
<add key="AWSRegion" value="us-west-1" />

You can find the AWSRegion code for your region in Amazon's list of region endpoints.
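If you prefer not to rely on appSettings, the same values can also be passed explicitly when constructing the client. A sketch, with placeholder credential strings:

```csharp
using Amazon;
using Amazon.S3;

// Sketch: constructing the client with explicit credentials and region
// instead of relying on the appSettings keys. The key strings are placeholders.
IAmazonS3 s3Client = new AmazonS3Client(
    "AKIA-placeholder",            // access key (placeholder)
    "secret-placeholder",          // secret key (placeholder)
    RegionEndpoint.USWest1);       // matches the AWSRegion setting above
```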

Methods

We separate the code into two methods. The S3 client generation is used by all our code that interacts with S3 including file upload, download, and pre-signed URL generation.

using Amazon.S3;
using Amazon.S3.Model;
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

namespace RpmBusinessLayer.Utility
{
    public class AmazonWebServices
    {
        private static IAmazonS3 s3Client;

        private const int READWRITE_TIMEOUT_SECONDS = 300;
        private const int CONNECTION_TIMEOUT_SECONDS = 3600;
        private const int MAX_RETRIES = 4;

        private IAmazonS3 GetS3Client()
        {
            if (s3Client == null)
                s3Client = new AmazonS3Client(new AmazonS3Config()
                {
                    SignatureVersion = "4",
                    Timeout = TimeSpan.FromSeconds(CONNECTION_TIMEOUT_SECONDS),
                    ReadWriteTimeout = TimeSpan.FromSeconds(READWRITE_TIMEOUT_SECONDS),
                    MaxErrorRetry = MAX_RETRIES
                });

            return s3Client;
        }

And then we have a method in that same file we use to generate the pre-signed URL for a given object.

        public string GeneratePreSignedURLForDownload(string keyName, string fileName)
        {
            try
            {
                GetPreSignedUrlRequest request = new GetPreSignedUrlRequest
                {
                    BucketName = S3Bucket,
                    Key = keyName,
                    Expires = DateTime.Now.AddMinutes(5)
                };
                request.ResponseHeaderOverrides.ContentDisposition = "attachment; filename=\"" + fileName + "\"";

                return GetS3Client().GetPreSignedURL(request);
            }
            catch (AmazonS3Exception ex)
            {
                throw CreateExceptionForS3Exception(ex);
            }
        }

Note the Expires property. That is how you let a user download a file now but not later. The generated link is independent of our system (the user could, for example, email it to someone), which is why it is important that it expires.

Redirect to the Download

Our URL expires in 5 minutes, but what if the user has the page open longer than that? We don't want them clicking a download link and getting an error. To solve this, we don't render the pre-signed URLs into the page. Instead we provide links to our FileDownload.aspx, which generates the S3 URL on demand and redirects the user to it.

FileAttachment fileAttachment = FileAttachment.Load(rpmUser, fileAttachmentID);

if (fileAttachment == null)
    TransferToMistakePage(Problem.FileDoesNotExist);

ResultAndMessage result = fileAttachment.CheckDownloadAccess(rpmUser, forType, forID);

if (result.Problem != Problem.NoProblem)
    TransferToMistakePage(result.Problem, result.Message);

MiscTools.GetHttpContext().Response.Redirect(fileAttachment.GetPresignedURLFromS3(rpmUser));

This also allows us to record the download event. GetPresignedURLFromS3 generates our path and returns the result of GeneratePreSignedURLForDownload, which is the pre-signed URL.
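The post doesn't show GetPresignedURLFromS3 itself, so here is a hypothetical sketch of its shape; S3KeyFor, RecordDownload, and DisplayFileName are assumed helpers and properties, not names from the actual codebase:

```csharp
// Hypothetical sketch of GetPresignedURLFromS3: build the S3 key for this
// attachment, log the download event, then delegate to the method shown
// earlier. S3KeyFor, RecordDownload, and DisplayFileName are assumptions.
public string GetPresignedURLFromS3(RpmUser rpmUser)
{
    string keyName = S3KeyFor(this);   // e.g. an index-based key like "123.xlsx"
    RecordDownload(rpmUser, this);     // audit the download event

    var aws = new AmazonWebServices();
    return aws.GeneratePreSignedURLForDownload(keyName, DisplayFileName);
}
```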

Make it Download as a Download

To make the browser handle the URL as a download, where the user can choose to save the file, the response needs to set the Content-Disposition and Content-Type headers. By default S3 returns the headers the object was stored with. That is fine for the type, but we want to set the disposition at download time: first, because the disposition controls the download behavior, which we want to vary depending on how we are delivering the file; second, for the filename, since we save files on S3 under an index filename and apply a custom filename at download.

So, to deliver that example XLSX file as a download:
A download in Chrome

The response must include this header:

response-content-disposition:attachment; filename="0001 Access Records.xlsx"

Which we get from a URL like

https://s3.amazonaws.com/xxxxxxx/123.xlsx?AWSAccessKeyId=xxxxxxx&Expires=1494967927&response-content-disposition=attachment%3B%20filename%3D%220001%20Access%20Records.xlsx%22&Signature=xxxxxxx

The custom response header that you want S3 to use is passed in the request to S3. Because the request is signed, you can't just add the header to the URL; it needs to be included in the signing. This is the important line from GeneratePreSignedURLForDownload:

request.ResponseHeaderOverrides.ContentDisposition = "attachment; filename=\"" + fileName + "\"";

Firefox requires the filename to be in quotes. Chrome doesn't care.
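One detail worth guarding against, as a hypothetical hardening not present in the original code: if the file name itself can contain a double quote, the string concatenation above produces a malformed header value. A small helper along these lines keeps the quoted string intact:

```csharp
static class DownloadHeaders
{
    // Hypothetical helper (not from the original post): escape any double
    // quotes in a user-supplied file name so the Content-Disposition header
    // value stays a well-formed HTTP quoted-string in every browser.
    public static string ContentDispositionFor(string fileName)
    {
        string safeName = fileName.Replace("\"", "\\\"");
        return "attachment; filename=\"" + safeName + "\"";
    }
}
```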

Make it Work on Newer Regions

The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.

So your pre-signed URL code is working fine in one region, but when you go to use one of the URLs in a different region you get the authorization error above. The problem is that some older S3 regions allow an older signature-signing method, while newer regions require the new one. There are two potentially confusing wrinkles to this:

  1. You may have been working with files in the newer region just fine and only see this error when you start using pre-signed URLs. This caught us on deployment.
  2. It is not enough to instruct the S3 client to use SignatureVersion 4; you must also specify a region. This is not clearly documented. It was a problem for us because, instead of specifying a region, we were using ServiceURL.

The key line is in GetS3Client, not GetPreSignedUrl.

SignatureVersion = "4",

And the region in the web.config

<add key="AWSRegion" value="us-west-1" />

All regions support Signature Version 4, so you should always use it; then your code will work in any region.
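If you'd rather pin the region in code instead of (or in addition to) web.config, AmazonS3Config also accepts a RegionEndpoint directly. A sketch:

```csharp
using Amazon;
using Amazon.S3;

// Sketch: specifying the region in code. AmazonS3Config.RegionEndpoint
// plays the same role as the AWSRegion appSetting.
var config = new AmazonS3Config
{
    SignatureVersion = "4",
    RegionEndpoint = RegionEndpoint.USWest1   // required for SigV4 pre-signing
};
IAmazonS3 s3Client = new AmazonS3Client(config);
```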

In Closing

Allowing our users to download files directly from S3 provides better download performance for our users and saves on file transfer costs for us. Feel free to reach out to me on Twitter. You can learn more about our application at RPM Software.

[Update 2017-05-18] Added the important section "Redirect to the Download".
[Update 2017-09-19] Fixed formatting due to changes to rendering of inline code

Discussion (3)

Pablo Díaz Márquez

Great post. It was helpful as a download workflow.

Alexey Zimarev

Just for reference, a similar thing can be done with Azure Blob Storage using shared access signatures: docs.microsoft.com/en-us/azure/sto...

Adrian B.G.

Good find, in the bazillions of AWS documentation files.