Adam Crampton

Large file uploads to an S3 bucket done neatly in Laravel

Lately I've been looking at ways to reduce the amount of clutter in my Laravel controllers, and found that macros can be super useful for wrapping reusable snippets of code behind an easily accessible facade.

If you'd like to learn more about Laravel macros, here's a great primer, with a list of "Macroable" classes:
https://tighten.co/blog/the-magic-of-laravel-macros
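
If you haven't used macros before, here's a throwaway sketch of the idea (the shout name is made up purely for illustration):

use Illuminate\Support\Str;

// Register a macro on the Str helper, which is "Macroable".
Str::macro('shout', function ($value) {
    return strtoupper($value) . '!';
});

Str::shout('hello'); // "HELLO!"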

Dealing with large files

The best way to reliably upload large files to an S3 bucket in Laravel is with the Flysystem S3 adapter, specifically its writeStream and putStream methods, which stream the file up to S3 rather than reading the whole thing into memory.

Special note: You are almost certainly going to need to tweak the host's php.ini configuration, specifically:

  • post_max_size
  • upload_max_filesize
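
For example, to accept uploads of a couple of gigabytes you might set something like the following (the exact values are just illustrative; post_max_size should be at least as large as upload_max_filesize):

; php.ini
post_max_size = 2048M
upload_max_filesize = 2048M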

Setting up

The first thing I would recommend, if you don't do this already, is to set up a Service Provider specifically for your macros. To do this:

  • Run php artisan make:provider MacroServiceProvider
  • Add App\Providers\MacroServiceProvider::class to your project's app.php config file
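
On an older Laravel project where providers live in config/app.php (as assumed here), that registration looks roughly like this:

// config/app.php
'providers' => [
    // ... framework and package providers ...
    App\Providers\MacroServiceProvider::class,
],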

Next, install the Flysystem AWS S3 Adapter:
composer require league/flysystem-aws-s3-v3
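
One caveat: the macro below is written against the 1.x adapter (the AwsS3Adapter class and the putStream method), so if your project would otherwise resolve a newer major version, you may want to pin it:

composer require league/flysystem-aws-s3-v3:^1.0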

Finally, you'll want to ensure these four values are set in your project's .env file:

  • AWS_ACCESS_KEY=YOURACCESSKEY
  • AWS_SECRET_ACCESS_KEY=YOURSECRETACCESSKEY
  • AWS_REGION=aws-bucket-region
  • AWS_S3_BUCKET=name.of.s3.bucket
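
One thing to double-check: the macro reads these values through config('filesystems.disks.s3'), so the env() calls in your config/filesystems.php must reference the same variable names (recent stock installs use AWS_ACCESS_KEY_ID, AWS_DEFAULT_REGION and AWS_BUCKET instead). Assuming the names above, the s3 disk entry would look roughly like this:

// config/filesystems.php
's3' => [
    'driver' => 's3',
    'key'    => env('AWS_ACCESS_KEY'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_REGION'),
    'bucket' => env('AWS_S3_BUCKET'),
],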

Create the macro

For this macro, we are going to extend the built-in File component so we can use its facade.

Within App\Providers\MacroServiceProvider, let's add a closure to the boot method with four parameters:

  • path: Path within the bucket to save the file
  • filename: Filename you want to save as
  • file: The file object from $request->file('input_name')
  • overWrite: Whether or not you want to overwrite the file if a file with the same name already exists

Here's the code:

<?php

namespace App\Providers;

use Aws\S3\S3Client;
use Illuminate\Support\ServiceProvider;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\File;
use League\Flysystem\AwsS3v3\AwsS3Adapter;
use League\Flysystem\Filesystem;

class MacroServiceProvider extends ServiceProvider
{
    /**
     * Bootstrap services.
     *
     * @return void
     */
    public function boot()
    {
        File::macro('streamUpload', function ($path, $fileName, $file, $overWrite = true) {
            // Open a read-only stream to the uploaded file on local disk.
            $resource = fopen($file->getRealPath(), 'r');

            // Set up the S3 connection using the s3 disk settings from config/filesystems.php.
            $config = Config::get('filesystems.disks.s3');
            $client = new S3Client([
                'credentials' => [
                    'key'    => $config['key'],
                    'secret' => $config['secret'],
                ],
                'region'  => $config['region'],
                'version' => 'latest',
            ]);

            // The third argument is a path prefix, so files land under $path in the bucket.
            $adapter = new AwsS3Adapter($client, $config['bucket'], $path);
            $filesystem = new Filesystem($adapter);

            // putStream overwrites an existing object; writeStream fails if it already exists.
            $result = $overWrite
                    ? $filesystem->putStream($fileName, $resource)
                    : $filesystem->writeStream($fileName, $resource);

            // Close the local file handle if Flysystem hasn't already done so.
            if (is_resource($resource)) {
                fclose($resource);
            }

            return $result;
        });
    }
}


Update your controller

Now all you need to do is grab the file from the request, set your options, and call the macro:

// Set file attributes.
$filepath = 'upload/destination/files';
$file = $request->file('uploaded-file');
$filename = $request->input('name'); // Hidden input with a generated value

// Upload to S3, overwriting if the filename already exists.
File::streamUpload($filepath, $filename, $file, true);

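
If it helps to see that in context, here's a minimal controller sketch; UploadController, the store action, and the uploaded-file / name input names are placeholders for illustration:

<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\File;

class UploadController extends Controller
{
    public function store(Request $request)
    {
        // Make sure a file and a target name actually arrived with the request.
        $request->validate([
            'uploaded-file' => 'required|file',
            'name'          => 'required|string',
        ]);

        // Stream the upload straight through to S3 via the macro.
        File::streamUpload(
            'upload/destination/files',
            $request->input('name'),
            $request->file('uploaded-file'),
            true // Overwrite if an object with the same name exists.
        );

        return back()->with('status', 'File uploaded.');
    }
}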

Aaand that's it! Happy uploading!

Oldest comments (5)

Jasper Frumau

Thanks a lot for this, Adam. Going to try this out with Spatie Laravel Backups being sent to Digital Ocean Spaces. Any tips for combining multiple PUT/POST objects into one key to be sent serialized? That would be to avoid rate limiting from making many requests, on top of the rate limiting we sometimes hit now from overly long requests with large files, which your macro may remedy.

Adam Crampton

Sorry for the slow response!

Personally I like combining data and passing JSON between endpoints, as Laravel provides pretty good tools for dealing with this.

e.g. You can easily convert a model collection to JSON, then have it reconstructed back to a collection object of that model class using Laravel's hydrate method (really handy when passing data back and forth between APIs, Redis, etc).

Hope that at least sort-of answers your question :)

Jasper Frumau

Thanks Adam. Models can be converted to JSON with ease, true. I was more looking into having several images added (PUT) and/or loaded (GET) from object storage. It seems I need to combine them and store them as JSON, with the images as base64 perhaps, in one large object, and then pull it in and split it again to load them. Perhaps an object array of images tied to one key. Just not enough experience yet.

Not sure if that is the way to go anymore though, so I have gone back to block storage using Digital Ocean volumes instead of Spaces. I've been fighting rate limiting when PUTing image files (200 requests per second max / 150 GB per 24 hrs) and retrieving/GETing them for a while now, and decided to move back to server and/or volume storage, for now at least.

If you do know of ways to store images on object storage without surpassing rate limits, whether S3's limitations or those of Digital Ocean Spaces, do let me know.

Thanks.

Ahmed Faisel

Morning Adam,
does this solution upload the file to the server first and then to the Amazon server?
If not, this will keep throwing errors like:
Allowed memory size of XXX bytes exhausted (tried to allocate XXXXX bytes).

I am searching for a solution that fixes this error without upgrading the memory size.

Adam Crampton

Hi there, you may want to try increasing the memory_limit setting in the server's php.ini file to get around this, if that's an option for you.

I'm not sure using a regular upload (as opposed to a stream, shown in this article) will cure the memory issue.

Good luck!