Apollo-server-express File Upload on Local Storage and on AWS S3

Summary: We will be uploading files to local storage and to an AWS S3 bucket, using express, apollo-server-express and aws-sdk.

File upload is fundamental to web apps, yet there is not much detailed material on file uploads with apollo-server-express for beginners and intermediate developers. So, to anyone reading this post: if you do not understand some of the code, please feel free to comment and I will get back to you as soon as possible. Now, let us write some code...

First initialize package.json in your working directory using:

npm init -y

Now let us install some modules.

npm install express apollo-server-express dotenv nodemon uuid aws-sdk

If you are not familiar with some of the modules, you can google the module name and learn from its documentation.

Open package.json and update it as follows:

"main": "app.js",
  "scripts": {
    "start": "node app.js",
    "dev": "nodemon app.js"
  }
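With these scripts in place, you can start the server in watch mode, so nodemon restarts it on every file change:

npm run dev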

As you can see, app.js is my server file, but you can name it anything you want.
If your "main" property is app.js, then go ahead and create a file named app.js in your working directory. Also create a .env file to store some environment variables. Now paste the following code inside app.js (it imports the schema files typedefs.js and resolvers.js, which we will create shortly):

require('dotenv').config(); // load .env first, before any module reads process.env
const express = require('express');
const { ApolloServer } = require('apollo-server-express');
const typeDefs = require('./schema/typedefs');
const resolvers = require('./schema/resolvers');

const app = express();
const server = new ApolloServer({
    typeDefs,
    resolvers
});
// mounts the GraphQL endpoint (default: /graphql) on the express app
server.applyMiddleware({ app });

const PORT = process.env.PORT || 5000;
app.listen(PORT, () => {
    console.log(`Server started on ${PORT}...`);
});

This will get our server up and running on port 5000.
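A side note for readers on newer runtimes (see the comments at the end of this post): the upload handling bundled with older apollo-server-express releases is known to break on Node versions above 12. A common workaround, sketched here on the assumption that you install the standalone graphql-upload package (a CommonJS release such as v11), is to disable the bundled handling and mount graphql-upload's express middleware yourself:

const { graphqlUploadExpress } = require('graphql-upload');

const server = new ApolloServer({
    typeDefs,
    resolvers,
    uploads: false // turn off the outdated upload handling bundled with Apollo
});

// parse multipart upload requests before Apollo handles them
app.use(graphqlUploadExpress({ maxFileSize: 10000000, maxFiles: 5 }));
server.applyMiddleware({ app });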
Now configure your AWS S3 bucket to obtain an access key ID, secret access key, region and bucket name. Then create a directory named config, create a file named s3.js inside it, and copy in the code below:
s3.js

const aws = require('aws-sdk');

// Credentials, region and default bucket all come from .env (loaded in app.js)
const s3 = new aws.S3({
    credentials: {
        accessKeyId: process.env.ACCESS_KEY_ID,
        secretAccessKey: process.env.SECRET_ACCESS_KEY
    },
    region: process.env.REGION,
    params: {
        ACL: 'public-read', // uploaded objects will be publicly readable
        Bucket: process.env.AWS_BUCKET
    }
});

module.exports = s3;

Here, all the environment variables are kept inside the .env file.
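For reference, this is the shape of the .env file the code above expects; the values here are placeholders, so substitute your own from the AWS console:

ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
SECRET_ACCESS_KEY=your-secret-access-key
REGION=us-east-1
AWS_BUCKET=your-bucket-name
PORT=5000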

Now create a directory named schema, create two files in it named typedefs.js and resolvers.js, and paste in the following code:
typedefs.js

const { gql } = require('apollo-server-express');

const typedefs = gql`
    type Query {
        uploadedFiles: [File]
    }
    type Mutation {
        singleUploadLocal(file: Upload!): File
        multipleUploadLocal(files: [Upload]!): [File]
        singleUploadS3(file: Upload!): File
        multipleUploadS3(files: [Upload]!): [File]
    }
    type File {
        success: Boolean!
        message: String!
        mimetype: String
        encoding: String
        filename: String
        location: String
    }
`;
module.exports = typedefs;

resolvers.js

const fs = require('fs');
const { v4: uuid } = require('uuid');
const s3 = require('../config/s3');

// Make sure the local upload target exists; otherwise fs.createWriteStream
// fails with ENOENT and the request never resolves
if (!fs.existsSync('uploads')) fs.mkdirSync('uploads');

// Streams one uploaded file to local disk under uploads/
const processUpload = async (file) => {
    const { createReadStream, mimetype, encoding, filename } = await file;
    const path = 'uploads/' + uuid() + filename;
    const stream = createReadStream();
    return new Promise((resolve, reject) => {
        stream
            .pipe(fs.createWriteStream(path))
            .on('finish', () => {
                resolve({
                    success: true,
                    message: 'Successfully Uploaded',
                    mimetype, filename, encoding, location: path
                });
            })
            .on('error', (err) => {
                console.log('Error Event Emitted', err);
                reject({
                    success: false,
                    message: 'Failed'
                });
            });
    });
};

// Streams one uploaded file straight to the S3 bucket configured in config/s3.js
const processUploadS3 = async (file) => {
    const { createReadStream, mimetype, encoding, filename } = await file;
    const stream = createReadStream();
    // s3.upload accepts a readable stream as Body; .promise() resolves with
    // the upload result, including the object's public URL (Location)
    const { Location } = await s3.upload({
        Body: stream,
        Key: `${uuid()}${filename}`,
        ContentType: mimetype
    }).promise();
    return new Promise((resolve, reject) => {
        if (Location) {
            resolve({
                success: true, message: 'Uploaded', mimetype, filename,
                location: Location, encoding
            });
        } else {
            reject({
                success: false, message: 'Failed'
            });
        }
    });
};

const resolvers = {
    Mutation: {
        singleUploadLocal: async (_, args) => {
            return processUpload(args.file);
        },
        multipleUploadLocal: async (_, args) => {
            // each entry in args.files is a promise for one upload
            return (await Promise.all(args.files)).map(processUpload);
        },
        singleUploadS3: async (_, args) => {
            return processUploadS3(args.file);
        },
        multipleUploadS3: async (_, args) => {
            return (await Promise.all(args.files)).map(processUploadS3);
        }
    }
};

module.exports = resolvers;
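If you take the standalone graphql-upload route sketched earlier, you also need to wire its scalar into this resolver map, since the schema declares an Upload scalar; a minimal sketch:

const { GraphQLUpload } = require('graphql-upload');

const resolvers = {
    Upload: GraphQLUpload, // maps the Upload scalar in typedefs.js
    Mutation: {
        // ...the mutations above, unchanged
    }
};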

Now to test it out, I am using a Chrome extension called Altair that lets me easily attach files to my GraphQL mutations.

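For reference, here is a minimal version of the mutation I run in Altair, written against the typedefs above; the file itself is attached as the $file variable:

mutation ($file: Upload!) {
  singleUploadS3(file: $file) {
    success
    message
    filename
    location
  }
}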

You can find the code above in my github repo:
https://github.com/kingmaker9841/apollo-multiple-upload

Have a great day!

Top comments (3)

Erickson Manuel Holguín

Great, it works almost perfectly for me. I had been researching the subject for a long time and had only achieved this feat with the "apollo-server" package; I had it largely done, but with your example I was able to complete the puzzle. Let's say I mixed part of your code with mine in app.js, resolvers.js and typedefs.js, and I also had to install the "graphql-upload" module. In the screenshots I framed in red boxes the things that I changed and had to take into account:
dev-to-uploads.s3.amazonaws.com/i/...
dev-to-uploads.s3.amazonaws.com/i/...
dev-to-uploads.s3.amazonaws.com/i/...
dev-to-uploads.s3.amazonaws.com/i/...

ofspain

A great tutorial, and I followed every bit of it. But my issue is that the upload fails to complete: on Altair, it just keeps displaying a loading state forever, and the uploaded images are stored like empty text files in the uploads folder, though the extension and file name are correct. I will appreciate any help on this.

himanshutecstub

It's not working for Node versions > 12.