A comprehensive guide on how to upload files with Apollo Server 2.0 and MongoDB.
...
Prerequisites
- Altair (recommended alternative to the default GraphQL playground)
- Node.js installed on your machine
File Uploads have an interesting history in the Apollo ecosystem.
With Apollo Server 2.0, you can perform file uploads right out of the box. Apollo Server ships with the ability to handle multipart requests that contain file data. This means you can send a mutation to Apollo Server containing a file, pipe it to the filesystem, or pipe it to a cloud storage provider instead.
How you set up file uploads will differ depending on your problem domain and use case. Handling multipart upload requests in GraphQL can be painful, especially when you're coming from a REST background like me. In this guide, I'm going to show you how to upload files with Apollo Server 2.0.
One of the simplest ways of achieving file uploads in a single request is to base64-encode a file and send it as a String variable in a mutation.
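A minimal sketch of that base64 approach (no multipart request involved): encode the file bytes on the client, send the result as a String variable in the mutation, and decode on the server. It's simple, but it inflates the payload by roughly 33%, so it only suits small files.

```javascript
// Sketch of the base64 round trip; the file contents are a stand-in.
const fileBytes = Buffer.from("hello upload"); // pretend this came from a file
const encoded = fileBytes.toString("base64"); // goes into the mutation variables
// Server side: decode the String variable back into bytes
const decoded = Buffer.from(encoded, "base64").toString("utf8");
console.log(decoded); // "hello upload"
```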
How it works
The upload functionality follows the GraphQL multipart form request specification. Two parts are needed to make uploads work correctly: the client and the server.
The client: On the client, file objects are mapped into a mutation and sent to the server in a multipart request.
The server: The multipart request is received. The server processes it and provides an upload argument to a resolver. In the resolver function, the upload promise resolves to an object.
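To make the client side concrete, here is a sketch of the two JSON form fields the client places in the multipart request, per the GraphQL multipart request spec. The mutation shape mirrors the uploadFile mutation we build later in this guide; the field names operations, map, and "0" come from the spec itself.

```javascript
// "operations" holds the query and variables, with null where the file goes;
// "map" tells the server which form field ("0") fills which variable path.
const operations = JSON.stringify({
  query: "mutation ($file: Upload!) { uploadFile(file: $file) { id path } }",
  variables: { file: null }, // placeholder, filled in via the map below
});
const map = JSON.stringify({ 0: ["variables.file"] });

// A third form field named "0" would carry the actual file bytes.
console.log(JSON.parse(map)["0"][0]); // "variables.file"
```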
Our project structure
├── images
│   └── 9A1ufNLv-bg-works.jpg
├── package.json
└── src
    ├── db.js
    ├── fileModel.js
    ├── index.js
    ├── resolvers.js
    └── typeDefs.js
Let's Begin 🚀
We will start off by initializing our project with npm, installing the necessary packages, and configuring our server.
npm init -y
yarn add esm apollo-server graphql mongoose shortid
yarn add -D nodemon
Here's what each package is for: apollo-server and graphql power the GraphQL server, mongoose talks to MongoDB, shortid generates unique IDs for our file names, esm lets us use ES module syntax without configuring Babel, and nodemon restarts the server on file changes.
The next step is to set up our server with Apollo and Mongoose. Create a db.js file in your /src directory and add the following configuration code to connect to your MongoDB database:
import mongoose from "mongoose";

const MONGO_CONNECTION = "mongodb://localhost:27017/fileUploads";

export default (async function connect() {
  try {
    await mongoose.connect(MONGO_CONNECTION, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
    });
  } catch (err) {
    console.error(err);
  }
})();
Now create an index.js file in your /src directory and paste the following code to start your server on http://localhost:4000:
import { ApolloServer } from "apollo-server";
import typeDefs from "./typeDefs";
import resolvers from "./resolvers";
// Import your database configuration
import connect from "./db";

export default (async function () {
  try {
    await connect;
    console.log("Connected 🚀 To MongoDB Successfully");
    const server = new ApolloServer({
      typeDefs,
      resolvers,
    });
    server.listen(4000, () => {
      console.log(`🚀 server running @ http://localhost:4000`);
    });
  } catch (err) {
    console.error(err);
  }
})();
Next we'll create our resolvers and typeDefs and put them in separate files:
// src/typeDefs.js
import { gql } from "apollo-server";

export default gql`
  type Query {
    hello: String
  }
`;

// src/resolvers.js
export default {
  Query: {
    hello: () => "Hello world",
  },
};
Lol 😅 that's just a simple Hello world query.
Now add a dev script to your package.json file so we can start up our server. The -r (require) flag preloads the esm module, which is why we've been able to use ES6 import syntax without configuring Babel.
esm is the world's most advanced ECMAScript module loader. This fast, production ready, zero dependency loader is all you need to support ECMAScript modules in Node 6+.
// package.json
{
  "name": "apollo-upload",
  "main": "src/index.js",
  "scripts": {
    "dev": "nodemon -r esm src/index.js"
  },
  "dependencies": {
    "apollo-server": "^2.11.0",
    "graphql": "^14.6.0",
    "mongoose": "^5.9.4",
    "esm": "^3.2.25",
    "shortid": "^2.2.15"
  },
  "devDependencies": {
    "nodemon": "^2.0.2"
  }
}
yarn dev
#or
npm run dev
We can see that our server is running on http://localhost:4000. Let's test our Hello world query in our GraphQL playground.
For server integrations that support file uploads (e.g. Express, hapi, Koa), Apollo Server enables file uploads by default. To enable file uploads, reference the Upload type in the schema passed to the Apollo Server constructor.
Now your typeDefs file should look exactly like this:
// src/typeDefs.js
import { gql } from "apollo-server";

export default gql`
  type File {
    id: ID!
    filename: String!
    mimetype: String!
    path: String!
  }

  type Query {
    hello: String
    files: [File!]
  }

  type Mutation {
    uploadFile(file: Upload!): File
  }
`;
Note: When using typeDefs, Apollo Server adds scalar Upload to your schema, so any existing declaration of scalar Upload in the type definitions should be removed. If you create your schema with makeExecutableSchema and pass it to the ApolloServer constructor using the schema param, make sure to include scalar Upload yourself.
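For the makeExecutableSchema route, here is a sketch of what the schema string would contain. It's a plain string here so the snippet stays self-contained; in a real project you would wrap it in the gql tag and pass it to makeExecutableSchema.

```javascript
// Sketch: a schema that declares the Upload scalar explicitly, as required
// when you build the schema yourself instead of passing typeDefs.
const typeDefs = `
  scalar Upload

  type File {
    id: ID!
    filename: String!
    mimetype: String!
    path: String!
  }

  type Mutation {
    uploadFile(file: Upload!): File
  }
`;
console.log(typeDefs.includes("scalar Upload")); // true
```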
The server is going to return a promise that resolves to an object. The object contains the following:
- createReadStream: Returns a read stream for streaming the file(s) to a filesystem or any storage location of your choice.
- filename: The original name of the uploaded file(s).
- mimetype: The MIME type of the file(s), such as image/jpeg or application/json.
- encoding: The file encoding, e.g. UTF-8.
Now we are going to create a function that will process our file and pipe it into a directory.
// src/resolvers.js
import shortid from "shortid";
import { createWriteStream, mkdirSync } from "fs";

const storeUpload = async ({ stream, filename, mimetype }) => {
  const id = shortid.generate();
  const path = `images/${id}-${filename}`;
  // (createWriteStream) writes our file to the images directory
  return new Promise((resolve, reject) =>
    stream
      .pipe(createWriteStream(path))
      .on("finish", () => resolve({ id, path, filename, mimetype }))
      .on("error", reject)
  );
};

const processUpload = async (upload) => {
  const { createReadStream, filename, mimetype } = await upload;
  const stream = createReadStream();
  const file = await storeUpload({ stream, filename, mimetype });
  return file;
};

export default {
  Query: {
    hello: () => "Hello world",
  },
  Mutation: {
    uploadFile: async (_, { file }) => {
      // Create an images folder in the root directory (synchronously,
      // so the directory exists before we start streaming into it)
      mkdirSync("images", { recursive: true });
      // Process upload
      const upload = await processUpload(file);
      return upload;
    },
  },
};
For the demo below I'm going to use Altair, a GraphQL playground that handles file uploads very well.
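In Altair you attach the file to the $file variable and run an operation like this (the operation name is illustrative; the fields match our File type):

```graphql
mutation UploadFile($file: Upload!) {
  uploadFile(file: $file) {
    id
    filename
    mimetype
    path
  }
}
```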
Saving to the database (MongoDB)
We used the file system to handle our file uploads for the following reasons:
Performance can be better than storing files in the database. If you store large files in the DB, a simple query to retrieve the list of files or filenames may also load the file data if you use SELECT * in your query. In a file system, accessing a file is simple and lightweight.
Saving files to the file system and downloading them is much simpler than in a database: a simple "Save As" will do, and downloading is just a matter of addressing a URL that points to the saved file.
Migrating the data is an easy process: copy and paste the folder to your desired destination, ensuring write permissions at the destination.
In a future post I'm going to show you how to query the files from our images directory through the file path stored in the database.
We are going to create our database schema and save it in a src/fileModel.js file. Your code should look like this:
// src/fileModel.js
import { Schema, model } from "mongoose";

const fileSchema = new Schema({
  filename: String,
  mimetype: String,
  path: String,
});

export default model("File", fileSchema);
The next step is to make use of our file schema.
Your src/resolvers.js code should look like this:
// src/resolvers.js
import shortid from "shortid";
import { createWriteStream, mkdirSync } from "fs";
// import our model
import File from "./fileModel";

const storeUpload = async ({ stream, filename, mimetype }) => {
  const id = shortid.generate();
  const path = `images/${id}-${filename}`;
  return new Promise((resolve, reject) =>
    stream
      .pipe(createWriteStream(path))
      .on("finish", () => resolve({ id, path, filename, mimetype }))
      .on("error", reject)
  );
};

const processUpload = async (upload) => {
  const { createReadStream, filename, mimetype } = await upload;
  const stream = createReadStream();
  const file = await storeUpload({ stream, filename, mimetype });
  return file;
};

export default {
  Query: {
    hello: () => "Hello world",
  },
  Mutation: {
    uploadFile: async (_, { file }) => {
      mkdirSync("images", { recursive: true });
      const upload = await processUpload(file);
      // save our file to mongodb
      await File.create(upload);
      return upload;
    },
  },
};
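One loose end: our typeDefs declare a files query, but we never wired up a resolver for it. Here is a minimal sketch of that resolver. File below is a hypothetical in-memory stand-in for the mongoose model so the snippet is self-contained; with the real model, the resolver body would simply be File.find({}).

```javascript
// Hypothetical stand-in for the mongoose File model
const File = {
  find: async () => [
    { filename: "9A1ufNLv-bg-works.jpg", mimetype: "image/jpeg", path: "images/9A1ufNLv-bg-works.jpg" },
  ],
};

const resolvers = {
  Query: {
    // The `files` query declared in typeDefs returns every stored record
    files: () => File.find({}),
  },
};

// Usage: the query `{ files { filename } }` would invoke this resolver
resolvers.Query.files().then((files) => console.log(files[0].filename));
```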
Complete code: https://github.com/DNature/apollo-upload
Now you understand how file uploads work in Apollo Server 2.0. I hope to see you next time 😀.
You can also check out part 2, where you will learn how to upload files to the server we've built.
Check out some blog posts I've written on my website.