In one of my recent projects, I was working on this really cool thing called an Order Management System (OMS). Basically, it's this centralized platform that helps NGOs automate all sorts of stuff - like managing their programs, applications, staff, students, volunteers, emails, files, and forms.
Anyway, at one point in the project, I had to figure out how to handle file uploads to Cloudinary for a particular endpoint. My initial solution worked, but it was a little slow - it took a while to upload five files at a time. As a developer, I knew I could do better, so I came up with a new solution that works much faster.
If you're using Multer, a handy middleware for handling file uploads in Node.js, here's how you can use it to upload a single file to Cloudinary:
import cloudinary from "./cloudinary";

const { upload } = cloudinary.uploader;

const uploadFile = async (file: Express.Multer.File) => {
  // file.path is set when Multer uses disk storage (e.g. multer({ dest: "uploads/" }))
  const uploadedFile = await upload(file.path, {
    folder: "/upload-folder/files",
    public_id: file.fieldname,
    overwrite: true,
  });
  return uploadedFile.secure_url;
};
Once you've uploaded a file, you can save the result in a database. To do this, use the file's fieldname (e.g. "profileImage") as the key, and the secure_url as the value.
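For instance, given a file from req.files whose fieldname is "profileImage", the record you save could be built like this. This is just a sketch - how you actually persist it depends on your database:

const secureUrl = await uploadFile(file);

// Field name as the key, secure URL as the value,
// e.g. { profileImage: "https://res.cloudinary.com/..." }
const record = { [file.fieldname]: secureUrl };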
However, sometimes you might need to upload multiple files at once. For example, if you have three file upload inputs on a page, and one of them lets the user upload up to three files, you'll have to handle a total of five files. In that case, you can use the name attribute of each input to identify which files belong together: if the input that accepts three files is named "supportingDocuments", and the other two inputs are named "profileImage" and "resume", you can use those names to keep track of which files were uploaded where.
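Here's one way the Multer side of this might be wired up. It's a sketch, and the route path and storage configuration are assumptions: upload.any() puts every file in a single array on req.files, and disk storage gives each file a path property (which the upload helper above relies on), while memory storage gives a buffer instead.

import express from "express";
import multer from "multer";

const app = express();

// Disk storage: each uploaded file gets a temporary `path` on disk
const upload = multer({ dest: "uploads/" });

// upload.any() accepts files from any field and exposes them as an array on req.files
app.post("/applications", upload.any(), (req, res) => {
  console.log(req.files); // an array shaped like the example below
  res.sendStatus(200);
});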
Here's an example of what req.files might look like, given the three file inputs mentioned above:
[
  {
    "fieldname": "supportingDocuments",
    "originalname": "document1.pdf",
    "encoding": "7bit",
    "mimetype": "application/pdf",
    "size": 1234567,
    "buffer": "<Buffer 25 50 44 46 2d 31 2e 33 ... >"
  },
  {
    "fieldname": "supportingDocuments",
    "originalname": "document2.jpg",
    "encoding": "7bit",
    "mimetype": "image/jpeg",
    "size": 345678,
    "buffer": "<Buffer ff d8 ff e0 00 10 ... >"
  },
  {
    "fieldname": "supportingDocuments",
    "originalname": "document3.docx",
    "encoding": "7bit",
    "mimetype": "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
    "size": 987654,
    "buffer": "<Buffer 50 4b 03 04 0a 00 ... >"
  },
  {
    "fieldname": "profileImage",
    "originalname": "profile.jpg",
    "encoding": "7bit",
    "mimetype": "image/jpeg",
    "size": 234567,
    "buffer": "<Buffer ff d8 ff e0 00 10 ... >"
  },
  {
    "fieldname": "resume",
    "originalname": "resume.pdf",
    "encoding": "7bit",
    "mimetype": "application/pdf",
    "size": 876543,
    "buffer": "<Buffer 25 50 44 46 2d 31 2e 33 ... >"
  }
]
So, if we have three files uploaded in the "supportingDocuments" field, and one file each in the "profileImage" and "resume" fields, how can we upload these files and save the results in a database?
Here's how:
import { UploadApiResponse } from "cloudinary";
import cloudinary from "./cloudinary";

type Files =
  | Express.Multer.File[]
  | { [fieldname: string]: Express.Multer.File[] };

async function uploadFiles(
  files: Files
): Promise<{ [key: string]: string | string[] }> {
  // Only the array form (e.g. from upload.any()) is handled here
  if (!Array.isArray(files)) return {};

  const results: { [key: string]: string | string[] } = {};
  const fieldNames: string[] = [];
  const fieldNamesOccurrences: { [fieldname: string]: number } = {};

  // Start every upload without awaiting them one by one
  const uploads = files.map((file, i) => {
    fieldNames.push(file.fieldname);
    return cloudinary.uploader.upload(file.path, {
      folder: "/upload-folder/files",
      public_id: file.fieldname + i,
      overwrite: true,
    });
  });

  // Wait for all of the uploads to finish concurrently
  const uploadResults: UploadApiResponse[] = await Promise.all(uploads);

  for (let i = 0; i < fieldNames.length; i++) {
    const fieldName = fieldNames[i];
    const uploadedFile = uploadResults[i];

    fieldNamesOccurrences[fieldName] =
      (fieldNamesOccurrences[fieldName] || 0) + 1;

    if (fieldNamesOccurrences[fieldName] > 1) {
      // This field has more than one file: collect its URLs in an array
      if (Array.isArray(results[fieldName])) {
        (results[fieldName] as Array<string>).push(uploadedFile.secure_url);
      } else if (typeof results[fieldName] === "string") {
        results[fieldName] = [
          results[fieldName] as string,
          uploadedFile.secure_url,
        ];
      } else {
        results[fieldName] = uploadedFile.secure_url;
      }
    } else {
      // First (and possibly only) file for this field: store the URL as a string
      results[fieldName] = uploadedFile.secure_url;
    }
  }

  return results;
}
So, first of all, the uploadFiles function takes a single argument called files, which comes from req.files. This can either be an array of files or an object mapping field names to arrays of files (that's what the Files type describes); the function checks for this and only processes the array form, returning an empty object otherwise.
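If your route uses upload.fields() instead of upload.any(), req.files will be the object form. Here's a minimal sketch of flattening that shape into the array the function expects (the flattenFiles name is just for illustration):

// Turn { fieldname: File[] } into a flat File[] that uploadFiles can handle
const flattenFiles = (files: {
  [fieldname: string]: Express.Multer.File[];
}): Express.Multer.File[] => Object.values(files).flat();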
Next, we create a few empty containers to help us keep track of everything: an object called results, an array called fieldNames, and another object called fieldNamesOccurrences.
Then, for each file in the files array, we push the name of the field it belongs to into the fieldNames array and use the Cloudinary package to start uploading it; files.map collects the upload promises into the uploads array.
We then use Promise.all() to run all of the uploads concurrently and wait for them to finish, which is what makes this approach so much faster than uploading one file at a time. Once everything is uploaded, we get back an array of UploadApiResponse objects called uploadResults.
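To see where the speedup comes from, compare a sequential version with the concurrent one (using the same cloudinary import as above; this is just an illustration, and the actual timings depend on your network and Cloudinary):

// Sequential: each upload waits for the previous one to finish
const uploadSequentially = async (files: Express.Multer.File[]) => {
  const urls: string[] = [];
  for (const file of files) {
    const result = await cloudinary.uploader.upload(file.path);
    urls.push(result.secure_url);
  }
  return urls;
};

// Concurrent: all uploads start immediately, and we only wait for the slowest one
const uploadConcurrently = (files: Express.Multer.File[]) =>
  Promise.all(
    files.map(async (file) => {
      const result = await cloudinary.uploader.upload(file.path);
      return result.secure_url;
    })
  );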
Finally, we loop through the fieldNames array and count how many times each field name appears. If it appears more than once, we collect the uploaded file URLs in an array under that field name in the results object; if it only appears once, we simply store the URL as a string under that field name.
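With the five files from the req.files example above, the returned results object would look something like this (the URLs are placeholders):

{
  "supportingDocuments": [
    "https://res.cloudinary.com/.../supportingDocuments0",
    "https://res.cloudinary.com/.../supportingDocuments1",
    "https://res.cloudinary.com/.../supportingDocuments2"
  ],
  "profileImage": "https://res.cloudinary.com/.../profileImage3",
  "resume": "https://res.cloudinary.com/.../resume4"
}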
And that's pretty much it! The uploadFiles function uploads your files to Cloudinary and returns an object mapping each field name to the URL(s) of the file(s) uploaded under it.
Awesome, right? Once you get the results object back from the function, you can save it in your database however you want. This means you can upload to Cloudinary and persist the file URLs in your database really quickly.
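For example, if the endpoint creates an application record with Mongoose (an assumption - Application and the other fields are hypothetical, and app and upload are the ones from the earlier sketch), you could spread the results object straight into the document, since its keys already match the form's field names:

import { Application } from "./models/application"; // hypothetical Mongoose model

app.post("/applications", upload.any(), async (req, res) => {
  const results = await uploadFiles(req.files as Express.Multer.File[]);
  // e.g. { supportingDocuments: string[], profileImage: string, resume: string }
  const application = await Application.create({ ...req.body, ...results });
  res.status(201).json(application);
});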
That's all, folks! Like and leave your thoughts in the comment section. Feel free to follow me on Twitter.