Kinanee Samson

File Upload with Google Cloud Storage and Node.js

With data collection rates currently going through the roof, the chances that you will build an app requiring users to upload one or more files are also through the roof. There are many solutions to this problem: tons of services are focused on making this process as smooth as possible, with names like Firebase Storage, Supabase Storage, and Cloudinary.

However, in today's post we will consider Google Cloud Storage (GCS), a cloud storage service provided by Google via GCP. Spoiler alert: this is what powers Firebase Storage under the hood. Google Cloud Storage is a managed cloud service for storing unstructured data; you can store any amount of data and retrieve it as often as you like.

In today's post, we will see how to handle file uploads to Google Cloud Storage using Node.js. Here are the talking points:

• Project setup
• Building an Express server
• Processing uploaded files with Multer
• Handling file uploads with GCS

Project Setup

The first thing you need to do is head over to Google Cloud and register a new account if you do not already have one. (If you already have one, then you're part of the bandwagon of developers with unfinished projects.) Once you're signed in, navigate to Cloud Storage in your Google Cloud console and create a new bucket. Take note of the bucket's name; we'll use it later. If you prefer the command line, you can also create the bucket with the gcloud CLI, assuming you have it installed and are authenticated (the bucket name and location below are placeholders):
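gcloud storage buckets create gs://my-example-bucket --location=us-central1

Next, set up a Node.js project: open up your terminal, navigate into your projects directory, create a folder to serve as the current project directory, say gc_node, and navigate into the newly created folder.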

Now we need to generate a package.json file.

npm init -y

Now we are going to install some dependencies for the project with the following command.

npm i express multer @google-cloud/storage cors nodemon

These are the dependencies we'll need to get our server up and running. Now let's create a basic Express server, which will serve as the entry point for processing the files we upload to GCS. Inside the gc_node directory, let's make a new folder, src, which will house all of our working files.

gc_node
├── src
│   ├── index.js

Now let's open up our index file and edit it accordingly:

const express = require('express');
const cors = require('cors');

const app = express();

app.use(cors());

app.listen(3000, () => {
  console.log('server running on port 3000');
});

Let's add the following scripts to our package.json file.

{
  "scripts": {
    "dev": "nodemon ./src/index.js",
    "start": "node ./src/index.js"
  }
}

From the terminal, we need to run the dev command to serve our project locally:

npm run dev

You'll see that our project is running on port 3000 if everything is done correctly. Now let's set up a route for processing uploaded files.

//  gc_node/src/index.js

const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() });

// cont'd

app.post('/upload', upload.array('images'), async (req, res) => {
  const imageMimeTypes = ['image/png', 'image/jpeg', 'image/jpg', 'image/svg+xml'];

  const files = req.files;

  // Bail out with a 400 if any uploaded file is not an image
  for (const file of files) {
    if (!imageMimeTypes.includes(file.mimetype)) {
      return res.status(400).json({ message: 'Only images allowed!' });
    }
  }

  // more on here later

});

We have used the Multer middleware to parse any files sent along with the request, and we are expecting to receive an array of files under the images field. Because memoryStorage buffers every upload in RAM, it is worth capping what a client can send; here is a minimal sketch that swaps Multer's built-in limits option into the earlier upload definition (the 5 MB and 10-file caps are arbitrary values, not from the original setup):
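const upload = multer({
  storage: multer.memoryStorage(),
  limits: {
    fileSize: 5 * 1024 * 1024, // max bytes per file (arbitrary 5 MB cap)
    files: 10, // max number of files per request
  },
});

Next, we'll define a helper function that will process all uploaded files before we send them off to Google.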

//  gc_node/src/index.js

const crypto = require('crypto');

function bootstrapFile(file) {
  // Generate a random, collision-resistant key to use as the file name
  const key = crypto.randomBytes(32).toString('hex');

  // Grab the original file extension (the last dot-separated segment)
  const extension = file.originalname.split('.').pop();

  const uploadParams = {
    fileName: `${key}.${extension}`,
    Body: file.buffer,
  };

  return { uploadParams, key, extension };
}

// .... continued

In summary, the function above takes a file object, generates a unique key, creates a new filename from the key and the original extension, prepares the file data for upload, and returns all the information needed for the upload. Let's upload our file now:

//  gc_node/src/index.js

const path = require('path');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({
  projectId: process.env.GOOGLE_PROJECT_ID,
  keyFilename: path.join(__dirname, '../key.json'),
});

async function uploadFile(bucketName, buffer, destFileName) {
  // Reference the destination object inside the bucket
  const file = storage.bucket(bucketName).file(destFileName);

  // Upload the buffered file contents
  await file.save(buffer);

  const message = `file uploaded to ${bucketName}!`;

  // Generate a long-lived signed URL that allows read access
  const [publicUrl] = await file.getSignedUrl({
    expires: new Date('12/12/3020'),
    action: 'read',
  });

  return { message, publicUrl };
}

// cont'd

The code snippet above defines an asynchronous function called uploadFile that uploads a file to Google Cloud Storage (GCS) and returns a publicly accessible URL. First, we import the Storage class from the @google-cloud/storage library, which provides the functionality to interact with GCS. Then we create a new Storage instance using two configuration options: it retrieves the project ID from the environment variable process.env.GOOGLE_PROJECT_ID and uses path.join(__dirname, "../key.json") to specify the location of a service account key file. This key file is required for GCS authentication.

The uploadFile function accepts three parameters: the first is the name of the bucket we want to upload the file to, the second is the buffer that holds the content of the file we want to upload, and the third is the name to give the file. Inside the function, we create a reference to the file object within the specified bucket using destFileName. Then we call await file.save(buffer); this asynchronous call uploads the file data from the buffer to the GCS file object. Next, we create a success message indicating the file was uploaded to the specified bucket. The following line retrieves a publicly accessible URL for the uploaded file using the getSignedUrl method with two configuration options: expires sets an expiration date for the URL (here, the very distant date "12/12/3020"), and action specifies the allowed action on the URL, which is set to "read" for read access. Finally, we return an object containing the success message and the public URL.

One thing the snippet assumes is that process.env.GOOGLE_PROJECT_ID (and, later, process.env.BUCKET_NAME) are set. Here is a minimal sketch, not part of the original post, that loads them from a .env file with the dotenv package (you would install it with npm i dotenv):
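// Load environment variables at the very top of src/index.js.
// A hypothetical gc_node/.env file would contain placeholder values like:
//   GOOGLE_PROJECT_ID=your-project-id
//   BUCKET_NAME=your-bucket-name
require('dotenv').config();

With the environment variables in place, we can use uploadFile inside our route handler to upload each file to GCS.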

//  gc_node/src/index.js

// cont'd

app.post('/upload', upload.array('images'), async (req, res) => {
  // cont'd
  const files = req.files;
  const responses = [];

  // cont'd
  for (const file of files) {
    const {
      uploadParams: { Body },
      extension,
      key,
    } = bootstrapFile(file);

    const response = await uploadFile(
      process.env.BUCKET_NAME,
      Body,
      `photos/${key}.${extension}`
    );

    responses.push(response);
  }

  console.log(responses);
  return res.json(responses);
});

This route handler accepts multiple uploaded files, prepares the data for each one using the bootstrapFile function, and then uploads each file to GCS using the uploadFile function. Finally, it returns a JSON response containing information about each uploaded file. To try the endpoint out, you could send a multipart request with a tool like curl (the file names here are placeholders):
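curl -X POST http://localhost:3000/upload \
  -F "images=@photo.png" \
  -F "images=@logo.jpg"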

You can go ahead and use this the next time you want to set up uploads to GCS. What are your thoughts on the post? Do you personally use GCS for handling file storage, or have you used it at any point? What was the experience like, and what are your thoughts on using GCS? I would like to know all this and more, so leave your thoughts below in the comments section. I hope you found this useful, and I will see you in the next one.
