GCP Resumable Upload with React.js and Node.js

In this tutorial, we will learn about resumable uploads to Google Cloud Storage.

Goals

By the end of this tutorial, you’ll know:

  1. How to create a GCP bucket.
  2. How to upload large files using resumable uploads.

Prerequisites

You should have working knowledge of JavaScript and Node.js for this tutorial.

These days users routinely upload large files to storage buckets. If an error occurs during the upload, the user has to upload that large file all over again, which can be irritating. This is where resumable uploads come into the picture. Providers such as GCP and AWS let you resumably upload large files in chunks using something called a signed URL.

A signed URL is a URL that is only valid for a limited time; once it expires, the service provider rejects it automatically.

All the code is available at this link:
https://github.com/MOIN-AKHTAR/Resumable-Upload-To-GCP

BACKEND PART:

Before starting on our backend, we have to install a few dependencies.

  1. express (to set up our server).
  2. dotenv (to read environment variables from a .env file).
  3. cors (to resolve CORS issues).
  4. @google-cloud/storage (this SDK lets us work with the GCP bucket).
  5. nodemon (install as a dev dependency; it automatically restarts your server whenever your code changes).
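
If you are starting from scratch, all of these can be installed with npm:

npm install express dotenv cors @google-cloud/storage
npm install --save-dev nodemon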

In order to work with a GCP bucket you first have to create a bucket inside a project. So go to https://console.cloud.google.com/ and create a new project.

(Screenshot: steps for creating a service account in the Google Cloud console)

Now follow the steps shown in the image above. After step 3 you will be redirected to another screen asking for the service account details; fill them in appropriately and submit the form. After submitting, you will see a list of service accounts.
Click on the intended service account, go to the Keys tab, click the Add Key button, and download the JSON file.
This file will be used as the secret for authentication and to identify which Google Cloud account you are using.

After that we need to create a bucket for this newly created project. To create a new bucket, follow the steps shown in the snapshot below.

(Screenshot: steps for creating a bucket in the Google Cloud console)

After following the steps above, click the Create Bucket button and give your bucket a name. After creating your GCP bucket you will be redirected to a page where you will see a list of the buckets your selected project has.

NOTE:- To get access to the uploaded files I have made my bucket public; you can set whatever permissions for your bucket suit your scenario.

https://cloud.google.com/storage/docs/access-control/making-data-public
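
If you would rather do this from code than from the console, here is a minimal sketch using the same SDK (the function name makeBucketPublic is my own; the key file name and BUCKET_NAME variable match the backend snippets later in this post):

require("dotenv").config(); // expects BUCKET_NAME in the .env file

const { Storage } = require("@google-cloud/storage");

const storage = new Storage({ keyFilename: "google-storage-key.json" });

// Grant allUsers read access to the objects in the bucket via IAM,
// which is what "making the data public" means here.
async function makeBucketPublic() {
  const bucket = storage.bucket(process.env.BUCKET_NAME);
  const [policy] = await bucket.iam.getPolicy({ requestedPolicyVersion: 3 });
  policy.bindings.push({
    role: "roles/storage.objectViewer",
    members: ["allUsers"],
  });
  await bucket.iam.setPolicy(policy);
}

makeBucketPublic().catch(console.error);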

In order to upload files from the browser, we need to configure CORS on the bucket for resumable uploads. This CORS configuration is basically there for security purposes. You can follow the code below to configure your bucket's CORS.

require("dotenv").config(); // load BUCKET_NAME and OTHER_ORIGIN from the .env file

const { Storage } = require("@google-cloud/storage");

// Authenticate with the service account key file downloaded earlier.
const storage = new Storage({
  keyFilename: "google-storage-key.json",
});

async function configureBucketCors() {
  const [metadata] = await storage
    .bucket(process.env.BUCKET_NAME)
    .getMetadata();
  if (!metadata.cors) {
    await storage.bucket(process.env.BUCKET_NAME).setCorsConfiguration([
      {
        origin: [process.env.OTHER_ORIGIN], // e.g. http://localhost:3000
        responseHeader: [
          "Content-Type",
          "Access-Control-Allow-Origin",
          "X-Upload-Content-Length",
          "X-Goog-Resumable",
        ],
        method: ["PUT", "OPTIONS", "POST"],
        maxAgeSeconds: 3600,
      },
    ]);
  }
}

configureBucketCors().catch(console.error);

exports.storage = storage;

In the snippet above, keep the responseHeader and method values exactly as they are; they are required in the configuration, and the resumable upload will not work without them.
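
The snippets in this post also assume an Express server wired up with dotenv and cors. Here is a minimal sketch of what that bootstrap might look like; the file names, the PORT variable, and the ./storage path are my assumptions, not taken from the repo:

// index.js -- a minimal Express bootstrap (a sketch; adapt it to your project layout).
require("dotenv").config(); // loads BUCKET_NAME, OTHER_ORIGIN and PORT from .env

const express = require("express");
const cors = require("cors");
const { storage } = require("./storage"); // the module above that configures the bucket CORS

const app = express();
app.use(cors({ origin: process.env.OTHER_ORIGIN })); // allow requests from the React app

// The /getSignedUrl route from the next snippet is registered on this app.

app.listen(process.env.PORT || 5000, () => {
  console.log(`Server listening on port ${process.env.PORT || 5000}`);
});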

With this configuration in place, we need to create an endpoint that gives us a signed URL, which we can use on the frontend to resumably upload files to our configured bucket. The code snippet below does exactly that.

// Returns a v4 signed URL that the frontend uses to start a resumable upload.
app.route("/getSignedUrl").get(async (req, res, next) => {
  try {
    const [url] = await storage
      .bucket(process.env.BUCKET_NAME)
      .file(req.query.fileName)
      .getSignedUrl({
        action: "resumable",
        version: "v4",
        expires: Date.now() + 12 * 60 * 60 * 1000,
        contentType: "application/octet-stream",
      });
    return res.json({
      url,
    });
  } catch (error) {
    return res.status(500).json({
      success: false,
      error,
    });
  }
});

FRONTEND PART:-

For the frontend I won't go into much depth; I'll just give an idea of what we are doing. Here is what we have done on the frontend (a rough sketch of this flow follows the list):

  1. First we add an input element that lets the user pick a file.
  2. After a file is selected, we hit our endpoint to get a signed URL.
  3. Using this signed URL, we obtain a session URL.
  4. This session URL is used to upload the file resumably.
  5. For that I created a function called postNotificationService, which uploads one chunk of the file using the session URL.
  6. Whenever a chunk is uploaded successfully we get a status of 308, which means that chunk reached the bucket but some chunks of the file are still remaining, so we call postNotificationService recursively.
  7. When the whole file has been uploaded successfully we get a status code of 200, and the URL of the uploaded file is available in data.request.responseURL.split("?")[0].
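
To make these steps concrete, here is a rough sketch of that flow using axios. The function names uploadFileResumably and uploadChunk, the backend URL http://localhost:5000, and the chunk size are my own assumptions (in the repo, postNotificationService plays the role of uploadChunk), and it assumes the bucket CORS configuration allows the request headers used here:

import axios from "axios";

// Chunks must be multiples of 256 KiB, except for the final chunk.
const CHUNK_SIZE = 256 * 1024 * 32; // 8 MiB

async function uploadFileResumably(file) {
  // 1. Ask our backend for a resumable signed URL.
  const { data } = await axios.get("http://localhost:5000/getSignedUrl", {
    params: { fileName: file.name },
  });

  // 2. Start a resumable session. GCS returns the session URL in the
  //    Location response header (the browser must be allowed to read it).
  const startResponse = await axios.post(data.url, null, {
    headers: {
      "Content-Type": "application/octet-stream",
      "x-goog-resumable": "start",
    },
  });
  const sessionUrl = startResponse.headers.location;

  // 3. Upload the file chunk by chunk against the session URL.
  return uploadChunk(sessionUrl, file, 0);
}

// Uploads one chunk, then recurses while GCS answers 308 (Resume Incomplete).
async function uploadChunk(sessionUrl, file, start) {
  const end = Math.min(start + CHUNK_SIZE, file.size);
  const response = await axios.put(sessionUrl, file.slice(start, end), {
    headers: {
      "Content-Range": `bytes ${start}-${end - 1}/${file.size}`,
    },
    // 308 means "chunk stored, more expected" -- don't treat it as an error.
    validateStatus: (status) => [200, 201, 308].includes(status),
  });

  if (response.status === 308) {
    return uploadChunk(sessionUrl, file, end);
  }

  // 200 or 201: the whole file is in the bucket; its URL is the request URL
  // without the query string.
  return response.request.responseURL.split("?")[0];
}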

That's it..

Let’s Recap

  1. We learned what a resumable upload is.
  2. We learned what a signed URL is.
  3. We created a Google Cloud project and a bucket.
  4. We configured CORS on the bucket, which we need for resumable uploads.
  5. We created an endpoint that generates a signed URL, which the frontend uses to resumably upload a file to the bucket.
  6. On the frontend we hit that endpoint to get a signed URL.
  7. Using this signed URL we got a session URL.
  8. We used the session URL for the resumable upload.
  9. We uploaded the file chunk by chunk using the session URL.
  10. Whenever a chunk was uploaded successfully we got a status of 308, meaning that chunk reached the bucket but the whole file has not been uploaded yet.
  11. A status code of 200 means the whole file has been uploaded to the bucket successfully.
  12. Then data.request.responseURL.split("?")[0] gives us the URL of the uploaded file.
