DEV Community

Derick Zihalirwa

Learn How to Upload Files to Google Cloud Storage Bucket with Nextjs

INTRODUCTION

Shifting from Cloudinary to Google Cloud Storage (GCS) for file uploads in my Next.js application, using the power of Next.js API routes, wasn't an easy task until I found this article.

In this article, we will explore the trade-offs of this implementation and offer insights into the code updates needed to ensure seamless functionality in both development and production environments.

1 GCS Credentials

import { Storage } from "@google-cloud/storage";

const storage = new Storage({
    keyFilename: "KEY_FILENAME.json",
});

...

The code above uses keyFilename, which exposes your service-account key file to anyone who has access to the repository.

Make this change instead:

import { Storage } from "@google-cloud/storage";

const storage = new Storage({
  projectId: "your-project-id",
  credentials: {
    client_email: process.env.GCS_CLIENT_EMAIL,
    private_key: process.env.GCS_PRIVATE_KEY?.split(String.raw`\n`).join("\n"),
  },
});

This code has the same functionality as the one above; the only difference is that it does not read any JSON file, but instead stores the credentials in environment variables.
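One detail worth unpacking is the split/join on the private key: when the key is stored in an environment variable, its newlines usually arrive as the literal two-character sequence \n, which has to be turned back into real newlines before the client library can parse the key. A minimal sketch of that transformation, using a made-up placeholder key:

```typescript
// Hypothetical env-var value: the key's real newlines arrive as the
// literal two characters backslash + "n" (the key body is a placeholder).
const fromEnv =
  "-----BEGIN PRIVATE KEY-----\\nMIIEvQIB...\\n-----END PRIVATE KEY-----\\n";

// String.raw`\n` is the two-character string "\n" (backslash + n),
// so this replaces each escaped sequence with a real newline.
const restored = fromEnv.split(String.raw`\n`).join("\n");

console.log(restored.split("\n").length); // 4 (three real newlines)
```

Without this step, the Storage client would receive a single-line key and fail to sign requests with it.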

The second approach also keeps your credentials out of version control. (If you already have Application Default Credentials set up for your project, you can even construct the Storage client with no arguments at all and let the library pick the credentials up automatically.)
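For reference, the matching entries in a .env.local file might look like this (the values below are placeholders modeled on the shape of a service-account JSON key, not real credentials):

```
GCS_CLIENT_EMAIL=my-service-account@your-project-id.iam.gserviceaccount.com
GCS_PRIVATE_KEY="-----BEGIN PRIVATE KEY-----\nMIIEvQIB...\n-----END PRIVATE KEY-----\n"
```

The client_email and private_key values come from the JSON key file you download when creating the service account.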

2 UPLOADING FILES

When it came to uploading files to GCS, I introduced a fresh function named method3. As part of this update, I removed the old formidable-serverless.js and parseForm.ts files, opting for a simpler setup that wires the formidable library directly into my code. The result? Files now take a direct path to the cloud. Let's take a peek at how the code transformation looks:

export const method3 = async (req: NextApiRequest, res: NextApiResponse) => {
  // eslint-disable-next-line @typescript-eslint/ban-ts-comment
  // @ts-ignore
  const form = formidable({ fileWriteStreamHandler: uploadStream });
};
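The uploadStream passed to fileWriteStreamHandler is what pipes each incoming file straight into the bucket. The original reference article is no longer available, so here is a hypothetical sketch of what the relevant part of gsc.ts could look like; the GCS_BUCKET_NAME environment variable and the resumable: false option are assumptions, not from the article:

```typescript
// gsc.ts (sketch): stream formidable file parts directly into the bucket.
import { Storage } from "@google-cloud/storage";
import type { File } from "formidable";
import type { Writable } from "stream";

const storage = new Storage({
  projectId: "your-project-id",
  credentials: {
    client_email: process.env.GCS_CLIENT_EMAIL,
    private_key: process.env.GCS_PRIVATE_KEY?.split(String.raw`\n`).join("\n"),
  },
});

export const bucket = storage.bucket(process.env.GCS_BUCKET_NAME as string);

// formidable calls this once per incoming file and writes the upload into
// the returned Writable, so the bytes go straight to Cloud Storage
// without ever touching the local disk.
export const uploadStream = (file: File): Writable =>
  bucket
    .file(file.originalFilename ?? file.newFilename)
    .createWriteStream({ resumable: false });
```

Disabling resumable uploads is a common choice for small files, since it avoids an extra round trip per upload.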

If you are using a private bucket configuration and try to use the secured_url to display the image in your application, you won't be able to do so.

Here is what you need to do to solve this:

First, we need to add a few lines of code in gsc.ts.
Go ahead and add this just below the createWriteStream function:

export const getSecuredUrl = async (filename: string) => {
  const now = Date.now();
  const expires = new Date(now + 24 * 60 * 60 * 1000);
  // getSignedUrl() resolves to an array whose first element is the URL
  const url = await bucket.file(filename).getSignedUrl({
    expires: expires.getTime(),
    version: "v4",
    action: "read",
  });
  return url;
};

The function first gets the current time in milliseconds. Then, it creates a new Date object that is 24 hours in the future. This will be the expiration time for the signed URL.

Next, the function calls the bucket.file() method to get a reference to the file that we want to create a signed URL for. The file() method takes the filename as its argument.

Finally, the function calls the getSignedUrl() method to generate the signed URL. The getSignedUrl() method takes the following arguments:

  • expires: The expiration time for the signed URL, as a timestamp in milliseconds.
  • version: The URL signing scheme to use (v4 is the recommended version).
  • action: The action that is allowed to be performed on the file. In this case, we are only allowing the file to be read.

The function then returns the result of getSignedUrl(), which resolves to an array whose first element is the signed URL.

Next, update the method3 function in upload.ts:

export const method3 = async (req: NextApiRequest, res: NextApiResponse) => {
  const form = formidable({ fileWriteStreamHandler: uploadStream });

  try {
    form.parse(req, async (err, fields, files) => {
      // files.file can be a single file object or an array of files
      const getFile = files.file as any;
      const file = Array.isArray(getFile) ? getFile[0] : getFile;
      const filename = file.originalFilename ?? file.newFilename;
      const signedUrl = await gcs.getSecuredUrl(filename);
      // add this line to avoid a Content-Type error in production
      res.setHeader("Content-Type", "application/json");
      res.status(200).json(signedUrl[0]);
    });
  } catch (error) {
    res.status(500).json(error);
  }
};

The gcs.getSecuredUrl() call retrieves a signed URL for the uploaded file. If anything goes wrong, the catch block sends an error back to the client; otherwise, the handler responds with the signed URL.
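The shape of files.file depends on the formidable version in use: it can be a single file object or an array of files. The normalization step in the handler above can be isolated as plain logic and sketched like this (the field names mirror formidable's File type; the sample objects are made up):

```typescript
type UploadedFile = { originalFilename: string | null; newFilename: string };

// Pick the first file if formidable returned an array, then prefer the
// user's original filename, falling back to formidable's generated name.
const resolveFilename = (f: UploadedFile | UploadedFile[]): string => {
  const file = Array.isArray(f) ? f[0] : f;
  return file.originalFilename ?? file.newFilename;
};

console.log(resolveFilename({ originalFilename: "cat.png", newFilename: "abc123" })); // "cat.png"
console.log(resolveFilename([{ originalFilename: null, newFilename: "abc123" }])); // "abc123"
```

Keeping this branch explicit avoids passing an array where getSecuredUrl() expects a single filename string.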

CONCLUSION

In this article, we learned how to upload files to Google Cloud Storage Bucket using Next.js API Route. We also discussed some of the trade-offs of using different methods for storing credentials and uploading files.

Here are the key takeaways from this article:

  • Using keyFilename to store credentials is not secure, as it exposes your key file to anyone who has access to the repository. Instead, we opted for environment variables to store the credentials.
  • You can use the formidable library to directly upload files to GCS without having to create separate files for parsing the form.
  • If you are using a private bucket configuration, you need to use a signed URL to access the file in your application. You can generate a signed URL with the getSecuredUrl() helper in the gsc.ts file, which wraps the getSignedUrl() method.

Feel free to drop your comments in the dedicated section below. Remember, we're all on this learning journey together, and your insights are highly valued!

Top comments (3)

djibril mugisho

Some good stuff going on here thanks in advance

SalehBal

This "guide" is very limited. Can you at least provide full code ?

Derick Zihalirwa

I have noticed that the article I used for reference is no longer available. I will update the post and add a Git repo with the full code.