
Node.js: Unzip Async Await

David Dal Busco · Originally published at daviddalbusco.Medium · 2 min read

Photo by Florian Steciuk on Unsplash


I am developing a new feature of DeckDeckGo for which I have to unzip data in Firebase Functions.

It took more time than expected to code such a Node.js function, so I am sharing this solution, hoping it might help you someday too 😇.


Unzipper

Node.js provides a compression module, zlib, but it does not support ZIP files. Luckily, we can use the library unzipper to handle them.

npm i unzipper --save

Unzip With Async Await

My new feature reads and writes data uploaded to Firebase Storage through streams. I also develop my code with a promise-based (async / await) approach. Therefore, both have to coexist.

To narrow down the following example, I replaced the cloud storage with local files handled through file system streams (fs).

The function unzip instantiates a read stream on the zip file and pipes it through unzipper. Each entry is then iterated and piped itself to a writable output. Summarized: the zip is opened and each file it contains is extracted.

unzip is called in an async immediately invoked function, a backwards-compatible equivalent of top-level await, and that's basically it 🥳.

const {Parse} = require('unzipper');
const {createWriteStream, createReadStream} = require('fs');

const unzip = () => {
  // Open a read stream on the zip file and pipe it through unzipper's parser.
  const stream =
    createReadStream('/Users/david/data.zip').pipe(Parse());

  return new Promise((resolve, reject) => {
    // Extract each entry of the archive to its own file.
    stream.on('entry', (entry) => {
      const writeStream =
        createWriteStream(`/Users/david/${entry.path}`);
      return entry.pipe(writeStream);
    });
    stream.on('finish', () => resolve());
    stream.on('error', (error) => reject(error));
  });
};

(async () => {
  try {
    await unzip();
  } catch (err) {
    console.error(err);
  }
})();
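For reference, here is a sketch of what the same pattern might look like against Firebase Storage through the firebase-admin SDK. It is an assumption-laden example, not the production code: the bucket paths data.zip and extracted/ are made up for illustration.

const admin = require('firebase-admin');
const {Parse} = require('unzipper');

// Assumes admin.initializeApp() has been called elsewhere.
const unzipFromStorage = () => {
  const bucket = admin.storage().bucket();

  // Stream the zip out of the bucket and pipe it through unzipper.
  const stream = bucket.file('data.zip').createReadStream().pipe(Parse());

  return new Promise((resolve, reject) => {
    // Write each extracted entry back to the bucket.
    stream.on('entry', (entry) =>
      entry.pipe(bucket.file(`extracted/${entry.path}`).createWriteStream())
    );
    stream.on('finish', () => resolve());
    stream.on('error', (error) => reject(error));
  });
};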

Read To String With Async Await

I had to read files with streams too. Consequently, and as a cherry on top, here is how I integrated that in my code.

const {createReadStream} = require('fs');

const read = () => {
  const stream =
    createReadStream('/Users/david/meta.json');

  return new Promise((resolve, reject) => {
    let data = '';

    // Concatenate the chunks and resolve with the full content once the stream ends.
    stream.on('data', (chunk) => (data += chunk));
    stream.on('end', () => resolve(data));
    stream.on('error', (error) => reject(error));
  });
};

(async () => {
  try {
    const meta = await read();

    console.log({meta});
  } catch (err) {
    console.error(err);
  }
})();

It follows the same approach as previously and reads the file content into an in-memory string.
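As a side note not taken from the original snippet: if the file might not be plain UTF-8 text, a variant could collect the raw Buffer chunks and convert them only once at the end.

const {createReadStream} = require('fs');

const readBuffer = () => {
  const stream = createReadStream('/Users/david/meta.json');

  return new Promise((resolve, reject) => {
    const chunks = [];

    // Keep the chunks as Buffers and concatenate them once the stream ends.
    stream.on('data', (chunk) => chunks.push(chunk));
    stream.on('end', () => resolve(Buffer.concat(chunks).toString('utf8')));
    stream.on('error', (error) => reject(error));
  });
};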


Summary

Coding is like a box of chocolates. You never know what you're gonna get. Sometimes it should be quick but it takes time. Sometimes it should take so much time but it goes fast. — For-dev-rest Gump

To infinity and beyond!

David


You can reach me on Twitter or my website.

Give DeckDeckGo a try for your next slides!


Discussion (4)

Roland Csibrei

Hello! Thanks a lot for sharing! I think all of us will need to zip/unzip data sooner or later. I am working on a PWA which needs to poll data from an API every 5 minutes. When I started to work on the application, the payload was around 20 kB. Nowadays it is around 400 kB and mobile data users started to complain. I can't change the API, so I created an AWS Lambda which polls the data, and I used your zip function to zip the data and store it in an S3 bucket. The PWA now polls the data from this bucket. The zipped file is around 13 kB. And again, unzip with your function. I was already thinking about stripping or packing the payload, so your article saved me time. I owe you a drink dude 😂 thanks again and keep posting! R.

David Dal Busco (Author)

Wow that's so cool to hear this story!!! Thanks a lot for the feedback, happy to hear it was useful 🥳

Tuğrul Topuz

You can use stream.finished instead of wrapping with Promise.

nodejs.org/api/stream.html#stream_...
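A minimal sketch of what that suggestion could look like, assuming a Node.js version that ships the promise-based stream helpers (stream/promises):

const {finished} = require('stream/promises');
const {createWriteStream, createReadStream} = require('fs');
const {Parse} = require('unzipper');

const unzip = async () => {
  const stream = createReadStream('/Users/david/data.zip').pipe(Parse());

  stream.on('entry', (entry) =>
    entry.pipe(createWriteStream(`/Users/david/${entry.path}`))
  );

  // Resolves when the parse stream is done, rejects if it errors.
  await finished(stream);
};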

David Dal Busco (Author)

Wrapping with Promise was my goal 😉 That being said, pretty cool! Thx for pointing stream.finished out, good one 👍