Renato Pozzi

Node.js Streams Quick Introduction

Curious to try streams, but not sure whether they can serve you? Let's look at their uses together!

What are Streams?

In an event-based platform such as Node.js, the most efficient way to handle I/O is in real-time, consuming the input as soon as it is available and sending the output as soon as the application produces it.

Streams allow us to process the data as soon as it arrives from the resource. This is different from buffering, where all the data that comes from a resource needs to be collected into a buffer until the operation is completed.

What are the advantages of Streams?

Streams have several advantages that can be useful in many use cases:

  • Spatial Efficiency
    Suppose you have a huge file to process: with streams you can do it without running out of memory, because streams work with constant memory utilization.

  • Time Efficiency
    In a multi-step operation, such as compressing a file and then storing it, with streams you don't need to wait for the compression to finish before starting the store operation: every chunk is compressed and stored as soon as it is available.

  • Composability
    With the pipe command, you can compose all your stream operations out of functions, each responsible for one single piece of functionality.

How are Streams made?

A stream in Node.js is an implementation of one of these classes:

  • Readable
  • Writable
  • Duplex
  • Transform

Every stream is also an instance of EventEmitter, which is why it is possible to attach listeners with the classic on function, just like with any other event in Node.

Streams also support two operating modes:

  • Binary Mode
  • Object Mode

Let's see an example in Binary Mode:

// Our File words.txt
$ cat words.txt 
This
is
an
article
from
dev.to!

// Our code
const fs = require('fs');

const stream = fs.createReadStream('words.txt');

stream.on('data', (chunk) => console.log(chunk.toString()));
stream.on('end', () => console.log('Stream is ended! :)'))

And also in Object Mode:

const { Readable } = require("stream");

const people = [
  { name: "Renato", surname: "Pozzi", twitter: "@itsrennyman" },
  { name: "Bill", surname: "Gates", twitter: "@BillGates" },
  { name: "Tim", surname: "Cook", twitter: "@tim_cook" },
];

const stream = Readable.from(people);

stream.on("data", (person) =>
  console.log(
    `Hey! This is ${person.name} and this is my Twitter: ${person.twitter}`
  )
);

As you can see, streams can also be used to process very large arrays.

In the next article, we will dig into the various sub-branches of streams in order to discover them all!

Thank you for reading!
