DEV Community πŸ‘©β€πŸ’»πŸ‘¨β€πŸ’»
kamran


Implementing β€œTail -f” in Node JS

Linux is one of the most beautiful things developed to date, and I sometimes wonder how a particular Linux command works under the hood, like how “ls” exactly works behind the scenes. So I tried to replicate one of the most used Linux commands, “tail -f”, in Node.js.
For those who don't know, “tail -f” prints the last 10 lines of a file, then monitors the file and prints new content whenever the file is updated. Node.js has a built-in filesystem module that helps us work with files and folders, and it provides direct methods to read a file, keep a watch on a file, and write to a file. So it sounds easy to take these methods and chain them one after another, but it isn't that simple.
Things we have to handle:
We should get data on every update of the file
We shouldn't have to read the full file every time the file is updated
So I started going through the file-reading APIs available in Node.js. I found the readline module, which reads a stream line by line and emits the line data in an event for each line:

// Read a file line by line using the readline module
const fs = require('fs');
const readline = require('readline');

const rl = readline.createInterface({
    input: fs.createReadStream(filename)
});
rl.on('line', (log) => {
    console.log(`Log is: ${log}`);
});

This method looked very helpful for my case, so I started with it. But now I had to decide when to call this method: I have to keep an eye on the file for updates, so I looked for an API that can detect changes in a file.
I found fs.watchFile, which emits an event whenever there is any change to the file. I tried it out but noticed that it also fires on metadata updates, so I had to add a filter. We get the current and previous stats on each event, so I compare the size of the file to see whether actual data was appended, and only do work if the file size changed:

// Keep a watch on the file for changes
fs.watchFile(filename, (curr, prev) => {
    console.log("Previous Stats", prev);
    console.log("Current Stats", curr);
    // Ignore metadata-only updates where the size didn't change
    if (curr.size === prev.size)
        return;
    // Actual data was appended; handle it here
});

Just to be doubly sure, I store the current size for future reference and compare it with the next size. Now that I can detect an update, I call the readline method on each update.
But I had one more big issue: on each event I was reading and printing the full file. The simple solution is to keep a pointer, move it past the last line printed, and on the next read only print lines after that pointer.

// Read the file line by line on each change, printing only new lines
const fs = require('fs');
const readline = require('readline');

let pointer = 0;
// Keep a watch on the file for changes
fs.watchFile(filename, (curr, prev) => {
    if (curr.size === prev.size)
        return;

    let currentpointer = 0;
    const rl = readline.createInterface({
        input: fs.createReadStream(filename)
    });
    rl.on('line', (log) => {
        currentpointer++;
        // Only print lines we haven't already printed in a previous pass
        if (currentpointer > pointer)
            console.log(`Log is: ${log}`);
    });
    rl.on('close', () => {
        // Remember how far we got for the next event
        pointer = currentpointer;
    });
});

This worked as needed, but there was still a problem: it was inefficient. Even though I wasn't printing every log line each time an event occurred, I was still scanning every line, which costs time and memory.
So I started looking for a readline alternative that can read from a specific position in a file. I found the plain fs.read method, which lets me specify where to start reading, but it takes a starting byte, not a line number. So I used that: instead of reading lines, I now read a buffer, and I changed my pointer from a line count to a byte offset.
Now I open the file and read from the last byte read in the previous event. Since I get a buffer instead of line data, I convert the buffer to a normal string, split the string on “\n” (newline), and print the array elements one by one.

const fs = require('fs');
let previousFileSize = 0;
let lastReadByte = 0;

// Keep a watch on the file for changes
fs.watchFile(filename, (curr, prev) => {
    // Check if file data was actually updated
    if (curr.size === previousFileSize)
        return;

    let buffer = Buffer.alloc(curr.size - lastReadByte);

    previousFileSize = curr.size;
    console.log(`${filename} file changed`);

    fs.open(filename, 'r', (err, filedata) => {
        if (err)
            return console.error(err);

        console.log("Reading the file");
        // Read only the bytes appended since the last event
        fs.read(filedata, buffer, 0, buffer.length, lastReadByte, (err, bytes) => {
            if (err)
                return console.error(err);

            if (bytes > 0) {
                const dataString = buffer.slice(0, bytes).toString();
                const dataArray = dataString.split("\n");
                dataArray.forEach(logline => {
                    if (logline)
                        console.log(logline);
                });
            }
            lastReadByte = curr.size;
            // Close the opened file.
            fs.close(filedata, (err) => {
                if (err)
                    return console.error(err);
                console.log("File closed successfully");
            });
        });
    });
});

So this is an efficient way to tail a continuously updating file, implemented in Node.js. Happy Koding!! For more content, you can subscribe to my YouTube channel.
