Build a simple file watcher 👀 without external dependencies or packages from npm
Let’s get started by developing a couple of simple programs that watch files for changes and read arguments from the command line. Even though they’re simple toy programs, these applications offer insights into Node.js’s event-based architecture.
The Power of Asynchronous Programming
Asynchronous coding is where the power of Node.js really shows. Taking action whenever a file changes is just plain useful in a number of cases, ranging from automated deployments to running unit tests.
Our File Watcher 👀 in Action
const fs = require('fs');
const filename = process.argv[2];

if (!filename) {
  throw Error('A file must be specified');
}

// Run the callback every time the watched file changes.
fs.watch(filename, () => console.log(`File ${filename} changed!`));
console.log(`Now watching your awesome ${filename} for changes`);
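To try it out, save the program (here assumed to be named watcher.js, which is just a placeholder) and pass it the path of a file to watch:

$ node watcher.js target.txt
Now watching your awesome target.txt for changes

The process keeps running; save or touch target.txt from another terminal and the callback fires, printing File target.txt changed!.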
The Great Event Loop ⚡
When you run the program, Node.js does the following:
• It loads the script, running all the way through to the last line, which
produces the Now watching message in the console.
• It sees that there’s more to do because of the call to fs.watch().
• It waits for something to happen — namely, for the fs module to observe a
change to the file.
• It executes our callback function when the change is detected.
• It determines that the program still has not finished, and resumes waiting.
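A minimal sketch of those last two points, assuming a target.txt exists in the current directory: fs.watch() returns an fs.FSWatcher, and while it is active the event loop has pending work, so the process stays alive. Close the watcher and, with nothing left to wait for, the program simply exits.

'use strict';
const fs = require('fs');

// The active watcher gives the event loop something to wait for,
// so the process keeps running past the last line of the script.
const watcher = fs.watch('target.txt', () => console.log('target.txt changed!'));

// After ten seconds, stop watching. With no watchers or timers left,
// the event loop has nothing to do and Node.js exits on its own.
setTimeout(() => watcher.close(), 10 * 1000);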
The Node.js Process
Any unhandled exception thrown in Node.js will halt the process. The exception output shows the offending file, along with the line number and position of the exception.

It’s pretty common (or maybe not 🙄) in Node.js development to spawn separate processes as a way of breaking up work, rather than putting everything into one big Node.js program. Let’s spawn a process in Node.
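Before we do, a quick demonstration of that first point: run the watcher without a filename argument (again assuming it’s saved as watcher.js) and the thrown Error is never caught, so Node.js prints a stack trace naming watcher.js and the line of the throw, and the process halts with a non-zero exit status.

$ node watcher.js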
Spawning a Child Process
'use strict';
const fs = require('fs');
const spawn = require('child_process').spawn;
const filename = process.argv[2];

if (!filename) {
  throw Error('A file must be specified');
}

fs.watch(filename, () => {
  // Run `ls -l -h <filename>` whenever the file changes and
  // forward the child's output to our own standard output.
  const ls = spawn('ls', ['-l', '-h', filename]);
  ls.stdout.pipe(process.stdout);
});

console.log(`Now watching your awesome ${filename} for changes`);
The object returned by spawn() is a ChildProcess. Its stdin, stdout, and stderr properties are Streams that can be used to read or write data. We want to send the standard output from the child process directly to our own standard output stream, and that’s exactly what the pipe() method does.
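Here’s a minimal, self-contained sketch of roughly what pipe() is doing for us (it lists the current directory instead of a watched file, and it assumes ls is available, i.e. a Unix-like system):

'use strict';
const spawn = require('child_process').spawn;

const ls = spawn('ls', ['-l', '-h', '.']);

// Roughly equivalent to ls.stdout.pipe(process.stdout):
// forward each chunk of the child's output to our own stdout as it arrives.
ls.stdout.on('data', chunk => process.stdout.write(chunk));

In practice pipe() also takes care of backpressure and cleanup for us, so prefer it whenever you simply want to forward a stream’s data.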
The New Child Process In Action
Capturing the Data from a Stream or EventEmitter
EventEmitter is a very important class in Node.js. It provides a channel for events to be dispatched and listeners to be notified. Many objects you’ll encounter in Node.js inherit from EventEmitter, like the Streams we saw in the last section.
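Here’s a tiny, hypothetical example of that channel (the 'saved' event name is made up for illustration; it’s not something Node.js emits):

'use strict';
const EventEmitter = require('events').EventEmitter;

const emitter = new EventEmitter();

// Register a listener for a custom event type.
emitter.on('saved', filename => console.log(`Got a 'saved' event for ${filename}`));

// Dispatch the event; every registered listener is invoked with the arguments.
emitter.emit('saved', 'target.txt');

With that in mind, here’s the watcher again, this time capturing the child process’s output as well as piping it through: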
'use strict';
const fs = require('fs');
const spawn = require('child_process').spawn;
const filename = process.argv[2];

if (!filename) {
  throw Error('A file must be specified');
}

fs.watch(filename, () => {
  const ls = spawn('ls', ['-l', '-h', filename]);
  let output = '';

  // Collect the child's output as it arrives, then summarize it
  // (permissions, size, and name) once the process closes.
  ls.stdout.on('data', chunk => output += chunk);
  ls.on('close', () => {
    const parts = output.split(/\s+/);
    console.log(parts[0], parts[4], parts[8]);
  });
  ls.stdout.pipe(process.stdout);
});

console.log(`Now watching your awesome ${filename} for changes`);
The on() method adds a listener for the specified event type. We listen for data events because we’re interested in data coming out of the stream.
A Buffer is Node.js’s way of representing binary data. It points to a blob of memory allocated by Node.js’s native core, outside of the JavaScript engine.
Any time you add a non-string to a string in JavaScript (like we’re doing here
with chunk), the runtime will implicitly call the object’s toString() method.
For a Buffer, this means copying the content into Node.js’s heap using the default encoding (UTF-8).
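A quick sketch of that conversion, independent of the watcher:

'use strict';

// A Buffer holds raw bytes, allocated outside the JavaScript heap.
const buf = Buffer.from('hello, world');
console.log(buf);            // <Buffer 68 65 6c 6c 6f 2c 20 77 6f 72 6c 64>
console.log(buf.toString()); // hello, world (decoded as UTF-8 by default)

// Concatenating a Buffer onto a string calls toString() implicitly.
let output = '';
output += buf;
console.log(output === buf.toString('utf8')); // true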
Like Stream, the ChildProcess class extends EventEmitter, so we can add listeners to it, as well.
After a child process has exited and all its streams have been flushed, it emits a close event.
Reading and Writing Files In Node.js Asynchronously
So far we’ve written a series of Node.js programs that can watch files for changes. Now let’s explore Node.js’s methods for reading and writing files. Along the way we’ll see two common error-handling patterns in Node.js: error events on EventEmitters and err callback arguments.
There are a few approaches to reading and writing files in Node. The simplest is to read in or write out the entire file at once. This technique works well for small files. Other approaches read and write by creating Streams or staging content in a Buffer.
'use strict';
const fs = require('fs');

// Read the entire file into memory, then hand it to the callback.
fs.readFile('target.txt', (err, data) => {
  if (err) {
    throw err;
  }
  console.log(data.toString());
});
Notice how the first parameter to the readFile() callback handler is err. If readFile() is successful, then err will be null. Otherwise the err parameter will contain an Error object. This is a common error-reporting pattern in Node.js, especially for built-in modules.
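The same error-first shape works for callbacks you write yourself. Here’s a small, hypothetical helper (readJson is not part of fs; it’s just an illustration of the convention) that either reports an error or hands back a parsed result:

'use strict';
const fs = require('fs');

// Error-first convention: call back with (err) on failure,
// or (null, result) on success -- never both.
function readJson(path, callback) {
  fs.readFile(path, (err, data) => {
    if (err) {
      return callback(err);
    }
    let obj;
    try {
      obj = JSON.parse(data);
    } catch (parseErr) {
      return callback(parseErr);
    }
    callback(null, obj);
  });
}

readJson('package.json', (err, obj) => {
  if (err) {
    throw err;
  }
  console.log(obj.name);
});

Writing a file follows the same error-first pattern: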
'use strict';
const fs = require('fs');

// Write (or overwrite) target.txt with the given string.
fs.writeFile('target.txt', 'This is file content', (err) => {
  if (err) {
    throw err;
  }
  console.log('File saved!');
});
This program writes This is file content to target.txt (creating it if it doesn’t exist, or overwriting it if it does). If for any reason the file can’t be written, then the err parameter will contain an Error object.
Creating Read and Write Streams
You create a read stream or a write stream by using fs.createReadStream() and fs.createWriteStream(), respectively.
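For example, copying one file into another is just a matter of piping the two together (source.txt and copy-of-source.txt are placeholder names):

'use strict';
const fs = require('fs');

// Stream the contents of one file into another, chunk by chunk,
// without holding the whole file in memory at once.
const readStream = fs.createReadStream('source.txt');
const writeStream = fs.createWriteStream('copy-of-source.txt');

// Streams are EventEmitters, so we can listen for error events on both ends.
readStream.on('error', err => console.log('Read error:', err.message));
writeStream.on('error', err => console.log('Write error:', err.message));

readStream.pipe(writeStream);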
Rebuilding the cat Program in Node
#!/usr/bin/env node
'use strict';

// Pipe the file named on the command line straight to standard output.
require('fs').createReadStream(process.argv[2]).pipe(process.stdout);
Because the first line starts with #!, you can execute this program directly on Unix-like systems. Use chmod to make it executable:
$ chmod +x cat.js
Then, to run it, send the name of the chosen file as an additional argument:
$ ./cat.js target.txt
We have learned how to watch files for changes and to read and write files. We also learned how to spawn child processes and access command-line arguments. Node.js also supports synchronous file access, but it’s better not to block the event loop unless you know what you’re doing.
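For the curious, here’s what the synchronous counterpart of our earlier read looks like; it returns the data directly (or throws), but the whole process is blocked while the file is read, so save it for startup code and small scripts:

'use strict';
const fs = require('fs');

// Blocks the event loop until the read finishes -- no callbacks,
// timers, or I/O can run in the meantime.
const data = fs.readFileSync('target.txt');
console.log(data.toString());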
Thank you. Happy Reading 😀😀