1. Piping an input stream to an output stream (file example)
const fs = require('fs');
let input = fs.createReadStream('/var/www/examples/test.txt');
let output = fs.createWriteStream('/tmp/out.txt');
input.pipe(output);
- require('fs') - loads the module for working with the file system,
- fs.createReadStream() - creates a stream to read data from (a file in our case),
- fs.createWriteStream() - opens a file and creates a write stream for it,
- /var/www/examples/test.txt - path of the file to read from,
- /tmp/out.txt - path of the file to write to,
- .pipe() - pipes the input stream into the output stream (in our case: reads from the input file and writes to the output file).
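One caveat worth adding: .pipe() does not forward errors between the streams, so in real code each stream usually gets its own 'error' handler. A minimal sketch, using the same file paths as above:

const fs = require('fs');
let input = fs.createReadStream('/var/www/examples/test.txt');
let output = fs.createWriteStream('/tmp/out.txt');
// .pipe() does not propagate errors, so each stream needs its own handler
input.on('error', (err) => console.error('read failed:', err));
output.on('error', (err) => console.error('write failed:', err));
input.pipe(output);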
2. Preferred way to pipeline streams
const fs = require('fs');
const { pipeline } = require('stream');
let input = fs.createReadStream('/var/www/examples/test.txt');
let output = fs.createWriteStream('/tmp/out.txt');
pipeline(input, output, (err) => { if (err) console.error('Pipeline failed:', err); });
- require('fs') - loads the module for working with the file system,
- fs.createReadStream() - creates a stream to read data from (a file in our case),
- fs.createWriteStream() - opens a file and creates a write stream for it,
- /var/www/examples/test.txt - path of the file to read from,
- /tmp/out.txt - path of the file to write to,
- pipeline() - pipes the given streams together (left to right) and properly destroys all of them when processing finishes or fails,
- (err) => ... - the last argument of pipeline() is always the completion callback; err is null on success, so we only log it on failure.
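Since Node.js 15 there is also a promise-based pipeline in the 'stream/promises' module, which fits async/await code; a minimal sketch with the same files (the copy() wrapper is just for illustration):

const fs = require('fs');
const { pipeline } = require('stream/promises');

async function copy() {
  // resolves when all data has been written, rejects on any stream error
  await pipeline(
    fs.createReadStream('/var/www/examples/test.txt'),
    fs.createWriteStream('/tmp/out.txt')
  );
}
copy().then(() => console.log('done')).catch(console.error);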
3. How to read an input stream into a Buffer
const fs = require('fs');
let input = fs.createReadStream('/var/www/examples/test.txt');
input.on('data', buf => {
console.log(buf);
});
- require('fs') - loads the module for working with the file system,
- fs.createReadStream() - creates a stream to read data from (a file in our case),
- /var/www/examples/test.txt - path of the file to read from,
- .on('data', ...) - handles each chunk of data read from the stream,
- buf - a Buffer holding the data chunk read from the stream.
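Readable streams are also async iterables, so the same chunks can be consumed with for await...of instead of a 'data' handler; a minimal sketch (the readChunks() wrapper is just for illustration):

const fs = require('fs');

async function readChunks() {
  let input = fs.createReadStream('/var/www/examples/test.txt');
  // each chunk arrives as a Buffer because no encoding was set
  for await (const buf of input) {
    console.log(buf);
  }
}
readChunks().catch(console.error);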
4. How to read an input stream into a string
const fs = require('fs');
let input = fs.createReadStream('/var/www/examples/test.txt');
const chunks = [];
input.on('data', buf => chunks.push(buf));
input.on('end', () => console.log(Buffer.concat(chunks).toString()));
- require('fs') - loads the module for working with the file system,
- fs.createReadStream() - creates a stream to read data from (a file in our case),
- /var/www/examples/test.txt - path of the file to read from,
- .on('data', ...) - handles each chunk of data read from the stream,
- chunks.push(buf) - pushes each chunk into an array,
- Buffer.concat(chunks) - joins all chunks into a single Buffer,
- .toString() - converts the Buffer to a string,
- input.on('end', ...) - fires when the stream has finished reading.
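An alternative worth knowing: passing an encoding to fs.createReadStream() makes 'data' deliver already-decoded strings, so no Buffer juggling is needed; a minimal sketch:

const fs = require('fs');
let input = fs.createReadStream('/var/www/examples/test.txt', 'utf8');
let text = '';
// with an encoding set, each chunk is a decoded string, not a Buffer
input.on('data', str => text += str);
input.on('end', () => console.log(text));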
5. How to transform streams using the Transform interface
const fs = require('fs');
const { Transform } = require('stream');
let input = fs.createReadStream('/var/www/examples/test.txt');
const my_transform = new Transform({
transform(chunk, encoding, callback) {
callback(null, 'TRANSFORMED: ' + chunk.toString());
},
});
my_transform.on('data', buf => console.log(buf.toString()));
input.pipe(my_transform);
- require('fs') - loads the module for working with the file system,
- fs.createReadStream() - creates a stream to read data from (a file in our case),
- new Transform() - creates a new stream that transforms the data passing through it,
- transform(chunk, encoding, callback) - the chunk transformation function,
- 'TRANSFORMED: ' + chunk.toString() - returns the transformed chunk through the callback (we prepend the text TRANSFORMED: to each chunk as an example transformation),
- .on('data', ...) - handles each chunk of data read from the stream,
- .pipe() - pipes the stream into the given stream (reads from the file and pipes into the transforming stream).
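Transform streams also slot naturally into pipeline() as middle stages; a minimal sketch combining the earlier examples (the uppercase transform and the /tmp/out.txt destination are assumptions for illustration):

const fs = require('fs');
const { Transform, pipeline } = require('stream');

const upper = new Transform({
  // uppercase each chunk as an example transformation
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

pipeline(
  fs.createReadStream('/var/www/examples/test.txt'),
  upper,
  fs.createWriteStream('/tmp/out.txt'),
  (err) => { if (err) console.error('Pipeline failed:', err); }
);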