Streams
Streams process incoming data in smaller, more manageable chunks. A stream can be:

- readable: from a file, an HTTP request, a TCP socket, stdin, etc.
- writable: to a file, an HTTP response, a TCP socket, stdout, etc.
- duplex: a stream that's both readable and writable
- transform: a duplex stream that transforms data
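Each type can be created with Node's built-in fs and stream modules. A minimal sketch (the file names are placeholders):

import { createReadStream, createWriteStream } from 'node:fs';
import { PassThrough, Transform } from 'node:stream';

const readable = createReadStream('./in.txt');    // readable: file source
const writable = createWriteStream('./out.txt');  // writable: file destination
const duplex = new PassThrough();                 // duplex: readable and writable
const transform = new Transform({                 // transform: duplex that alters data
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});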
Each chunk of data is returned as a Buffer object, which represents a fixed-length sequence of bytes. You may need to convert it to a string or another appropriate type for processing.
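For example, a readable file stream emits Buffer chunks on each data event, and .toString() decodes the bytes (a minimal sketch; the file name is a placeholder):

import { createReadStream } from 'node:fs';

createReadStream('./example.html').on('data', chunk => {
  console.log(chunk instanceof Buffer);  // true
  console.log(chunk.toString('utf8'));   // decode the bytes to a UTF-8 string
});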
The application creates file read and write streams and instantiates a new Compress object:
// modules (assumes an ES module project; Compress is the transform class defined earlier)
import { createReadStream, createWriteStream } from 'node:fs';

// process stream
const
  readStream = createReadStream(input),
  writeStream = createWriteStream(output),
  compress = new Compress();

console.log(`processing ${ input }`);
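The Compress class itself is defined earlier in the project. As a rough sketch of its shape, it could be a Transform subclass that squeezes runs of whitespace and records statistics; the whitespace logic here is an assumption, but the chunks, lengthOrig, and lengthNew properties match those the logging code below relies on:

import { Transform } from 'node:stream';

// hypothetical sketch of the Compress transform
class Compress extends Transform {
  chunks = 0;      // number of chunks processed
  lengthOrig = 0;  // total bytes received
  lengthNew = 0;   // total bytes emitted

  _transform(chunk, encoding, callback) {
    const out = chunk.toString().replace(/\s+/g, ' ');
    this.chunks++;
    this.lengthOrig += chunk.length;
    this.lengthNew += out.length;
    callback(null, out);
  }
}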
The file read stream has a .pipe() method, which feeds the incoming data through a series of functions that may (or may not) alter the contents. Here, the data is piped through the compress transform before that output is piped to the writable file stream. A final on('finish') event handler executes once the stream has ended:
readStream.pipe(compress).pipe(writeStream).on('finish', () => {
  console.log(`file size ${ compress.lengthOrig }`);
  console.log(`output    ${ output }`);
  console.log(`chunks    ${ compress.chunks }`);
  console.log(`file size ${ compress.lengthNew } - saved ${ Math.round((compress.lengthOrig - compress.lengthNew) / compress.lengthOrig * 100) }%`);
});
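Note that .pipe() does not forward errors along the chain, so each stream would need its own error handler. Node's stream.pipeline() helper chains the same streams and reports the first failure through a single callback; a minimal alternative sketch:

import { pipeline } from 'node:stream';

pipeline(readStream, compress, writeStream, err => {
  if (err) {
    console.error('pipeline failed:', err);
  }
  else {
    console.log(`processed ${ compress.chunks } chunks`);
  }
});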
Run the project code with an example HTML file of any size:
node filestream.js ./test/example.html ./test/output.html
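The input and output paths used above map to these two arguments. They would typically be read from process.argv (a minimal sketch; the project's real argument handling may differ):

// read the input and output file paths from the command line
const [input, output] = process.argv.slice(2);

if (!input || !output) {
  console.error('Usage: node filestream.js <input> <output>');
  process.exit(1);
}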
A chunk can be any size, so it may split the incoming data at inconvenient points, such as partway through a token the transform needs to see whole.
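A common remedy is to hold back the tail of each chunk and prepend it to the next so no token is ever processed in two halves. A hedged sketch of the idea (not the project's actual code), using trailing whitespace as the boundary case:

import { Transform } from 'node:stream';

// sketch: carry trailing whitespace forward so a run split across
// two chunks is still collapsed as a single run
class SafeCompress extends Transform {
  #carry = '';

  _transform(chunk, encoding, callback) {
    const text = this.#carry + chunk.toString();
    // hold back trailing whitespace; it may continue in the next chunk
    const match = text.match(/\s+$/);
    this.#carry = match ? match[0] : '';
    const keep = match ? text.slice(0, match.index) : text;
    callback(null, keep.replace(/\s+/g, ' '));
  }

  _flush(callback) {
    // emit whatever remained buffered when the stream ends
    callback(null, this.#carry.replace(/\s+/g, ' '));
  }
}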