Using the 'data' event, we can process a file one small chunk at a time instead, without holding the whole file in memory. For example, we may wish to count the number of bytes in a file.
Let's create a new folder called infinite-read with an index.js file.
Assuming we are using a Unix-like machine (including macOS and Linux), we can tweak this example to count the number of bytes in /dev/urandom. This is an infinite pseudo-file that produces random data.
Let's write the following into index.js:
const fs = require('fs')

const rs = fs.createReadStream('/dev/urandom')
let size = 0
rs.on('data', (data) => {
  size += data.length
  console.log('File size:', size)
})
Now we can run our program:
$ node index.js
Notice that the program does not crash, even though the file is infinite. It just keeps counting bytes! Since the stream never ends, press Ctrl+C when you want to stop the program.
Scalability is one of the best features of streams in general: most programs written using streams scale well with input of any size.