Mastering Node.js Streams: The Ultimate Guide to Memory-Efficient File Processing
In Node.js, it's common to use `fs.readFile` and `fs.writeFile` for reading and writing files. While these methods are simple and convenient, they have serious limitations when you work with large files or need more granular control.
For example:
- Memory issues: These methods load the entire file into memory at once, which is inefficient and risky for large files (see the sketch after this list).
- No pause/resume control: You can't stop reading midway to do something else and then resume.
- No streaming: You can't start reading from a specific position or read in small chunks.
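To see the first problem concretely, here is a minimal sketch of the `fs.readFile` approach (the file path is illustrative): the entire file is buffered in memory before the callback ever runs.

```js
import fs from 'node:fs';

// fs.readFile buffers the whole file before invoking the callback.
// For a multi-gigabyte file, this can exhaust the process's memory.
fs.readFile('./assets/users.json', 'utf8', (err, data) => {
  if (err) {
    console.error('Read error:', err);
    return;
  }
  console.log('Loaded', data.length, 'characters in one go');
});
```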
To solve these issues, Node.js provides the `stream` module, a powerful abstraction for reading and writing data in chunks rather than all at once.
Let’s explore how to use streams effectively.

Stream Types in Node.js
| Stream Type | Name | Purpose | Use Cases |
| --- | --- | --- | --- |
| Readable | Readable Streams | Read data from a source in chunks | Reading large files, network responses |
| Writable | Writable Streams | Write data to a target in chunks | Writing logs, large exports |
| Duplex | Duplex Streams | Read and write at the same time | Network sockets, proxies |
| Transform | Transform Streams | Modify data as it's read/written | Compressing, encrypting, transforming |
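The file-focused examples below use only Readable and Writable streams. For the last row of the table, here is a minimal sketch of a custom Transform stream that upper-cases whatever flows through it (the variable names are illustrative):

```js
import { Transform } from 'node:stream';

// A Transform stream receives chunks, modifies them, and pushes the result downstream.
const upperCaser = new Transform({
  transform(chunk, encoding, callback) {
    // chunk arrives as a Buffer by default; convert, transform, and pass it on.
    callback(null, chunk.toString().toUpperCase());
  },
});

// Pipe stdin through the transform and out to stdout.
process.stdin.pipe(upperCaser).pipe(process.stdout);
```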
Reading Files with a Stream
To read files efficiently, use `fs.createReadStream`. It lets you read data in chunks, specify start/end byte positions, and control the size of each chunk via the `highWaterMark` option.
Example: Reading Specific Bytes in Chunks
```js
import fs from 'node:fs';

let fileContent = '';

const fileReader = fs.createReadStream('./assets/users.json', {
  start: 5,
  end: 20,
  highWaterMark: 10, // Read 10 bytes per chunk
});

fileReader.on('open', () => console.log('File opened'));

fileReader.on('data', (chunk) => {
  fileContent += chunk;
});

fileReader.on('end', () => {
  console.log('Finished reading file:');
  console.log(fileContent);
});

fileReader.on('close', () => console.log('File closed'));

fileReader.on('error', (err) => console.error('Read error:', err));
```
Example: Pause and Resume File Reading
```js
import fs from 'node:fs';

const reader = fs.createReadStream('./assets/users.json', {
  start: 0,
  end: 50,
  highWaterMark: 15,
});

reader.on('data', (chunk) => {
  console.log('Chunk:', chunk.toString());
  reader.pause();
  console.log('Pausing for 1.5 seconds...');
  setTimeout(() => {
    reader.resume();
  }, 1500);
});
```
You can manually stop a stream mid-read with `reader.destroy()`.
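For example, a sketch that abandons the read once enough data has arrived (the 100-byte threshold is arbitrary):

```js
import fs from 'node:fs';

const reader = fs.createReadStream('./assets/users.json');
let received = 0;

reader.on('data', (chunk) => {
  received += chunk.length;
  if (received >= 100) {
    // Stop reading early; no further 'data' or 'end' events will fire.
    reader.destroy();
  }
});

// 'close' is still emitted after destroy(), so cleanup can happen here.
reader.on('close', () => console.log('Stream destroyed after', received, 'bytes'));
```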
Writing Files with a Stream
Use `fs.createWriteStream` to write large amounts of data efficiently.
Example: Writing to a File in Chunks
```js
import fs from 'node:fs';

const fileWriter = fs.createWriteStream('./output/log.txt');

fileWriter.write('Event Start\n');
fileWriter.write('Processing...\n');
fileWriter.write('Event Complete\n');

fileWriter.close();

fileWriter.on('open', (fd) => {
  console.log('Write stream opened, FD:', fd);
});

fileWriter.on('finish', () => {
  console.log('Write completed.');
});

fileWriter.on('close', () => {
  console.log('Write stream closed.');
});

fileWriter.on('error', (err) => {
  console.error('Write error:', err);
});
```
`close()` vs `end()`
```js
// These two are equivalent:
writer.end('Final message\n');
// is the same as:
writer.write('Final message\n');
writer.end();

// For fs write streams, writer.close() also ends the stream and flushes
// buffered writes, so it behaves much like end() with no final chunk.
```
Writing to a Specific Position
```js
import fs from 'node:fs';

const writer = fs.createWriteStream('./output/offset.txt', {
  flags: 'w', // 'w' truncates the file first; use 'r+' to write into an existing file without truncating it
  start: 10,  // begin writing at byte offset 10
});

writer.write('OffsetWrite');
writer.end();
```
Copying Files with pipe()
The `pipe()` method connects a readable stream to a writable one, which makes it ideal for file copying.
Method 1: Using pipe()
```js
import fs from 'node:fs';

const source = fs.createReadStream('./data/source.txt');
const destination = fs.createWriteStream('./data/destination.txt');

source.pipe(destination);

// 'finish' fires on the destination once all piped data has been flushed to it.
destination.on('finish', () => {
  console.log('File copy completed.');
});
```
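One caveat: `pipe()` does not forward errors from the source to the destination, so each stream needs its own 'error' handler. Here is a sketch of the same copy using `stream.pipeline`, which wires up error handling and cleanup in one call:

```js
import fs from 'node:fs';
import { pipeline } from 'node:stream';

pipeline(
  fs.createReadStream('./data/source.txt'),
  fs.createWriteStream('./data/destination.txt'),
  (err) => {
    if (err) {
      console.error('Copy failed:', err);
    } else {
      console.log('File copy completed.');
    }
  }
);
```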
Method 2: Manual Pipe
```js
import fs from 'node:fs';

const reader = fs.createReadStream('./data/source.txt');
const writer = fs.createWriteStream('./data/destination.txt');

reader.on('data', (chunk) => {
  writer.write(chunk);
});

reader.on('end', () => {
  // End the writer once the source has been fully read.
  writer.end();
  console.log('File copy done.');
});
```
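The manual approach above ignores backpressure: if the writer cannot keep up, chunks accumulate in its internal buffer. Here is a minimal sketch of how the `write()` return value and the 'drain' event are typically used to throttle the reader (this is essentially what `pipe()` does for you):

```js
import fs from 'node:fs';

const reader = fs.createReadStream('./data/source.txt');
const writer = fs.createWriteStream('./data/destination.txt');

reader.on('data', (chunk) => {
  // write() returns false once the writer's internal buffer is full.
  if (!writer.write(chunk)) {
    reader.pause();
    // Resume reading only after the buffered data has been flushed.
    writer.once('drain', () => reader.resume());
  }
});

reader.on('end', () => writer.end());

writer.on('finish', () => console.log('File copy done.'));
```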
Conclusion
Node.js streams are essential for handling large-scale file processing with precision and performance. Whether you're building a log system, video processor, or file transfer tool, mastering streams gives you memory efficiency, speed, and granular control.
Start simple: read and write text files in chunks. Then explore advanced use cases like Transform streams for compression or encryption.
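As a pointer for that next step, here is a minimal sketch that gzips a file by piping it through the built-in zlib Transform stream (the file paths are illustrative):

```js
import fs from 'node:fs';
import zlib from 'node:zlib';

// zlib.createGzip() returns a Transform stream: plain bytes in, gzip bytes out.
fs.createReadStream('./data/source.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('./data/source.txt.gz'))
  .on('finish', () => console.log('Compression complete.'));
```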