
Mastering Node.js Streams: The Ultimate Guide to Memory-Efficient File Processing

In Node.js, it's common to use fs.readFile and fs.writeFile for reading and writing files. While these methods are simple and convenient, they have serious limitations when you work with large files or need more granular control.

For example:

  • Memory issues: These methods load the entire file into memory at once, which is inefficient and risky for large files.
  • No pause/resume control: You can't stop reading midway to do something else and then resume.
  • No streaming: You can't start reading from a specific position or read in small chunks.

To solve these issues, Node.js provides the stream module—a powerful abstraction for reading and writing data in chunks, rather than all at once.

Let’s explore how to use streams effectively.

Node.js Streams

Stream Types in Node.js

Stream Type | Name               | Purpose                            | Use Cases
Readable    | Readable Streams   | Read data from a source in chunks  | Reading large files, network responses
Writable    | Writable Streams   | Write data to a target in chunks   | Writing logs, large exports
Duplex      | Duplex Streams     | Read and write at the same time    | Network sockets, proxies
Transform   | Transform Streams  | Modify data as it's read/written   | Compressing, encrypting, transforming
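For concreteness, here is a minimal sketch of where each type shows up in practice (the file paths, host, and port are placeholders, not part of the examples that follow):

js
import fs from 'node:fs';
import net from 'node:net';
import zlib from 'node:zlib';

// Readable: a source you pull chunks from
const readable = fs.createReadStream('./assets/users.json');

// Writable: a destination you push chunks into
const writable = fs.createWriteStream('./output/log.txt');

// Duplex: readable and writable at the same time, e.g. a TCP socket
const duplex = net.connect({ host: 'example.com', port: 80 });

// Transform: a Duplex that modifies data passing through it
const transform = zlib.createGzip();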

Reading Files with a Stream

To read files efficiently, use fs.createReadStream. It lets you read data in chunks, specify start/end byte positions, and control the buffer size via highWaterMark.

Example: Reading Specific Bytes in Chunks

js
import fs from 'node:fs';

let fileContent = '';

const fileReader = fs.createReadStream('./assets/users.json', {
  start: 5,
  end: 20,
  highWaterMark: 10, // Read 10 bytes per chunk
});

fileReader.on('open', () => console.log('File opened'));

fileReader.on('data', (chunk) => {
  fileContent += chunk;
});

fileReader.on('end', () => {
  console.log('Finished reading file:');
  console.log(fileContent);
});

fileReader.on('close', () => console.log('File closed'));

fileReader.on('error', (err) => console.error('Read error:', err));

Example: Pause and Resume File Reading

js
import fs from 'node:fs';

const reader = fs.createReadStream('./assets/users.json', {
  start: 0,
  end: 50,
  highWaterMark: 15,
});

reader.on('data', (chunk) => {
  console.log('Chunk:', chunk.toString());
  reader.pause();
  console.log('Pausing for 1.5 seconds...');
  setTimeout(() => {
    reader.resume();
  }, 1500);
});

You can manually stop a stream mid-read using reader.destroy().
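For example, here is a minimal sketch (assuming the same users.json file and an arbitrary 1 KiB threshold) that stops reading as soon as enough data has arrived; destroy() discards the rest and emits 'close':

js
import fs from 'node:fs';

const reader = fs.createReadStream('./assets/users.json');

let received = 0;

reader.on('data', (chunk) => {
  received += chunk.length;
  // Stop the stream once we have seen at least 1 KiB
  if (received >= 1024) {
    reader.destroy();
  }
});

reader.on('close', () => console.log('Stream destroyed after', received, 'bytes'));
reader.on('error', (err) => console.error('Read error:', err));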

Writing Files with a Stream

Use fs.createWriteStream to write large amounts of data efficiently.

Example: Writing to a File in Chunks

js
import fs from 'node:fs';

const fileWriter = fs.createWriteStream('./output/log.txt');

fileWriter.write('Event Start\n');
fileWriter.write('Processing...\n');
fileWriter.write('Event Complete\n');

fileWriter.close();

fileWriter.on('open', (fd) => {
  console.log('Write stream opened, FD:', fd);
});

fileWriter.on('finish', () => {
  console.log('Write completed.');
});

fileWriter.on('close', () => {
  console.log('Write stream closed.');
});

fileWriter.on('error', (err) => {
  console.error('Write error:', err);
});
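
Note that write() returns false once the stream's internal buffer (sized by highWaterMark) fills up; at that point you should wait for the 'drain' event before writing more. A minimal sketch of that pattern, assuming a hypothetical ./output/big.txt file and an arbitrary line count:

js
import fs from 'node:fs';

const writer = fs.createWriteStream('./output/big.txt');

function writeMany(count) {
  let i = 0;

  function writeChunk() {
    let ok = true;
    while (i < count && ok) {
      // write() returns false when the internal buffer is full
      ok = writer.write(`line ${i}\n`);
      i++;
    }
    if (i < count) {
      // Resume writing only after the buffer has drained
      writer.once('drain', writeChunk);
    } else {
      writer.end();
    }
  }

  writeChunk();
}

writeMany(100_000);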

close() vs end()

For fs write streams, close() finishes the stream much like end() does. The practical difference is that end() can also accept a final chunk of data, so passing data to end() is shorthand for one last write() followed by end():

js
// These two are equivalent:
writer.end('Final message\n');

// is the same as:
writer.write('Final message\n');
writer.end();

Writing to a Specific Position

The start option sets the byte offset at which writing begins. With the default 'w' flag the file is truncated first, so the skipped bytes are left as null bytes; to overwrite part of an existing file instead, use flags: 'r+'.

js
import fs from 'node:fs';

const writer = fs.createWriteStream('./output/offset.txt', {
  flags: 'w', // 'w' truncates the file; use 'r+' to modify an existing file in place
  start: 10,  // Begin writing at byte offset 10
});

writer.write('OffsetWrite');
writer.end();

Copying Files with pipe()

The pipe() method connects a readable stream to a writable one and handles backpressure for you, making it ideal for file copying.

Method 1: Using pipe()

js
import fs from 'node:fs';

const source = fs.createReadStream('./data/source.txt');
const destination = fs.createWriteStream('./data/destination.txt');

source.pipe(destination);

// 'finish' fires once all data has been flushed to the destination
destination.on('finish', () => {
  console.log('File copy completed.');
});

Method 2: Manual Pipe

js
import fs from 'node:fs';

const reader = fs.createReadStream('./data/source.txt');
const writer = fs.createWriteStream('./data/destination.txt');

// Unlike pipe(), this simplified version does not handle backpressure
reader.on('data', (chunk) => {
  writer.write(chunk);
});

reader.on('end', () => {
  writer.end(); // Finish the writer once all data has been read
  console.log('File copy done.');
});
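
For built-in error handling and cleanup, the stream module also offers pipeline(). A minimal sketch using the promise-based version from node:stream/promises with the same file paths:

js
import fs from 'node:fs';
import { pipeline } from 'node:stream/promises';

try {
  // pipeline() wires the streams together, forwards errors,
  // and destroys both streams if anything fails
  await pipeline(
    fs.createReadStream('./data/source.txt'),
    fs.createWriteStream('./data/destination.txt')
  );
  console.log('File copy completed.');
} catch (err) {
  console.error('Copy failed:', err);
}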

Conclusion

Node.js streams are essential for handling large-scale file processing with precision and performance. Whether you're building a log system, video processor, or file transfer tool, mastering streams gives you memory efficiency, speed, and granular control.

Start simple—read and write text files with chunks. Then explore advanced use cases like Transform streams for compression or encryption.
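
As a first step in that direction, here is a minimal sketch of a Transform stream in action, compressing a file with zlib.createGzip() (the file paths are placeholders):

js
import fs from 'node:fs';
import zlib from 'node:zlib';
import { pipeline } from 'node:stream/promises';

// Read -> Transform (gzip) -> Write, all processed in chunks
await pipeline(
  fs.createReadStream('./data/source.txt'),
  zlib.createGzip(),
  fs.createWriteStream('./data/source.txt.gz')
);

console.log('Compression completed.');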
