Streams pass data along in small parts, so an app stays fast and light on memory. This article explains Node.js Streams and how to use them.
Node.js Streams handle data in small parts. This method allows an app to work with files or network data without loading it all at once.
Understanding Streams in Node.js
Node.js Streams read or write data in parts. They help apps use less memory and stay fast.
Streams save memory and speed up data transfer. They allow code to process data as it arrives.
You can create a stream with the built-in fs module. You can also use streams to send data from one place to another.
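For example, here is a minimal sketch (the file name big.txt is just an assumption for illustration) of sending data from a file to HTTP clients with streams, so the file is never loaded into memory in full:
const http = require('http');
const fs = require('fs');

// Stream the file to each client chunk by chunk instead of buffering it.
const server = http.createServer((req, res) => {
  fs.createReadStream('big.txt').pipe(res);
});

server.listen(3000);
Memory use stays flat no matter how large the file is, because each chunk is passed along and then released.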
Here are the four types of streams in Node.js:
- Readable
- Writable
- Duplex
- Transform
We will cover each type in depth in the following section.
Types of Streams in Node.js
Readable Streams:
A readable stream allows data to flow from a source. You can read a file without loading it all at once.
const fs = require('fs');
const readable = fs.createReadStream('file.txt');
readable.on('data', chunk => {
  console.log(chunk.toString());
});
This code opens a file, reads it part by part, and prints each part to the console.
Writable Streams:
A writable stream sends data to a target. You can write a file without loading it all at once.
const fs = require('fs');
const writable = fs.createWriteStream('file.txt');
writable.write('This is a line\n');
writable.end();
This code opens a file, writes one line of text, and then ends the stream.
Duplex Streams:
A duplex stream can read and write at the same time. This type allows two-way data flow.
const { Duplex } = require('stream');
const duplex = new Duplex({
  // read() is a stub here: nothing is pushed, so the read side stays idle
  read(size) {},
  write(chunk, encoding, callback) {
    console.log(chunk.toString());
    callback();
  }
});

duplex.write('Hello World');
This code creates a custom duplex stream and writes data to it. The write side logs each chunk; a real read side would push data from read().
Transform Streams:
A transform stream changes the data as it flows. It reads data, changes it, then writes it out.
const { Transform } = require('stream');
const upper = new Transform({
  transform(chunk, encoding, callback) {
    // Convert each chunk to upper case and push it to the readable side
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

upper.write('hello');
upper.on('data', data => console.log(data.toString()));
This code takes text input and sends it out in upper case.
Piping Streams in Node.js
You can link one stream to another. This action is called piping. It allows data to flow from a source to a target in one step.
const fs = require('fs');
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');
readable.pipe(writable);
This code copies data from one file to another by piping it through streams.
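Pipes can also be chained, because pipe() returns the destination stream. As a small sketch (assuming input.txt exists), you could compress a file while copying it by piping through the gzip transform stream from the built-in zlib module:
const fs = require('fs');
const zlib = require('zlib');

// Read input.txt, gzip it on the fly, and write the compressed result.
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));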
How to Handle Errors in Streams
Streams may fail due to file issues or network issues. You should always watch for errors. Use the error event to catch problems.
const fs = require('fs');
const readable = fs.createReadStream('file.txt');
readable.on('error', err => {
  console.error('Error happened:', err.message);
});
This code listens for the error event and prints the message to the console.
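Note that pipe() does not forward errors from one stream to the next. For piped chains, the built-in stream.pipeline helper is one option: it wires the streams together, destroys all of them on failure, and reports any error in a single callback. A minimal sketch, assuming input.txt exists:
const fs = require('fs');
const { pipeline } = require('stream');

// pipeline() connects the streams and calls the callback once,
// with an error if any stage fails.
pipeline(
  fs.createReadStream('input.txt'),
  fs.createWriteStream('copy.txt'),
  err => {
    if (err) {
      console.error('Pipeline failed:', err.message);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);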
Examples
Read a Large File:
const fs = require('fs');
const readable = fs.createReadStream('big.txt');
readable.on('data', chunk => {
  console.log('Received', chunk.length, 'bytes');
});
This example shows how to read a large file in small parts and log the size of each part. It saves memory and speeds up file reading.
Write Data to a File with a Stream:
const fs = require('fs');
const writable = fs.createWriteStream('log.txt');
for (let i = 0; i < 5; i++) {
  writable.write(`Line ${i}\n`);
}
writable.end();
This example writes several lines to a file without holding them all in memory at once. It creates a smooth data flow to the file.
Transform Data Before Writing:
const fs = require('fs');
const { Transform } = require('stream');
const upper = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('upper.txt');
readable.pipe(upper).pipe(writable);
This example reads data from one file, converts it to upper case, and writes it to another file in a single flow. It uses a transform stream to change the data.
Duplex Stream for Custom Data Flow:
const { Duplex } = require('stream');
const duplex = new Duplex({
  read(size) {},
  write(chunk, encoding, callback) {
    console.log('Write:', chunk.toString());
    callback();
  }
});

duplex.write('Test');
This example shows a custom duplex stream that accepts data on its write side and can also expose data for reading. It demonstrates two-way flow in a stream.
Wrapping Up
You learned what Node.js Streams are and how to use them. You also saw the types of streams and how to handle errors.
Here is a quick recap:
- Streams read or write data in parts.
- Streams can connect to each other with pipes.
- Streams can change data during flow.
FAQs
What are Node JS Streams and why are they used?
Node.js Streams allow handling large amounts of data piece by piece instead of loading it all at once. They are used for efficiency and real-time processing.
What are the different types of streams in Node JS?
- Readable Streams: Read data (e.g., fs.createReadStream)
- Writable Streams: Write data (e.g., fs.createWriteStream)
- Duplex Streams: Both read and write (e.g., TCP sockets)
- Transform Streams: Modify data while reading/writing
How do you create a readable stream in Node JS?
You can create a readable stream using the fs module.
Example:
const fs = require('fs');
const stream = fs.createReadStream('example.txt', 'utf8');
stream.on('data', (chunk) => {
  console.log('Chunk received:', chunk);
});
stream.on('end', () => {
  console.log('No more data.');
});
What is the difference between streams and buffers?
- Buffers: Store entire data in memory before processing.
- Streams: Process data piece by piece as it arrives.
This makes streams more memory-efficient and faster for large files or continuous data flow.
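As a small sketch of the difference (big.txt is an assumed file name), fs.readFile buffers the whole file before its callback runs, while a stream hands over chunks as they arrive:
const fs = require('fs');

// Buffer approach: the entire file sits in memory before the callback runs.
fs.readFile('big.txt', (err, data) => {
  if (err) throw err;
  console.log('Buffered', data.length, 'bytes at once');
});

// Stream approach: each chunk is handled as soon as it arrives.
fs.createReadStream('big.txt').on('data', chunk => {
  console.log('Streamed', chunk.length, 'bytes');
});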
How do you pipe streams in Node JS?
The pipe() method connects a readable stream to a writable stream.
Example:
const fs = require('fs');
const reader = fs.createReadStream('input.txt');
const writer = fs.createWriteStream('output.txt');
reader.pipe(writer);
This will copy the content of input.txt into output.txt.