# Stream

<!--introduced_in=v0.10.0-->

> Stability: 2 - Stable

A stream is an abstract interface for working with streaming data in Node.js.
The `stream` module provides a base API that makes it easy to build objects
that implement the stream interface.

There are many stream objects provided by Node.js. For instance, a
[request to an HTTP server][http-incoming-message] and [`process.stdout`][]
are both stream instances.

Streams can be readable, writable, or both. All streams are instances of
[`EventEmitter`][].

The `stream` module can be accessed using:

```js
const stream = require('stream');
```

While it is important for all Node.js users to understand how streams work,
the `stream` module itself is most useful for developers who are creating new
types of stream instances. Developers who are primarily *consuming* stream
objects will rarely (if ever) need to use the `stream` module directly.

## Organization of this Document

This document is divided into two primary sections with a third section for
additional notes. The first section explains the elements of the stream API that
are required to *use* streams within an application. The second section explains
the elements of the API that are required to *implement* new types of streams.

## Types of Streams

There are four fundamental stream types within Node.js:

* [Readable][] - streams from which data can be read (for example
  [`fs.createReadStream()`][]).
* [Writable][] - streams to which data can be written (for example
  [`fs.createWriteStream()`][]).
* [Duplex][] - streams that are both Readable and Writable (for example
  [`net.Socket`][]).
* [Transform][] - Duplex streams that can modify or transform the data as it
  is written and read (for example [`zlib.createDeflate()`][]).

### Object Mode

All streams created by Node.js APIs operate exclusively on strings and `Buffer`
(or `Uint8Array`) objects. It is possible, however, for stream implementations
to work with other types of JavaScript values (with the exception of `null`,
which serves a special purpose within streams). Such streams are considered to
operate in "object mode".

Stream instances are switched into object mode using the `objectMode` option
when the stream is created. Attempting to switch an existing stream into
object mode is not safe.

### Buffering

<!--type=misc-->

Both [Writable][] and [Readable][] streams will store data in an internal
buffer that can be retrieved using `writable._writableState.getBuffer()` or
`readable._readableState.buffer`, respectively.

The amount of data potentially buffered depends on the `highWaterMark` option
passed into the stream's constructor. For normal streams, the `highWaterMark`
option specifies a [total number of bytes][hwm-gotcha]. For streams operating
in object mode, the `highWaterMark` specifies a total number of objects.

Data is buffered in Readable streams when the implementation calls
[`stream.push(chunk)`][stream-push]. If the consumer of the stream does not
call [`stream.read()`][stream-read], the data will sit in the internal
queue until it is consumed.

Once the total size of the internal read buffer reaches the threshold specified
by `highWaterMark`, the stream will temporarily stop reading data from the
underlying resource until the data currently buffered can be consumed (that is,
the stream will stop calling the internal `readable._read()` method that is
used to fill the read buffer).

Data is buffered in Writable streams when the
[`writable.write(chunk)`][stream-write] method is called repeatedly. While the
total size of the internal write buffer is below the threshold set by
`highWaterMark`, calls to `writable.write()` will return `true`. Once
the size of the internal buffer reaches or exceeds the `highWaterMark`, `false`
will be returned.

A key goal of the `stream` API, particularly the [`stream.pipe()`] method,
is to limit the buffering of data to acceptable levels such that sources and
destinations of differing speeds will not overwhelm the available memory.

Because [Duplex][] and [Transform][] streams are both Readable and Writable,
each maintains *two* separate internal buffers used for reading and writing,
allowing each side to operate independently of the other while maintaining an
appropriate and efficient flow of data. For example, [`net.Socket`][] instances
are [Duplex][] streams whose Readable side allows consumption of data received
*from* the socket and whose Writable side allows writing data *to* the socket.
Because data may be written to the socket at a faster or slower rate than data
is received, it is important for each side to operate (and buffer) independently
of the other.
## API for Stream Consumers

<!--type=misc-->

Almost all Node.js applications, no matter how simple, use streams in some
manner. The following is an example of using streams in a Node.js application
that implements an HTTP server:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  // req is an http.IncomingMessage, which is a Readable Stream.
  // res is an http.ServerResponse, which is a Writable Stream.

  let body = '';
  // Get the data as utf8 strings.
  // If an encoding is not set, Buffer objects will be received.
  req.setEncoding('utf8');

  // Readable streams emit 'data' events once a listener is added.
  req.on('data', (chunk) => {
    body += chunk;
  });

  // The 'end' event indicates that the entire body has been received.
  req.on('end', () => {
    try {
      const data = JSON.parse(body);
      // Write back something interesting to the user:
      res.write(typeof data);
      res.end();
    } catch (er) {
      // uh oh! bad json!
      res.statusCode = 400;
      return res.end(`error: ${er.message}`);
    }
  });
});

server.listen(1337);

// $ curl localhost:1337 -d "{}"
// object
// $ curl localhost:1337 -d "\"foo\""
// string
// $ curl localhost:1337 -d "not json"
// error: Unexpected token o in JSON at position 1
```

[Writable][] streams (such as `res` in the example) expose methods such as
`write()` and `end()` that are used to write data onto the stream.

[Readable][] streams use the [`EventEmitter`][] API for notifying application
code when data is available to be read off the stream. That available data can
be read from the stream in multiple ways.

Both [Writable][] and [Readable][] streams use the [`EventEmitter`][] API in
various ways to communicate the current state of the stream.

[Duplex][] and [Transform][] streams are both [Writable][] and [Readable][].

Applications that are either writing data to or consuming data from a stream
are not required to implement the stream interfaces directly and will generally
have no reason to call `require('stream')`.

Developers wishing to implement new types of streams should refer to the
section [API for Stream Implementers][].

### Writable Streams

Writable streams are an abstraction for a *destination* to which data is
written.

Examples of [Writable][] streams include:

* [HTTP requests, on the client][]
* [HTTP responses, on the server][]
* [fs write streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdin][]
* [`process.stdout`][], [`process.stderr`][]

*Note*: Some of these examples are actually [Duplex][] streams that implement
the [Writable][] interface.

All [Writable][] streams implement the interface defined by the
`stream.Writable` class.

While specific instances of [Writable][] streams may differ in various ways,
all Writable streams follow the same fundamental usage pattern as illustrated
in the example below:

```js
const myStream = getWritableStreamSomehow();
myStream.write('some data');
myStream.write('some more data');
myStream.end('done writing data');
```

#### Class: stream.Writable
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: 'close'
<!-- YAML
added: v0.9.4
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
that no more events will be emitted, and no further computation will occur.

Not all Writable streams will emit the `'close'` event.

##### Event: 'drain'
<!-- YAML
added: v0.9.4
-->

If a call to [`stream.write(chunk)`][stream-write] returns `false`, the
`'drain'` event will be emitted when it is appropriate to resume writing data
to the stream.

```js
// Write the data to the supplied writable stream one million times.
// Be attentive to back-pressure.
function writeOneMillionTimes(writer, data, encoding, callback) {
  let i = 1000000;
  write();
  function write() {
    let ok = true;
    do {
      i--;
      if (i === 0) {
        // last time!
        writer.write(data, encoding, callback);
      } else {
        // see if we should continue, or wait
        // don't pass the callback, because we're not done yet.
        ok = writer.write(data, encoding);
      }
    } while (i > 0 && ok);
    if (i > 0) {
      // had to stop early!
      // write some more once it drains
      writer.once('drain', write);
    }
  }
}
```

##### Event: 'error'
<!-- YAML
added: v0.9.4
-->

* {Error}

The `'error'` event is emitted if an error occurred while writing or piping
data. The listener callback is passed a single `Error` argument when called.

*Note*: The stream is not closed when the `'error'` event is emitted.

##### Event: 'finish'
<!-- YAML
added: v0.9.4
-->

The `'finish'` event is emitted after the [`stream.end()`][stream-end] method
has been called, and all data has been flushed to the underlying system.

```js
const writer = getWritableStreamSomehow();
for (let i = 0; i < 100; i++) {
  writer.write(`hello, #${i}!\n`);
}
writer.end('This is the end\n');
writer.on('finish', () => {
  console.error('All writes are now complete.');
});
```

##### Event: 'pipe'
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} source stream that is piping to this writable

The `'pipe'` event is emitted when the [`stream.pipe()`][] method is called on
a readable stream, adding this writable to its set of destinations.

```js
const assert = require('assert');

const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('pipe', (src) => {
  console.error('something is piping into the writer');
  assert.equal(src, reader);
});
reader.pipe(writer);
```

##### Event: 'unpipe'
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} The source stream that
  [unpiped][`stream.unpipe()`] this writable

The `'unpipe'` event is emitted when the [`stream.unpipe()`][] method is called
on a [Readable][] stream, removing this [Writable][] from its set of
destinations.

```js
const assert = require('assert');

const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('unpipe', (src) => {
  console.error('Something has stopped piping into the writer.');
  assert.equal(src, reader);
});
reader.pipe(writer);
reader.unpipe(writer);
```

##### writable.cork()
<!-- YAML
added: v0.11.2
-->

The `writable.cork()` method forces all written data to be buffered in memory.
The buffered data will be flushed when either the [`stream.uncork()`][] or
[`stream.end()`][stream-end] methods are called.

The primary intent of `writable.cork()` is to avoid a situation where writing
many small chunks of data to a stream causes a backup in the internal
buffer that would have an adverse impact on performance. In such situations,
implementations that implement the `writable._writev()` method can perform
buffered writes in a more optimized manner.

See also: [`writable.uncork()`][].

##### writable.end([chunk][, encoding][, callback])
<!-- YAML
added: v0.9.4
changes:
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
-->

* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
  not operating in object mode, `chunk` must be a string, `Buffer` or
  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
  other than `null`.
* `encoding` {string} The encoding, if `chunk` is a string
* `callback` {Function} Optional callback for when the stream is finished

Calling the `writable.end()` method signals that no more data will be written
to the [Writable][]. The optional `chunk` and `encoding` arguments allow one
final additional chunk of data to be written immediately before closing the
stream. If provided, the optional `callback` function is attached as a listener
for the [`'finish'`][] event.

Calling the [`stream.write()`][stream-write] method after calling
[`stream.end()`][stream-end] will raise an error.

```js
const fs = require('fs');

// Write 'hello, ' and then end with 'world!'.
const file = fs.createWriteStream('example.txt');
file.write('hello, ');
file.end('world!');
// Writing more now is not allowed!
```

##### writable.setDefaultEncoding(encoding)
<!-- YAML
added: v0.11.15
changes:
  - version: v6.1.0
    pr-url: https://github.com/nodejs/node/pull/5040
    description: This method now returns a reference to `writable`.
-->

* `encoding` {string} The new default encoding
* Returns: `this`

The `writable.setDefaultEncoding()` method sets the default `encoding` for a
[Writable][] stream.

##### writable.uncork()
<!-- YAML
added: v0.11.2
-->

The `writable.uncork()` method flushes all data buffered since
[`stream.cork()`][] was called.

When using [`writable.cork()`][] and `writable.uncork()` to manage the buffering
of writes to a stream, it is recommended that calls to `writable.uncork()` be
deferred using `process.nextTick()`. Doing so allows batching of all
`writable.write()` calls that occur within a given Node.js event loop phase.

```js
stream.cork();
stream.write('some ');
stream.write('data ');
process.nextTick(() => stream.uncork());
```

If the [`writable.cork()`][] method is called multiple times on a stream, the
same number of calls to `writable.uncork()` must be made to flush the buffered
data.

```js
stream.cork();
stream.write('some ');
stream.cork();
stream.write('data ');
process.nextTick(() => {
  stream.uncork();
  // The data will not be flushed until uncork() is called a second time.
  stream.uncork();
});
```

See also: [`writable.cork()`][].

##### writable.writableHighWaterMark
<!-- YAML
added: v9.3.0
-->

Returns the value of `highWaterMark` passed when constructing this
`Writable`.

##### writable.write(chunk[, encoding][, callback])
<!-- YAML
added: v0.9.4
changes:
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
  - version: v6.0.0
    pr-url: https://github.com/nodejs/node/pull/6170
    description: Passing `null` as the `chunk` parameter will always be
                 considered invalid now, even in object mode.
-->

* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
  not operating in object mode, `chunk` must be a string, `Buffer` or
  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
  other than `null`.
* `encoding` {string} The encoding, if `chunk` is a string
* `callback` {Function} Callback for when this chunk of data is flushed
* Returns: {boolean} `false` if the stream wishes for the calling code to
  wait for the `'drain'` event to be emitted before continuing to write
  additional data; otherwise `true`.

The `writable.write()` method writes some data to the stream, and calls the
supplied `callback` once the data has been fully handled. If an error
occurs, the `callback` *may or may not* be called with the error as its
first argument. To reliably detect write errors, add a listener for the
`'error'` event.

The return value is `true` if the internal buffer, after admitting `chunk`, is
still below the `highWaterMark` configured when the stream was created. If
`false` is returned, further attempts to write data to the stream should
stop until the [`'drain'`][] event is emitted.

While a stream is not draining, calls to `write()` will buffer `chunk`, and
return `false`. Once all currently buffered chunks are drained (accepted for
delivery by the operating system), the `'drain'` event will be emitted.
It is recommended that once `write()` returns `false`, no more chunks be
written until the `'drain'` event is emitted. While calling `write()` on a
stream that is not draining is allowed, Node.js will buffer all written chunks
until maximum memory usage occurs, at which point it will abort
unconditionally. Even before it aborts, high memory usage will cause poor
garbage collector performance and high RSS (which is not typically released
back to the system, even after the memory is no longer required). Since TCP
sockets may never drain if the remote peer does not read the data, writing to
a socket that is not draining may lead to a remotely exploitable
vulnerability.

Writing data while the stream is not draining is particularly
problematic for a [Transform][], because `Transform` streams are paused
by default until they are piped or a `'data'` or `'readable'` event handler
is added.

If the data to be written can be generated or fetched on demand, it is
recommended to encapsulate the logic into a [Readable][] and use
[`stream.pipe()`][]. However, if calling `write()` is preferred, it is
possible to respect backpressure and avoid memory issues using the
[`'drain'`][] event:

```js
function write(data, cb) {
  if (!stream.write(data)) {
    stream.once('drain', cb);
  } else {
    process.nextTick(cb);
  }
}

// Wait for cb to be called before doing any other write.
write('hello', () => {
  console.log('write completed, do more writes now');
});
```

A Writable stream in object mode will always ignore the `encoding` argument.

##### writable.destroy([error])
<!-- YAML
added: v8.0.0
-->

* Returns: `this`

Destroy the stream, and emit the passed error. After this call, the
writable stream has ended. Implementors should not override this method,
but instead implement [`writable._destroy`][writable-_destroy].

### Readable Streams

Readable streams are an abstraction for a *source* from which data is
consumed.

Examples of Readable streams include:

* [HTTP responses, on the client][http-incoming-message]
* [HTTP requests, on the server][http-incoming-message]
* [fs read streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdout and stderr][]
* [`process.stdin`][]

All [Readable][] streams implement the interface defined by the
`stream.Readable` class.

#### Two Modes

Readable streams effectively operate in one of two modes: flowing and paused.

When in flowing mode, data is read from the underlying system automatically
and provided to an application as quickly as possible using events via the
[`EventEmitter`][] interface.

In paused mode, the [`stream.read()`][stream-read] method must be called
explicitly to read chunks of data from the stream.

All [Readable][] streams begin in paused mode but can be switched to flowing
mode in one of the following ways:

* Adding a [`'data'`][] event handler.
* Calling the [`stream.resume()`][stream-resume] method.
* Calling the [`stream.pipe()`][] method to send the data to a [Writable][].

The Readable can switch back to paused mode using one of the following:

* If there are no pipe destinations, by calling the
  [`stream.pause()`][stream-pause] method.
* If there are pipe destinations, by removing any [`'data'`][] event
  handlers, and removing all pipe destinations by calling the
  [`stream.unpipe()`][] method.

The important concept to remember is that a Readable will not generate data
until a mechanism for either consuming or ignoring that data is provided. If
the consuming mechanism is disabled or taken away, the Readable will *attempt*
to stop generating the data.

*Note*: For backwards compatibility reasons, removing [`'data'`][] event
handlers will **not** automatically pause the stream. Also, if there are piped
destinations, then calling [`stream.pause()`][stream-pause] will not guarantee
that the stream will *remain* paused once those destinations drain and ask for
more data.

*Note*: If a [Readable][] is switched into flowing mode and there are no
consumers available to handle the data, that data will be lost. This can occur,
for instance, when the `readable.resume()` method is called without a listener
attached to the `'data'` event, or when a `'data'` event handler is removed
from the stream.

#### Three States

The "two modes" of operation for a Readable stream are a simplified abstraction
for the more complicated internal state management that is happening within the
Readable stream implementation.

Specifically, at any given point in time, every Readable is in one of three
possible states:

* `readable._readableState.flowing = null`
* `readable._readableState.flowing = false`
* `readable._readableState.flowing = true`

When `readable._readableState.flowing` is `null`, no mechanism for consuming
the stream's data is provided, so the stream will not generate its data. While
in this state, attaching a listener for the `'data'` event, calling the
`readable.pipe()` method, or calling the `readable.resume()` method will switch
`readable._readableState.flowing` to `true`, causing the Readable to begin
actively emitting events as data is generated.

Calling `readable.pause()`, `readable.unpipe()`, or receiving "back pressure"
will cause `readable._readableState.flowing` to be set to `false`,
temporarily halting the flowing of events but *not* halting the generation of
data. While in this state, attaching a listener for the `'data'` event
will not cause `readable._readableState.flowing` to switch to `true`.

```js
const { PassThrough, Writable } = require('stream');
const pass = new PassThrough();
const writable = new Writable();

pass.pipe(writable);
pass.unpipe(writable);
// flowing is now false

pass.on('data', (chunk) => { console.log(chunk.toString()); });
pass.write('ok'); // will not emit 'data'
pass.resume(); // must be called to make 'data' events be emitted
```

While `readable._readableState.flowing` is `false`, data may be accumulating
within the stream's internal buffer.

#### Choose One

The Readable stream API evolved across multiple Node.js versions and provides
multiple methods of consuming stream data. In general, developers should choose
*one* of the methods of consuming data and *should never* use multiple methods
to consume data from a single stream.

Use of the `readable.pipe()` method is recommended for most users as it has been
implemented to provide the easiest way of consuming stream data. Developers who
require more fine-grained control over the transfer and generation of data can
use the [`EventEmitter`][] and `readable.pause()`/`readable.resume()` APIs.

#### Class: stream.Readable
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: 'close'
<!-- YAML
added: v0.9.4
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
that no more events will be emitted, and no further computation will occur.

Not all [Readable][] streams will emit the `'close'` event.

##### Event: 'data'
<!-- YAML
added: v0.9.4
-->

* `chunk` {Buffer|string|any} The chunk of data. For streams that are not
  operating in object mode, the chunk will be either a string or `Buffer`.
  For streams that are in object mode, the chunk can be any JavaScript value
  other than `null`.

The `'data'` event is emitted whenever the stream is relinquishing ownership of
a chunk of data to a consumer. This may occur whenever the stream is switched
into flowing mode by calling `readable.pipe()`, `readable.resume()`, or by
attaching a listener callback to the `'data'` event. The `'data'` event will
also be emitted whenever the `readable.read()` method is called and a chunk of
data is available to be returned.

Attaching a `'data'` event listener to a stream that has not been explicitly
paused will switch the stream into flowing mode. Data will then be passed as
soon as it is available.

The listener callback will be passed the chunk of data as a string if a default
encoding has been specified for the stream using the
`readable.setEncoding()` method; otherwise the data will be passed as a
`Buffer`.

```js
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});
```

##### Event: 'end'
<!-- YAML
added: v0.9.4
-->

The `'end'` event is emitted when there is no more data to be consumed from
the stream.

*Note*: The `'end'` event **will not be emitted** unless the data is
completely consumed. This can be accomplished by switching the stream into
flowing mode, or by calling [`stream.read()`][stream-read] repeatedly until
all data has been consumed.

```js
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});
readable.on('end', () => {
  console.log('There will be no more data.');
});
```

##### Event: 'error'
<!-- YAML
added: v0.9.4
-->

* {Error}

The `'error'` event may be emitted by a Readable implementation at any time.
Typically, this may occur if the underlying stream is unable to generate data
due to an underlying internal failure, or when a stream implementation attempts
to push an invalid chunk of data.

The listener callback will be passed a single `Error` object.

##### Event: 'readable'
<!-- YAML
added: v0.9.4
-->

The `'readable'` event is emitted when there is data available to be read from
the stream. In some cases, attaching a listener for the `'readable'` event will
cause some amount of data to be read into an internal buffer.

```js
const readable = getReadableStreamSomehow();
readable.on('readable', () => {
  // There is some data to read now.
});
```

The `'readable'` event will also be emitted once the end of the stream data
has been reached but before the `'end'` event is emitted.

Effectively, the `'readable'` event indicates that the stream has new
information: either new data is available or the end of the stream has been
reached. In the former case, [`stream.read()`][stream-read] will return the
available data. In the latter case, [`stream.read()`][stream-read] will return
`null`. For instance, in the following example, `foo.txt` is an empty file:

```js
const fs = require('fs');
const rr = fs.createReadStream('foo.txt');
rr.on('readable', () => {
  console.log(`readable: ${rr.read()}`);
});
rr.on('end', () => {
  console.log('end');
});
```

The output of running this script is:

```txt
$ node test.js
readable: null
end
```

*Note*: In general, the `readable.pipe()` and `'data'` event mechanisms are
easier to understand than the `'readable'` event.
However, handling `'readable'` might result in increased throughput.

##### readable.isPaused()
<!-- YAML
added: v0.11.14
-->

* Returns: {boolean}

The `readable.isPaused()` method returns the current operating state of the
Readable. This is used primarily by the mechanism that underlies the
`readable.pipe()` method. In most typical cases, there will be no reason to
use this method directly.

```js
const readable = new stream.Readable();

readable.isPaused(); // === false
readable.pause();
readable.isPaused(); // === true
readable.resume();
readable.isPaused(); // === false
```

##### readable.pause()
<!-- YAML
added: v0.9.4
-->

* Returns: `this`

The `readable.pause()` method will cause a stream in flowing mode to stop
emitting [`'data'`][] events, switching out of flowing mode. Any data that
becomes available will remain in the internal buffer.

```js
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
  readable.pause();
  console.log('There will be no additional data for 1 second.');
  setTimeout(() => {
    console.log('Now data will start flowing again.');
    readable.resume();
  }, 1000);
});
```

831
832
##### readable.pipe(destination[, options])
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} The destination for writing data
* `options` {Object} Pipe options
  * `end` {boolean} End the writer when the reader ends. Defaults to `true`.

The `readable.pipe()` method attaches a [Writable][] stream to the `readable`,
causing it to switch automatically into flowing mode and push all of its data
to the attached [Writable][]. The flow of data will be automatically managed so
that the destination Writable stream is not overwhelmed by a faster Readable
stream.

The following example pipes all of the data from the `readable` into a file
named `file.txt`:

```js
const fs = require('fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt'
readable.pipe(writable);
```

It is possible to attach multiple Writable streams to a single Readable stream.

The `readable.pipe()` method returns a reference to the *destination* stream,
making it possible to set up chains of piped streams:

```js
const fs = require('fs');
const zlib = require('zlib');
const r = fs.createReadStream('file.txt');
const z = zlib.createGzip();
const w = fs.createWriteStream('file.txt.gz');
r.pipe(z).pipe(w);
```

By default, [`stream.end()`][stream-end] is called on the destination Writable
stream when the source Readable stream emits [`'end'`][], so that the
destination is no longer writable. To disable this default behavior, the `end`
option can be passed as `false`, causing the destination stream to remain open,
as illustrated in the following example:

```js
reader.pipe(writer, { end: false });
reader.on('end', () => {
  writer.end('Goodbye\n');
});
```

One important caveat is that if the Readable stream emits an error during
processing, the Writable destination *is not closed* automatically. If an
error occurs, it will be necessary to *manually* close each stream in order
to prevent memory leaks.

*Note*: The [`process.stderr`][] and [`process.stdout`][] Writable streams are
never closed until the Node.js process exits, regardless of the specified
options.

##### readable.readableHighWaterMark
<!-- YAML
added: v9.3.0
-->

Returns the value of `highWaterMark` passed when constructing this
`Readable`.

##### readable.read([size])
<!-- YAML
added: v0.9.4
-->

* `size` {number} Optional argument to specify how much data to read.
* Returns: {string|Buffer|null}

The `readable.read()` method pulls some data out of the internal buffer and
returns it. If no data is available to be read, `null` is returned. By default,
the data will be returned as a `Buffer` object unless an encoding has been
specified using the `readable.setEncoding()` method or the stream is operating
in object mode.

The optional `size` argument specifies a specific number of bytes to read. If
`size` bytes are not available to be read, `null` will be returned *unless*
the stream has ended, in which case all of the data remaining in the internal
buffer will be returned.

If the `size` argument is not specified, all of the data contained in the
internal buffer will be returned.

The `readable.read()` method should only be called on Readable streams operating
in paused mode. In flowing mode, `readable.read()` is called automatically until
the internal buffer is fully drained.

```js
const readable = getReadableStreamSomehow();
readable.on('readable', () => {
  let chunk;
  while (null !== (chunk = readable.read())) {
    console.log(`Received ${chunk.length} bytes of data.`);
  }
});
```

In general, it is recommended that developers avoid the use of the `'readable'`
event and the `readable.read()` method in favor of using either
`readable.pipe()` or the `'data'` event.

A Readable stream in object mode will always return a single item from
a call to [`readable.read(size)`][stream-read], regardless of the value of the
`size` argument.

*Note*: If the `readable.read()` method returns a chunk of data, a `'data'`
event will also be emitted.

*Note*: Calling [`stream.read([size])`][stream-read] after the [`'end'`][]
event has been emitted will return `null`. No runtime error will be raised.

##### readable.resume()
<!-- YAML
added: v0.9.4
-->

* Returns: `this`

The `readable.resume()` method causes an explicitly paused Readable stream to
resume emitting [`'data'`][] events, switching the stream into flowing mode.

The `readable.resume()` method can be used to fully consume the data from a
stream without actually processing any of that data, as illustrated in the
following example:

```js
getReadableStreamSomehow()
  .resume()
  .on('end', () => {
    console.log('Reached the end, but did not read anything.');
  });
```

##### readable.setEncoding(encoding)
<!-- YAML
added: v0.9.4
-->

* `encoding` {string} The encoding to use.
* Returns: `this`

The `readable.setEncoding()` method sets the character encoding for
data read from the Readable stream.

By default, no encoding is assigned and stream data will be returned as
`Buffer` objects. Setting an encoding causes the stream data
to be returned as strings of the specified encoding rather than as `Buffer`
objects. For instance, calling `readable.setEncoding('utf8')` will cause the
output data to be interpreted as UTF-8 data, and passed as strings. Calling
`readable.setEncoding('hex')` will cause the data to be encoded in hexadecimal
string format.

The Readable stream will properly handle multi-byte characters delivered through
the stream that would otherwise become improperly decoded if simply pulled from
the stream as `Buffer` objects.

```js
const assert = require('assert');
const readable = getReadableStreamSomehow();
readable.setEncoding('utf8');
readable.on('data', (chunk) => {
  assert.equal(typeof chunk, 'string');
  console.log('got %d characters of string data', chunk.length);
});
```