# Stream

<!--introduced_in=v0.10.0-->

> Stability: 2 - Stable

A stream is an abstract interface for working with streaming data in Node.js.
The `stream` module provides a base API that makes it easy to build objects
that implement the stream interface.

There are many stream objects provided by Node.js. For instance, a
[request to an HTTP server][http-incoming-message] and [`process.stdout`][]
are both stream instances.

Streams can be readable, writable, or both. All streams are instances of
[`EventEmitter`][].

The `stream` module can be accessed using:

```js
const stream = require('stream');
```

While it is important to understand how streams work, the `stream` module itself
is most useful for developers who are creating new types of stream instances.
Developers who are primarily *consuming* stream objects will rarely need to use
the `stream` module directly.

## Organization of this Document

This document is divided into two primary sections with a third section for
additional notes. The first section explains the elements of the stream API that
are required to *use* streams within an application. The second section explains
the elements of the API that are required to *implement* new types of streams.

## Types of Streams

There are four fundamental stream types within Node.js:

* [`Writable`][] - streams to which data can be written (for example,
  [`fs.createWriteStream()`][]).
* [`Readable`][] - streams from which data can be read (for example,
  [`fs.createReadStream()`][]).
* [`Duplex`][] - streams that are both `Readable` and `Writable` (for example,
  [`net.Socket`][]).
* [`Transform`][] - `Duplex` streams that can modify or transform the data as it
  is written and read (for example, [`zlib.createDeflate()`][]).

Additionally, this module includes the utility functions [pipeline][] and
[finished][].

### Object Mode

All streams created by Node.js APIs operate exclusively on strings and `Buffer`
(or `Uint8Array`) objects. It is possible, however, for stream implementations
to work with other types of JavaScript values (with the exception of `null`,
which serves a special purpose within streams). Such streams are considered to
operate in "object mode".

Stream instances are switched into object mode using the `objectMode` option
when the stream is created. Attempting to switch an existing stream into
object mode is not safe.
### Buffering

<!--type=misc-->

Both [`Writable`][] and [`Readable`][] streams will store data in an internal
buffer that can be retrieved using `writable.writableBuffer` or
`readable.readableBuffer`, respectively.

The amount of data potentially buffered depends on the `highWaterMark` option
passed into the stream's constructor. For normal streams, the `highWaterMark`
option specifies a [total number of bytes][hwm-gotcha]. For streams operating
in object mode, the `highWaterMark` specifies a total number of objects.

Data is buffered in `Readable` streams when the implementation calls
[`stream.push(chunk)`][stream-push]. If the consumer of the stream does not
call [`stream.read()`][stream-read], the data will sit in the internal
queue until it is consumed.

Once the total size of the internal read buffer reaches the threshold specified
by `highWaterMark`, the stream will temporarily stop reading data from the
underlying resource until the data currently buffered can be consumed (that is,
the stream will stop calling the internal `readable._read()` method that is
used to fill the read buffer).

Data is buffered in `Writable` streams when the
[`writable.write(chunk)`][stream-write] method is called repeatedly. While the
total size of the internal write buffer is below the threshold set by
`highWaterMark`, calls to `writable.write()` will return `true`. Once
the size of the internal buffer reaches or exceeds the `highWaterMark`, `false`
will be returned.

A key goal of the `stream` API, particularly the [`stream.pipe()`][] method,
is to limit the buffering of data to acceptable levels such that sources and
destinations of differing speeds will not overwhelm the available memory.

Because [`Duplex`][] and [`Transform`][] streams are both `Readable` and
`Writable`, each maintains *two* separate internal buffers used for reading and
writing, allowing each side to operate independently of the other while
maintaining an appropriate and efficient flow of data. For example,
[`net.Socket`][] instances are [`Duplex`][] streams whose `Readable` side allows
consumption of data received *from* the socket and whose `Writable` side allows
writing data *to* the socket. Because data may be written to the socket at a
faster or slower rate than data is received, it is important for each side to
operate (and buffer) independently of the other.

## API for Stream Consumers

<!--type=misc-->

Almost all Node.js applications, no matter how simple, use streams in some
manner. The following is an example of using streams in a Node.js application
that implements an HTTP server:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  // req is an http.IncomingMessage, which is a Readable Stream
  // res is an http.ServerResponse, which is a Writable Stream

  let body = '';
  // Get the data as utf8 strings.
  // If an encoding is not set, Buffer objects will be received.
  req.setEncoding('utf8');

  // Readable streams emit 'data' events once a listener is added
  req.on('data', (chunk) => {
    body += chunk;
  });

  // The 'end' event indicates that the entire body has been received
  req.on('end', () => {
    try {
      const data = JSON.parse(body);
      // Write back something interesting to the user:
      res.write(typeof data);
      res.end();
    } catch (er) {
      // uh oh! bad json!
      res.statusCode = 400;
      return res.end(`error: ${er.message}`);
    }
  });
});

server.listen(1337);

// $ curl localhost:1337 -d "{}"
// object
// $ curl localhost:1337 -d "\"foo\""
// string
// $ curl localhost:1337 -d "not json"
// error: Unexpected token o in JSON at position 1
```

[`Writable`][] streams (such as `res` in the example) expose methods such as
`write()` and `end()` that are used to write data onto the stream.

[`Readable`][] streams use the [`EventEmitter`][] API for notifying application
code when data is available to be read off the stream. That available data can
be read from the stream in multiple ways.

Both [`Writable`][] and [`Readable`][] streams use the [`EventEmitter`][] API in
various ways to communicate the current state of the stream.

[`Duplex`][] and [`Transform`][] streams are both [`Writable`][] and
[`Readable`][].

Applications that are either writing data to or consuming data from a stream
are not required to implement the stream interfaces directly and will generally
have no reason to call `require('stream')`.

Developers wishing to implement new types of streams should refer to the
section [API for Stream Implementers][].

### Writable Streams

Writable streams are an abstraction for a *destination* to which data is
written.

Examples of [`Writable`][] streams include:

* [HTTP requests, on the client][]
* [HTTP responses, on the server][]
* [fs write streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdin][]
* [`process.stdout`][], [`process.stderr`][]

Some of these examples are actually [`Duplex`][] streams that implement the
[`Writable`][] interface.

All [`Writable`][] streams implement the interface defined by the
`stream.Writable` class.

While specific instances of [`Writable`][] streams may differ in various ways,
all `Writable` streams follow the same fundamental usage pattern as illustrated
in the example below:

```js
const myStream = getWritableStreamSomehow();
myStream.write('some data');
myStream.write('some more data');
myStream.end('done writing data');
```

#### Class: stream.Writable
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: 'close'
<!-- YAML
added: v0.9.4
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
that no more events will be emitted, and no further computation will occur.

Not all `Writable` streams will emit the `'close'` event.

##### Event: 'drain'
<!-- YAML
added: v0.9.4
-->

If a call to [`stream.write(chunk)`][stream-write] returns `false`, the
`'drain'` event will be emitted when it is appropriate to resume writing data
to the stream.

```js
// Write the data to the supplied writable stream one million times.
// Be attentive to back-pressure.
function writeOneMillionTimes(writer, data, encoding, callback) {
  let i = 1000000;
  write();
  function write() {
    let ok = true;
    do {
      i--;
      if (i === 0) {
        // last time!
        writer.write(data, encoding, callback);
      } else {
        // see if we should continue, or wait
        // don't pass the callback, because we're not done yet.
        ok = writer.write(data, encoding);
      }
    } while (i > 0 && ok);
    if (i > 0) {
      // had to stop early!
      // write some more once it drains
      writer.once('drain', write);
    }
  }
}
```

##### Event: 'error'
<!-- YAML
added: v0.9.4
-->

* {Error}

The `'error'` event is emitted if an error occurred while writing or piping
data. The listener callback is passed a single `Error` argument when called.

The stream is not closed when the `'error'` event is emitted.
##### Event: 'finish'
<!-- YAML
added: v0.9.4
-->

The `'finish'` event is emitted after the [`stream.end()`][stream-end] method
has been called, and all data has been flushed to the underlying system.

```js
const writer = getWritableStreamSomehow();
for (let i = 0; i < 100; i++) {
  writer.write(`hello, #${i}!\n`);
}
writer.end('This is the end\n');
writer.on('finish', () => {
  console.error('All writes are now complete.');
});
```

##### Event: 'pipe'
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} source stream that is piping to this writable

The `'pipe'` event is emitted when the [`stream.pipe()`][] method is called on
a readable stream, adding this writable to its set of destinations.

```js
const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('pipe', (src) => {
  console.error('something is piping into the writer');
  assert.equal(src, reader);
});
reader.pipe(writer);
```

##### Event: 'unpipe'
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} The source stream that
  [unpiped][`stream.unpipe()`] this writable

The `'unpipe'` event is emitted when the [`stream.unpipe()`][] method is called
on a [`Readable`][] stream, removing this [`Writable`][] from its set of
destinations.

This is also emitted in case this [`Writable`][] stream emits an error when a
[`Readable`][] stream pipes into it.

```js
const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('unpipe', (src) => {
  console.error('Something has stopped piping into the writer.');
  assert.equal(src, reader);
});
reader.pipe(writer);
reader.unpipe(writer);
```

##### writable.cork()
<!-- YAML
added: v0.11.2
-->

The `writable.cork()` method forces all written data to be buffered in memory.
The buffered data will be flushed when either the [`stream.uncork()`][] or
[`stream.end()`][stream-end] methods are called.

The primary intent of `writable.cork()` is to ensure that writing many small
chunks of data to a stream does not cause a backup in the internal
buffer that would have an adverse impact on performance. In such situations,
implementations that implement the `writable._writev()` method can perform
buffered writes in a more optimized manner.

See also: [`writable.uncork()`][].

##### writable.destroy([error])
<!-- YAML
added: v8.0.0
-->

* `error` {Error}
* Returns: {this}

Destroy the stream, and emit the passed `'error'` and a `'close'` event.
After this call, the writable stream has ended and subsequent calls
to `write()` or `end()` will result in an `ERR_STREAM_DESTROYED` error.
Implementors should not override this method,
but instead implement [`writable._destroy()`][writable-_destroy].
##### writable.end([chunk][, encoding][, callback])
<!-- YAML
added: v0.9.4
changes:
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/18780
    description: This method now returns a reference to `writable`.
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
-->

* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
  not operating in object mode, `chunk` must be a string, `Buffer` or
  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
  other than `null`.
* `encoding` {string} The encoding, if `chunk` is a string
* `callback` {Function} Optional callback for when the stream is finished
* Returns: {this}

Calling the `writable.end()` method signals that no more data will be written
to the [`Writable`][]. The optional `chunk` and `encoding` arguments allow one
final additional chunk of data to be written immediately before closing the
stream. If provided, the optional `callback` function is attached as a listener
for the [`'finish'`][] event.

Calling the [`stream.write()`][stream-write] method after calling
[`stream.end()`][stream-end] will raise an error.

```js
// write 'hello, ' and then end with 'world!'
const fs = require('fs');
const file = fs.createWriteStream('example.txt');
file.write('hello, ');
file.end('world!');
// writing more now is not allowed!
```

##### writable.setDefaultEncoding(encoding)
<!-- YAML
added: v0.11.15
changes:
  - version: v6.1.0
    pr-url: https://github.com/nodejs/node/pull/5040
    description: This method now returns a reference to `writable`.
-->

* `encoding` {string} The new default encoding
* Returns: {this}

The `writable.setDefaultEncoding()` method sets the default `encoding` for a
[`Writable`][] stream.
##### writable.uncork()
<!-- YAML
added: v0.11.2
-->

The `writable.uncork()` method flushes all data buffered since
[`stream.cork()`][] was called.

When using [`writable.cork()`][] and `writable.uncork()` to manage the buffering
of writes to a stream, it is recommended that calls to `writable.uncork()` be
deferred using `process.nextTick()`. Doing so allows batching of all
`writable.write()` calls that occur within a given Node.js event loop phase.

```js
stream.cork();
stream.write('some ');
stream.write('data ');
process.nextTick(() => stream.uncork());
```

If the [`writable.cork()`][] method is called multiple times on a stream, the
same number of calls to `writable.uncork()` must be called to flush the buffered
data.

```js
stream.cork();
stream.write('some ');
stream.cork();
stream.write('data ');
process.nextTick(() => {
  stream.uncork();
  // The data will not be flushed until uncork() is called a second time.
  stream.uncork();
});
```

See also: [`writable.cork()`][].

##### writable.writableHighWaterMark
<!-- YAML
added: v9.3.0
-->

* {number}

Return the value of `highWaterMark` passed when constructing this
`Writable`.

##### writable.writableLength
<!-- YAML
added: v9.4.0
-->

This property contains the number of bytes (or objects) in the queue
ready to be written. The value provides introspection data regarding
the status of the `highWaterMark`.
##### writable.write(chunk[, encoding][, callback])
<!-- YAML
added: v0.9.4
changes:
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
  - version: v6.0.0
    pr-url: https://github.com/nodejs/node/pull/6170
    description: Passing `null` as the `chunk` parameter will always be
                 considered invalid now, even in object mode.
-->

* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
  not operating in object mode, `chunk` must be a string, `Buffer` or
  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
  other than `null`.
* `encoding` {string} The encoding, if `chunk` is a string
* `callback` {Function} Callback for when this chunk of data is flushed
* Returns: {boolean} `false` if the stream wishes for the calling code to
  wait for the `'drain'` event to be emitted before continuing to write
  additional data; otherwise `true`.

The `writable.write()` method writes some data to the stream, and calls the
supplied `callback` once the data has been fully handled. If an error
occurs, the `callback` *may or may not* be called with the error as its
first argument. To reliably detect write errors, add a listener for the
`'error'` event.

The return value is `true` if the size of the internal buffer is less than the
`highWaterMark` configured when the stream was created after admitting `chunk`.
If `false` is returned, further attempts to write data to the stream should
stop until the [`'drain'`][] event is emitted.

While a stream is not draining, calls to `write()` will buffer `chunk`, and
return `false`. Once all currently buffered chunks are drained (accepted for
delivery by the operating system), the `'drain'` event will be emitted.
It is recommended that once `write()` returns `false`, no more chunks be written
until the `'drain'` event is emitted. While calling `write()` on a stream that
is not draining is allowed, Node.js will buffer all written chunks until
maximum memory usage occurs, at which point it will abort unconditionally.
Even before it aborts, high memory usage will cause poor garbage collector
performance and high RSS (which is not typically released back to the system,
even after the memory is no longer required). Since TCP sockets may never
drain if the remote peer does not read the data, writing to a socket that is
not draining may lead to a remotely exploitable vulnerability.

Writing data while the stream is not draining is particularly
problematic for a [`Transform`][], because the `Transform` streams are paused
by default until they are piped or a `'data'` or `'readable'` event handler
is added.

If the data to be written can be generated or fetched on demand, it is
recommended to encapsulate the logic into a [`Readable`][] and use
[`stream.pipe()`][]. However, if calling `write()` is preferred, it is
possible to respect backpressure and avoid memory issues using the
[`'drain'`][] event:

```js
function write(data, cb) {
  if (!stream.write(data)) {
    stream.once('drain', cb);
  } else {
    process.nextTick(cb);
  }
}

// Wait for cb to be called before doing any other write.
write('hello', () => {
  console.log('write completed, do more writes now');
});
```

A `Writable` stream in object mode will always ignore the `encoding` argument.

### Readable Streams

Readable streams are an abstraction for a *source* from which data is
consumed.

Examples of `Readable` streams include:

* [HTTP responses, on the client][http-incoming-message]
* [HTTP requests, on the server][http-incoming-message]
* [fs read streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdout and stderr][]
* [`process.stdin`][]

All [`Readable`][] streams implement the interface defined by the
`stream.Readable` class.

#### Two Reading Modes

`Readable` streams effectively operate in one of two modes: flowing and
paused. These modes are separate from [object mode][object-mode].
A [`Readable`][] stream can be in object mode or not, regardless of whether
it is in flowing mode or paused mode.

* In flowing mode, data is read from the underlying system automatically
  and provided to an application as quickly as possible using events via the
  [`EventEmitter`][] interface.

* In paused mode, the [`stream.read()`][stream-read] method must be called
  explicitly to read chunks of data from the stream.

All [`Readable`][] streams begin in paused mode but can be switched to flowing
mode in one of the following ways:

* Adding a [`'data'`][] event handler.
* Calling the [`stream.resume()`][stream-resume] method.
* Calling the [`stream.pipe()`][] method to send the data to a [`Writable`][].

The `Readable` can switch back to paused mode using one of the following:

* If there are no pipe destinations, by calling the
  [`stream.pause()`][stream-pause] method.
* If there are pipe destinations, by removing all pipe destinations.
  Multiple pipe destinations may be removed by calling the
  [`stream.unpipe()`][] method.

The important concept to remember is that a `Readable` will not generate data
until a mechanism for either consuming or ignoring that data is provided. If
the consuming mechanism is disabled or taken away, the `Readable` will *attempt*
to stop generating the data.

For backwards compatibility reasons, removing [`'data'`][] event handlers will
**not** automatically pause the stream. Also, if there are piped destinations,
then calling [`stream.pause()`][stream-pause] will not guarantee that the
stream will *remain* paused once those destinations drain and ask for more data.

If a [`Readable`][] is switched into flowing mode and there are no consumers
available to handle the data, that data will be lost. This can occur, for
instance, when the `readable.resume()` method is called without a listener
attached to the `'data'` event, or when a `'data'` event handler is removed
from the stream.

Adding a [`'readable'`][] event handler automatically makes the stream
stop flowing, and the data has to be consumed via
[`readable.read()`][stream-read]. If the [`'readable'`][] event handler is
removed, then the stream will start flowing again if there is a
[`'data'`][] event handler.

#### Three States

The "two modes" of operation for a `Readable` stream are a simplified
abstraction for the more complicated internal state management that is happening
within the `Readable` stream implementation.

Specifically, at any given point in time, every `Readable` is in one of three
possible states:

* `readable.readableFlowing === null`
* `readable.readableFlowing === false`
* `readable.readableFlowing === true`

When `readable.readableFlowing` is `null`, no mechanism for consuming the
stream's data is provided. Therefore, the stream will not generate data.
While in this state, attaching a listener for the `'data'` event, calling the
`readable.pipe()` method, or calling the `readable.resume()` method will switch
`readable.readableFlowing` to `true`, causing the `Readable` to begin actively
emitting events as data is generated.

Calling `readable.pause()`, `readable.unpipe()`, or receiving backpressure
will cause the `readable.readableFlowing` to be set as `false`,
temporarily halting the flowing of events but *not* halting the generation of
data. While in this state, attaching a listener for the `'data'` event
will not switch `readable.readableFlowing` to `true`.

```js
const { PassThrough, Writable } = require('stream');
const pass = new PassThrough();
const writable = new Writable();

pass.pipe(writable);
pass.unpipe(writable);
// readableFlowing is now false

pass.on('data', (chunk) => { console.log(chunk.toString()); });
pass.write('ok'); // will not emit 'data'
pass.resume(); // must be called to make stream emit 'data'
```

While `readable.readableFlowing` is `false`, data may be accumulating
within the stream's internal buffer.

#### Choose One API Style

The `Readable` stream API evolved across multiple Node.js versions and provides
multiple methods of consuming stream data. In general, developers should choose
*one* of the methods of consuming data and *should never* use multiple methods
to consume data from a single stream. Specifically, using a combination
of `on('data')`, `on('readable')`, `pipe()`, or async iterators could
lead to unintuitive behavior.

Use of the `readable.pipe()` method is recommended for most users as it has been
implemented to provide the easiest way of consuming stream data. Developers who
require more fine-grained control over the transfer and generation of data can
use the [`EventEmitter`][] and `readable.on('readable')`/`readable.read()`
or the `readable.pause()`/`readable.resume()` APIs.
#### Class: stream.Readable
689
<!-- YAML
690
added: v0.9.4
691
-->
692
693
<!--type=class-->
694
695
##### Event: 'close'
696
<!-- YAML
697
added: v0.9.4
698
-->
699
700
The `'close'` event is emitted when the stream and any of its underlying
701
resources (a file descriptor, for example) have been closed. The event indicates
702
that no more events will be emitted, and no further computation will occur.
703
704
Not all [`Readable`][] streams will emit the `'close'` event.
705
706
##### Event: 'data'
707
<!-- YAML
708
added: v0.9.4
709
-->
710
711
* `chunk` {Buffer|string|any} The chunk of data. For streams that are not
712
operating in object mode, the chunk will be either a string or `Buffer`.
713
For streams that are in object mode, the chunk can be any JavaScript value
714
other than `null`.
715
716
The `'data'` event is emitted whenever the stream is relinquishing ownership of
717
a chunk of data to a consumer. This may occur whenever the stream is switched
718
in flowing mode by calling `readable.pipe()`, `readable.resume()`, or by
719
attaching a listener callback to the `'data'` event. The `'data'` event will
720
also be emitted whenever the `readable.read()` method is called and a chunk of
721
data is available to be returned.
722
723
Attaching a `'data'` event listener to a stream that has not been explicitly
724
paused will switch the stream into flowing mode. Data will then be passed as
725
soon as it is available.
726
727
The listener callback will be passed the chunk of data as a string if a default
728
encoding has been specified for the stream using the
729
`readable.setEncoding()` method; otherwise the data will be passed as a
730
`Buffer`.
731
732
```js
733
const readable = getReadableStreamSomehow();
734
readable.on('data', (chunk) => {
735
console.log(`Received ${chunk.length} bytes of data.`);
736
});
737
```
738
739
##### Event: 'end'
740
<!-- YAML
741
added: v0.9.4
742
-->
743
744
The `'end'` event is emitted when there is no more data to be consumed from
745
the stream.
746
747
The `'end'` event **will not be emitted** unless the data is completely
748
consumed. This can be accomplished by switching the stream into flowing mode,
749
or by calling [`stream.read()`][stream-read] repeatedly until all data has been
750
consumed.
751
752
```js
753
const readable = getReadableStreamSomehow();
754
readable.on('data', (chunk) => {
755
console.log(`Received ${chunk.length} bytes of data.`);
756
});
757
readable.on('end', () => {
758
console.log('There will be no more data.');
759
});
760
```
761
762
##### Event: 'error'
763
<!-- YAML
764
added: v0.9.4
765
-->
766
767
* {Error}
768
769
The `'error'` event may be emitted by a `Readable` implementation at any time.
770
Typically, this may occur if the underlying stream is unable to generate data
771
due to an underlying internal failure, or when a stream implementation attempts
772
to push an invalid chunk of data.
773
774
The listener callback will be passed a single `Error` object.
775
776
##### Event: 'readable'
777
<!-- YAML
778
added: v0.9.4
779
changes:
780
- version: v10.0.0
781
pr-url: https://github.com/nodejs/node/pull/17979
782
description: >
783
The `'readable'` is always emitted in the next tick after `.push()`
784
is called
785
- version: v10.0.0
786
pr-url: https://github.com/nodejs/node/pull/18994
787
description: Using `'readable'` requires calling `.read()`.
788
-->
789
790
The `'readable'` event is emitted when there is data available to be read from
791
the stream. In some cases, attaching a listener for the `'readable'` event will
792
cause some amount of data to be read into an internal buffer.
793
794
```javascript
795
const readable = getReadableStreamSomehow();
796
readable.on('readable', function() {
797
// there is some data to read now
798
let data;
799
800
while (data = this.read()) {
801
console.log(data);
802
}
803
});
804
```
805
806
The `'readable'` event will also be emitted once the end of the stream data
807
has been reached but before the `'end'` event is emitted.
808
809
Effectively, the `'readable'` event indicates that the stream has new
810
information: either new data is available or the end of the stream has been
811
reached. In the former case, [`stream.read()`][stream-read] will return the
812
available data. In the latter case, [`stream.read()`][stream-read] will return
813
`null`. For instance, in the following example, `foo.txt` is an empty file:
814
815
```js
816
const fs = require('fs');
817
const rr = fs.createReadStream('foo.txt');
818
rr.on('readable', () => {
819
console.log(`readable: ${rr.read()}`);
820
});
821
rr.on('end', () => {
822
console.log('end');
823
});
824
```
825
826
The output of running this script is:
827
828
```txt
829
$ node test.js
830
readable: null
831
end
832
```

In general, the `readable.pipe()` and `'data'` event mechanisms are easier to
understand than the `'readable'` event. However, handling `'readable'` might
result in increased throughput.

If both `'readable'` and [`'data'`][] are used at the same time, `'readable'`
takes precedence in controlling the flow, i.e. `'data'` will be emitted
only when [`stream.read()`][stream-read] is called, and the
`readableFlowing` property will become `false`.
If there are `'data'` listeners when `'readable'` is removed, the stream
will start flowing, i.e. `'data'` events will be emitted without calling
`.resume()`.

##### readable.destroy([error])
<!-- YAML
added: v8.0.0
-->

* `error` {Error} Error which will be passed as payload in `'error'` event
* Returns: {this}

Destroy the stream, and emit `'error'` and `'close'`. After this call, the
readable stream will release any internal resources and subsequent calls
to `push()` will be ignored.
Implementors should not override this method, but instead implement
[`readable._destroy()`][readable-_destroy].

##### readable.isPaused()
<!-- YAML
added: v0.11.14
-->

* Returns: {boolean}

The `readable.isPaused()` method returns the current operating state of the
`Readable`. This is used primarily by the mechanism that underlies the
`readable.pipe()` method. In most typical cases, there will be no reason to
use this method directly.

```js
const readable = new stream.Readable();

readable.isPaused(); // === false
readable.pause();
readable.isPaused(); // === true
readable.resume();
readable.isPaused(); // === false
```

##### readable.pause()
<!-- YAML
added: v0.9.4
-->

* Returns: {this}

The `readable.pause()` method will cause a stream in flowing mode to stop
emitting [`'data'`][] events, switching out of flowing mode. Any data that
becomes available will remain in the internal buffer.

```js
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
  readable.pause();
  console.log('There will be no additional data for 1 second.');
  setTimeout(() => {
    console.log('Now data will start flowing again.');
    readable.resume();
  }, 1000);
});
```

The `readable.pause()` method has no effect if there is a `'readable'`
event listener.

##### readable.pipe(destination[, options])
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} The destination for writing data
* `options` {Object} Pipe options
  * `end` {boolean} End the writer when the reader ends. **Default:** `true`.
* Returns: {stream.Writable} The *destination*, allowing for a chain of pipes
  if it is a [`Duplex`][] or a [`Transform`][] stream

The `readable.pipe()` method attaches a [`Writable`][] stream to the `readable`,
causing it to switch automatically into flowing mode and push all of its data
to the attached [`Writable`][]. The flow of data will be automatically managed
so that the destination `Writable` stream is not overwhelmed by a faster
`Readable` stream.

The following example pipes all of the data from the `readable` into a file
named `file.txt`:

```js
const fs = require('fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt'.
readable.pipe(writable);
```

It is possible to attach multiple `Writable` streams to a single `Readable`
stream.

The `readable.pipe()` method returns a reference to the *destination* stream,
making it possible to set up chains of piped streams:

```js
const fs = require('fs');
const zlib = require('zlib');
const r = fs.createReadStream('file.txt');
const z = zlib.createGzip();
const w = fs.createWriteStream('file.txt.gz');
r.pipe(z).pipe(w);
```

By default, [`stream.end()`][stream-end] is called on the destination `Writable`
stream when the source `Readable` stream emits [`'end'`][], so that the
destination is no longer writable. To disable this default behavior, the `end`
option can be passed as `false`, causing the destination stream to remain open:

```js
reader.pipe(writer, { end: false });
reader.on('end', () => {
  writer.end('Goodbye\n');
});
```

One important caveat is that if the `Readable` stream emits an error during
processing, the `Writable` destination *is not closed* automatically. If an
error occurs, it will be necessary to *manually* close each stream in order
to prevent memory leaks.

The [`process.stderr`][] and [`process.stdout`][] `Writable` streams are never
closed until the Node.js process exits, regardless of the specified options.

##### readable.read([size])
<!-- YAML
added: v0.9.4
-->

* `size` {number} Optional argument to specify how much data to read.
* Returns: {string|Buffer|null|any}

The `readable.read()` method pulls some data out of the internal buffer and
returns it. If no data is available to be read, `null` is returned. By default,
the data will be returned as a `Buffer` object unless an encoding has been
specified using the `readable.setEncoding()` method or the stream is operating
in object mode.

The optional `size` argument specifies a specific number of bytes to read. If
`size` bytes are not available to be read, `null` will be returned *unless*
the stream has ended, in which case all of the data remaining in the internal
buffer will be returned.

If the `size` argument is not specified, all of the data contained in the
internal buffer will be returned.

The `readable.read()` method should only be called on `Readable` streams
operating in paused mode. In flowing mode, `readable.read()` is called
automatically until the internal buffer is fully drained.

```js
const readable = getReadableStreamSomehow();
readable.on('readable', () => {
  let chunk;
  while (null !== (chunk = readable.read())) {