# Stream

<!--introduced_in=v0.10.0-->

> Stability: 2 - Stable

A stream is an abstract interface for working with streaming data in Node.js.
The `stream` module provides a base API that makes it easy to build objects
that implement the stream interface.

There are many stream objects provided by Node.js. For instance, a
[request to an HTTP server][http-incoming-message] and [`process.stdout`][]
are both stream instances.

Streams can be readable, writable, or both. All streams are instances of
[`EventEmitter`][].

The `stream` module can be accessed using:

```js
const stream = require('stream');
```

While it is important for all Node.js users to understand how streams work,
the `stream` module itself is most useful for developers who are creating new
types of stream instances. Developers who are primarily *consuming* stream
objects will rarely (if ever) need to use the `stream` module directly.

## Organization of this Document

This document is divided into two primary sections with a third section for
additional notes. The first section explains the elements of the stream API that
are required to *use* streams within an application. The second section explains
the elements of the API that are required to *implement* new types of streams.

## Types of Streams

There are four fundamental stream types within Node.js:

* [Readable][] - streams from which data can be read (for example
  [`fs.createReadStream()`][]).
* [Writable][] - streams to which data can be written (for example
  [`fs.createWriteStream()`][]).
* [Duplex][] - streams that are both Readable and Writable (for example
  [`net.Socket`][]).
* [Transform][] - Duplex streams that can modify or transform the data as it
  is written and read (for example [`zlib.createDeflate()`][]).

### Object Mode

All streams created by Node.js APIs operate exclusively on strings and `Buffer`
(or `Uint8Array`) objects. It is possible, however, for stream implementations
to work with other types of JavaScript values (with the exception of `null`,
which serves a special purpose within streams). Such streams are considered to
operate in "object mode".

Stream instances are switched into object mode using the `objectMode` option
when the stream is created. Attempting to switch an existing stream into
object mode is not safe.

### Buffering

<!--type=misc-->

Both [Writable][] and [Readable][] streams will store data in an internal
buffer that can be retrieved using `writable._writableState.getBuffer()` or
`readable._readableState.buffer`, respectively.

The amount of data potentially buffered depends on the `highWaterMark` option
passed into the stream's constructor. For normal streams, the `highWaterMark`
option specifies a [total number of bytes][hwm-gotcha]. For streams operating
in object mode, the `highWaterMark` specifies a total number of objects.

Data is buffered in Readable streams when the implementation calls
[`stream.push(chunk)`][stream-push]. If the consumer of the Stream does not
call [`stream.read()`][stream-read], the data will sit in the internal
queue until it is consumed.

Once the total size of the internal read buffer reaches the threshold specified
by `highWaterMark`, the stream will temporarily stop reading data from the
underlying resource until the data currently buffered can be consumed (that is,
the stream will stop calling the internal `readable._read()` method that is
used to fill the read buffer).

Data is buffered in Writable streams when the
[`writable.write(chunk)`][stream-write] method is called repeatedly. While the
total size of the internal write buffer is below the threshold set by
`highWaterMark`, calls to `writable.write()` will return `true`. Once
the size of the internal buffer reaches or exceeds the `highWaterMark`, `false`
will be returned.

A key goal of the `stream` API, particularly the [`stream.pipe()`][] method,
is to limit the buffering of data to acceptable levels such that sources and
destinations of differing speeds will not overwhelm the available memory.

Because [Duplex][] and [Transform][] streams are both Readable and Writable,
each maintains *two* separate internal buffers used for reading and writing,
allowing each side to operate independently of the other while maintaining an
appropriate and efficient flow of data. For example, [`net.Socket`][] instances
are [Duplex][] streams whose Readable side allows consumption of data received
*from* the socket and whose Writable side allows writing data *to* the socket.
Because data may be written to the socket at a faster or slower rate than data
is received, it is important for each side to operate (and buffer) independently
of the other.

## API for Stream Consumers

<!--type=misc-->

Almost all Node.js applications, no matter how simple, use streams in some
manner. The following is an example of using streams in a Node.js application
that implements an HTTP server:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  // req is an http.IncomingMessage, which is a Readable Stream
  // res is an http.ServerResponse, which is a Writable Stream

  let body = '';
  // Get the data as utf8 strings.
  // If an encoding is not set, Buffer objects will be received.
  req.setEncoding('utf8');

  // Readable streams emit 'data' events once a listener is added
  req.on('data', (chunk) => {
    body += chunk;
  });

  // the end event indicates that the entire body has been received
  req.on('end', () => {
    try {
      const data = JSON.parse(body);
      // write back something interesting to the user:
      res.write(typeof data);
      res.end();
    } catch (er) {
      // uh oh! bad json!
      res.statusCode = 400;
      return res.end(`error: ${er.message}`);
    }
  });
});

server.listen(1337);

// $ curl localhost:1337 -d "{}"
// object
// $ curl localhost:1337 -d "\"foo\""
// string
// $ curl localhost:1337 -d "not json"
// error: Unexpected token o in JSON at position 1
```

[Writable][] streams (such as `res` in the example) expose methods such as
`write()` and `end()` that are used to write data onto the stream.

[Readable][] streams use the [`EventEmitter`][] API for notifying application
code when data is available to be read off the stream. That available data can
be read from the stream in multiple ways.

Both [Writable][] and [Readable][] streams use the [`EventEmitter`][] API in
various ways to communicate the current state of the stream.

[Duplex][] and [Transform][] streams are both [Writable][] and [Readable][].

Applications that are either writing data to or consuming data from a stream
are not required to implement the stream interfaces directly and will generally
have no reason to call `require('stream')`.

Developers wishing to implement new types of streams should refer to the
section [API for Stream Implementers][].

### Writable Streams

Writable streams are an abstraction for a *destination* to which data is
written.

Examples of [Writable][] streams include:

* [HTTP requests, on the client][]
* [HTTP responses, on the server][]
* [fs write streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdin][]
* [`process.stdout`][], [`process.stderr`][]

*Note*: Some of these examples are actually [Duplex][] streams that implement
the [Writable][] interface.

All [Writable][] streams implement the interface defined by the
`stream.Writable` class.

While specific instances of [Writable][] streams may differ in various ways,
all Writable streams follow the same fundamental usage pattern as illustrated
in the example below:

```js
const myStream = getWritableStreamSomehow();
myStream.write('some data');
myStream.write('some more data');
myStream.end('done writing data');
```

#### Class: stream.Writable
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: 'close'
<!-- YAML
added: v0.9.4
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
that no more events will be emitted, and no further computation will occur.

Not all Writable streams will emit the `'close'` event.

##### Event: 'drain'
<!-- YAML
added: v0.9.4
-->

If a call to [`stream.write(chunk)`][stream-write] returns `false`, the
`'drain'` event will be emitted when it is appropriate to resume writing data
to the stream.

```js
// Write the data to the supplied writable stream one million times.
// Be attentive to back-pressure.
function writeOneMillionTimes(writer, data, encoding, callback) {
  let i = 1000000;
  write();
  function write() {
    let ok = true;
    do {
      i--;
      if (i === 0) {
        // last time!
        writer.write(data, encoding, callback);
      } else {
        // see if we should continue, or wait
        // don't pass the callback, because we're not done yet.
        ok = writer.write(data, encoding);
      }
    } while (i > 0 && ok);
    if (i > 0) {
      // had to stop early!
      // write some more once it drains
      writer.once('drain', write);
    }
  }
}
```

##### Event: 'error'
<!-- YAML
added: v0.9.4
-->

* {Error}

The `'error'` event is emitted if an error occurred while writing or piping
data. The listener callback is passed a single `Error` argument when called.

*Note*: The stream is not closed when the `'error'` event is emitted.

##### Event: 'finish'
<!-- YAML
added: v0.9.4
-->

The `'finish'` event is emitted after the [`stream.end()`][stream-end] method
has been called, and all data has been flushed to the underlying system.

```js
const writer = getWritableStreamSomehow();
for (let i = 0; i < 100; i++) {
  writer.write(`hello, #${i}!\n`);
}
writer.end('This is the end\n');
writer.on('finish', () => {
  console.error('All writes are now complete.');
});
```

##### Event: 'pipe'
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} source stream that is piping to this writable

The `'pipe'` event is emitted when the [`stream.pipe()`][] method is called on
a readable stream, adding this writable to its set of destinations.

```js
const assert = require('assert');
const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('pipe', (src) => {
  console.error('something is piping into the writer');
  assert.equal(src, reader);
});
reader.pipe(writer);
```

##### Event: 'unpipe'
<!-- YAML
added: v0.9.4
-->

* `src` {[Readable][] Stream} The source stream that
  [unpiped][`stream.unpipe()`] this writable

The `'unpipe'` event is emitted when the [`stream.unpipe()`][] method is called
on a [Readable][] stream, removing this [Writable][] from its set of
destinations.

```js
const assert = require('assert');
const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('unpipe', (src) => {
  console.error('Something has stopped piping into the writer.');
  assert.equal(src, reader);
});
reader.pipe(writer);
reader.unpipe(writer);
```

##### writable.cork()
<!-- YAML
added: v0.11.2
-->

The `writable.cork()` method forces all written data to be buffered in memory.
The buffered data will be flushed when either the [`stream.uncork()`][] or
[`stream.end()`][stream-end] methods are called.

The primary intent of `writable.cork()` is to avoid a situation in which
writing many small chunks of data to a stream causes a backup in the internal
buffer that would have an adverse impact on performance. In such situations,
implementations that implement the `writable._writev()` method can perform
buffered writes in a more optimized manner.

See also: [`writable.uncork()`][].

##### writable.end([chunk][, encoding][, callback])
<!-- YAML
added: v0.9.4
changes:
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
-->

* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
  not operating in object mode, `chunk` must be a string, `Buffer` or
  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
  other than `null`.
* `encoding` {string} The encoding, if `chunk` is a string
* `callback` {Function} Optional callback for when the stream is finished

Calling the `writable.end()` method signals that no more data will be written
to the [Writable][]. The optional `chunk` and `encoding` arguments allow one
final additional chunk of data to be written immediately before closing the
stream. If provided, the optional `callback` function is attached as a listener
for the [`'finish'`][] event.

Calling the [`stream.write()`][stream-write] method after calling
[`stream.end()`][stream-end] will raise an error.

```js
// write 'hello, ' and then end with 'world!'
const fs = require('fs');
const file = fs.createWriteStream('example.txt');
file.write('hello, ');
file.end('world!');
// writing more now is not allowed!
```

##### writable.setDefaultEncoding(encoding)
<!-- YAML
added: v0.11.15
changes:
  - version: v6.1.0
    pr-url: https://github.com/nodejs/node/pull/5040
    description: This method now returns a reference to `writable`.
-->

* `encoding` {string} The new default encoding
* Returns: `this`

The `writable.setDefaultEncoding()` method sets the default `encoding` for a
[Writable][] stream.

##### writable.uncork()
<!-- YAML
added: v0.11.2
-->

The `writable.uncork()` method flushes all data buffered since
[`stream.cork()`][] was called.

When using [`writable.cork()`][] and `writable.uncork()` to manage the buffering
of writes to a stream, it is recommended that calls to `writable.uncork()` be
deferred using `process.nextTick()`. Doing so allows batching of all
`writable.write()` calls that occur within a given Node.js event loop phase.

```js
stream.cork();
stream.write('some ');
stream.write('data ');
process.nextTick(() => stream.uncork());
```

If the [`writable.cork()`][] method is called multiple times on a stream, the
same number of calls to `writable.uncork()` must be made to flush the buffered
data.

```js
stream.cork();
stream.write('some ');
stream.cork();
stream.write('data ');
process.nextTick(() => {
  stream.uncork();
  // The data will not be flushed until uncork() is called a second time.
  stream.uncork();
});
```

See also: [`writable.cork()`][].

##### writable.write(chunk[, encoding][, callback])
<!-- YAML
added: v0.9.4
changes:
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
  - version: v6.0.0
    pr-url: https://github.com/nodejs/node/pull/6170
    description: Passing `null` as the `chunk` parameter will always be
                 considered invalid now, even in object mode.
-->

* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
  not operating in object mode, `chunk` must be a string, `Buffer` or
  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
  other than `null`.
* `encoding` {string} The encoding, if `chunk` is a string
* `callback` {Function} Callback for when this chunk of data is flushed
* Returns: {boolean} `false` if the stream wishes for the calling code to
  wait for the `'drain'` event to be emitted before continuing to write
  additional data; otherwise `true`.

The `writable.write()` method writes some data to the stream, and calls the
supplied `callback` once the data has been fully handled. If an error
occurs, the `callback` *may or may not* be called with the error as its
first argument. To reliably detect write errors, add a listener for the
`'error'` event.

The return value is `true` if, after admitting `chunk`, the internal buffer is
still smaller than the `highWaterMark` configured when the stream was created.
If `false` is returned, further attempts to write data to the stream should
stop until the [`'drain'`][] event is emitted.

While a stream is not draining, calls to `write()` will buffer `chunk`, and
return false. Once all currently buffered chunks are drained (accepted for
delivery by the operating system), the `'drain'` event will be emitted.
It is recommended that once `write()` returns false, no more chunks be written
until the `'drain'` event is emitted. While calling `write()` on a stream that
is not draining is allowed, Node.js will buffer all written chunks until
maximum memory usage occurs, at which point it will abort unconditionally.
Even before it aborts, high memory usage will cause poor garbage collector
performance and high RSS (which is not typically released back to the system,
even after the memory is no longer required). Since TCP sockets may never
drain if the remote peer does not read the data, writing to a socket that is
not draining may lead to a remotely exploitable vulnerability.

Writing data while the stream is not draining is particularly
problematic for a [Transform][], because `Transform` streams are paused
by default until they are piped or a `'data'` or `'readable'` event handler
is added.

If the data to be written can be generated or fetched on demand, it is
recommended to encapsulate the logic into a [Readable][] and use
[`stream.pipe()`][]. However, if calling `write()` is preferred, it is
possible to respect backpressure and avoid memory issues using the
[`'drain'`][] event:

```js
function write(data, cb) {
  if (!stream.write(data)) {
    stream.once('drain', cb);
  } else {
    process.nextTick(cb);
  }
}

// Wait for cb to be called before doing any other write.
write('hello', () => {
  console.log('write completed, do more writes now');
});
```

A Writable stream in object mode will always ignore the `encoding` argument.

##### writable.destroy([error])
<!-- YAML
added: v8.0.0
-->

* Returns: `this`

Destroy the stream, and emit the passed error. After this call, the
writable stream has ended. Implementors should not override this method,
but instead implement [`writable._destroy`][writable-_destroy].

### Readable Streams

Readable streams are an abstraction for a *source* from which data is
consumed.

Examples of Readable streams include:

* [HTTP responses, on the client][http-incoming-message]
* [HTTP requests, on the server][http-incoming-message]
* [fs read streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdout and stderr][]
* [`process.stdin`][]

All [Readable][] streams implement the interface defined by the
`stream.Readable` class.

#### Two Modes

Readable streams effectively operate in one of two modes: flowing and paused.

When in flowing mode, data is read from the underlying system automatically
and provided to an application as quickly as possible using events via the
[`EventEmitter`][] interface.

In paused mode, the [`stream.read()`][stream-read] method must be called
explicitly to read chunks of data from the stream.

All [Readable][] streams begin in paused mode but can be switched to flowing
mode in one of the following ways:

* Adding a [`'data'`][] event handler.
* Calling the [`stream.resume()`][stream-resume] method.
* Calling the [`stream.pipe()`][] method to send the data to a [Writable][].

The Readable can switch back to paused mode using one of the following:

* If there are no pipe destinations, by calling the
  [`stream.pause()`][stream-pause] method.
* If there are pipe destinations, by removing any [`'data'`][] event
  handlers, and removing all pipe destinations by calling the
  [`stream.unpipe()`][] method.

The important concept to remember is that a Readable will not generate data
until a mechanism for either consuming or ignoring that data is provided. If
the consuming mechanism is disabled or taken away, the Readable will *attempt*
to stop generating the data.

*Note*: For backwards compatibility reasons, removing [`'data'`][] event
handlers will **not** automatically pause the stream. Also, if there are piped
destinations, then calling [`stream.pause()`][stream-pause] will not guarantee
that the stream will *remain* paused once those destinations drain and ask for
more data.

*Note*: If a [Readable][] is switched into flowing mode and there are no
consumers available to handle the data, that data will be lost. This can occur,
for instance, when the `readable.resume()` method is called without a listener
attached to the `'data'` event, or when a `'data'` event handler is removed
from the stream.
587
588
#### Three States
589
590
The "two modes" of operation for a Readable stream are a simplified abstraction
591
for the more complicated internal state management that is happening within the
592
Readable stream implementation.
593
594
Specifically, at any given point in time, every Readable is in one of three
595
possible states:
596
597
* `readable._readableState.flowing = null`
598
* `readable._readableState.flowing = false`
599
* `readable._readableState.flowing = true`
600
601
When `readable._readableState.flowing` is `null`, no mechanism for consuming the
602
streams data is provided so the stream will not generate its data. While in this
603
state, attaching a listener for the `'data'` event, calling the `readable.pipe()`
604
method, or calling the `readable.resume()` method will switch
605
`readable._readableState.flowing` to `true`, causing the Readable to begin
606
actively emitting events as data is generated.
607
608
Calling `readable.pause()`, `readable.unpipe()`, or receiving "back pressure"
609
will cause the `readable._readableState.flowing` to be set as `false`,
610
temporarily halting the flowing of events but *not* halting the generation of
611
data. While in this state, attaching a listener for the `'data'` event
612
would not cause `readable._readableState.flowing` to switch to `true`.
613
614
```js
615
const { PassThrough, Writable } = require('stream');
616
const pass = new PassThrough();
617
const writable = new Writable();
618
619
pass.pipe(writable);
620
pass.unpipe(writable);
621
// flowing is now false
622
623
pass.on('data', (chunk) => { console.log(chunk.toString()); });
624
pass.write('ok'); // will not emit 'data'
625
pass.resume(); // must be called to make 'data' being emitted
626
```
627
628
While `readable._readableState.flowing` is `false`, data may be accumulating
629
within the streams internal buffer.
630
631
#### Choose One
632
633
The Readable stream API evolved across multiple Node.js versions and provides
634
multiple methods of consuming stream data. In general, developers should choose
635
*one* of the methods of consuming data and *should never* use multiple methods
636
to consume data from a single stream.
637
638
Use of the `readable.pipe()` method is recommended for most users as it has been
639
implemented to provide the easiest way of consuming stream data. Developers that
640
require more fine-grained control over the transfer and generation of data can
641
use the [`EventEmitter`][] and `readable.pause()`/`readable.resume()` APIs.
642
643
#### Class: stream.Readable
644
<!-- YAML
645
added: v0.9.4
646
-->
647
648
<!--type=class-->
649
650
##### Event: 'close'
651
<!-- YAML
652
added: v0.9.4
653
-->
654
655
The `'close'` event is emitted when the stream and any of its underlying
656
resources (a file descriptor, for example) have been closed. The event indicates
657
that no more events will be emitted, and no further computation will occur.
658
659
Not all [Readable][] streams will emit the `'close'` event.
660
661
##### Event: 'data'
662
<!-- YAML
663
added: v0.9.4
664
-->
665
666
* `chunk` {Buffer|string|any} The chunk of data. For streams that are not
667
operating in object mode, the chunk will be either a string or `Buffer`.
668
For streams that are in object mode, the chunk can be any JavaScript value
669
other than `null`.
670
671
The `'data'` event is emitted whenever the stream is relinquishing ownership of
672
a chunk of data to a consumer. This may occur whenever the stream is switched
673
in flowing mode by calling `readable.pipe()`, `readable.resume()`, or by
674
attaching a listener callback to the `'data'` event. The `'data'` event will
675
also be emitted whenever the `readable.read()` method is called and a chunk of
676
data is available to be returned.
677
678
Attaching a `'data'` event listener to a stream that has not been explicitly
679
paused will switch the stream into flowing mode. Data will then be passed as
680
soon as it is available.
681
682
The listener callback will be passed the chunk of data as a string if a default
683
encoding has been specified for the stream using the
684
`readable.setEncoding()` method; otherwise the data will be passed as a
685
`Buffer`.
686
687
```js
688
const readable = getReadableStreamSomehow();
689
readable.on('data', (chunk) => {
690
console.log(`Received ${chunk.length} bytes of data.`);
691
});
692
```
693
694
##### Event: 'end'
695
<!-- YAML
696
added: v0.9.4
697
-->
698
699
The `'end'` event is emitted when there is no more data to be consumed from
700
the stream.
701
702
*Note*: The `'end'` event **will not be emitted** unless the data is
703
completely consumed. This can be accomplished by switching the stream into
704
flowing mode, or by calling [`stream.read()`][stream-read] repeatedly until
705
all data has been consumed.
706
707
```js
708
const readable = getReadableStreamSomehow();
709
readable.on('data', (chunk) => {
710
console.log(`Received ${chunk.length} bytes of data.`);
711
});
712
readable.on('end', () => {
713
console.log('There will be no more data.');
714
});
715
```
716
717
##### Event: 'error'
718
<!-- YAML
719
added: v0.9.4
720
-->
721
722
* {Error}
723
724
The `'error'` event may be emitted by a Readable implementation at any time.
725
Typically, this may occur if the underlying stream is unable to generate data
726
due to an underlying internal failure, or when a stream implementation attempts
727
to push an invalid chunk of data.
728
729
The listener callback will be passed a single `Error` object.
730
731
##### Event: 'readable'
732
<!-- YAML
733
added: v0.9.4
734
-->
735
736
The `'readable'` event is emitted when there is data available to be read from
737
the stream. In some cases, attaching a listener for the `'readable'` event will
738
cause some amount of data to be read into an internal buffer.
739
740
```javascript
741
const readable = getReadableStreamSomehow();
742
readable.on('readable', () => {
743
// there is some data to read now
744
});
745
```
746
The `'readable'` event will also be emitted once the end of the stream data
747
has been reached but before the `'end'` event is emitted.
748
749
Effectively, the `'readable'` event indicates that the stream has new
750
information: either new data is available or the end of the stream has been
751
reached. In the former case, [`stream.read()`][stream-read] will return the
752
available data. In the latter case, [`stream.read()`][stream-read] will return
753
`null`. For instance, in the following example, `foo.txt` is an empty file:
754
755
```js
756
const fs = require('fs');
757
const rr = fs.createReadStream('foo.txt');
758
rr.on('readable', () => {
759
console.log('readable:', rr.read());
760
});
761
rr.on('end', () => {
762
console.log('end');
763
});
764
```
765
766
The output of running this script is:
767
768
```txt
769
$ node test.js
770
readable: null
771
end
772
```
773
774
*Note*: In general, the `readable.pipe()` and `'data'` event mechanisms are
775
easier to understand than the `'readable'` event.
776
However, handling `'readable'` might result in increased throughput.
777
778
##### readable.isPaused()
779
<!-- YAML
780
added: v0.11.14
781
-->
782
783
* Returns: {boolean}
784
785
The `readable.isPaused()` method returns the current operating state of the
786
Readable. This is used primarily by the mechanism that underlies the
787
`readable.pipe()` method. In most typical cases, there will be no reason to
788
use this method directly.
789
790
```js
791
const readable = new stream.Readable();
792
793
readable.isPaused(); // === false
794
readable.pause();
795
readable.isPaused(); // === true
796
readable.resume();
797
readable.isPaused(); // === false
798
```
799
800
##### readable.pause()
801
<!-- YAML
802
added: v0.9.4
803
-->
804
805
* Returns: `this`
806
807
The `readable.pause()` method will cause a stream in flowing mode to stop
808
emitting [`'data'`][] events, switching out of flowing mode. Any data that
809
becomes available will remain in the internal buffer.
810
811
```js
812
const readable = getReadableStreamSomehow();
813
readable.on('data', (chunk) => {
814
console.log(`Received ${chunk.length} bytes of data.`);
815
readable.pause();
816
console.log('There will be no additional data for 1 second.');
817
setTimeout(() => {
818
console.log('Now data will start flowing again.');
819
readable.resume();
820
}, 1000);
821
});
822
```
823
824
##### readable.pipe(destination[, options])
825
<!-- YAML
826
added: v0.9.4
827
-->
828
829
* `destination` {stream.Writable} The destination for writing data
830
* `options` {Object} Pipe options
831
* `end` {boolean} End the writer when the reader ends. Defaults to `true`.
832
833
The `readable.pipe()` method attaches a [Writable][] stream to the `readable`,
834
causing it to switch automatically into flowing mode and push all of its data
835
to the attached [Writable][]. The flow of data will be automatically managed so
836
that the destination Writable stream is not overwhelmed by a faster Readable
837
stream.
838
839
The following example pipes all of the data from the `readable` into a file
840
named `file.txt`:
841
842
```js
843
const readable = getReadableStreamSomehow();
844
const writable = fs.createWriteStream('file.txt');
845
// All the data from readable goes into 'file.txt'
846
readable.pipe(writable);
847
```
848
It is possible to attach multiple Writable streams to a single Readable stream.

The `readable.pipe()` method returns a reference to the *destination* stream
making it possible to set up chains of piped streams:

```js
const r = fs.createReadStream('file.txt');
const z = zlib.createGzip();
const w = fs.createWriteStream('file.txt.gz');
r.pipe(z).pipe(w);
```

By default, [`stream.end()`][stream-end] is called on the destination Writable
stream when the source Readable stream emits [`'end'`][], so that the
destination is no longer writable. To disable this default behavior, the `end`
option can be passed as `false`, causing the destination stream to remain open,
as illustrated in the following example:

```js
reader.pipe(writer, { end: false });
reader.on('end', () => {
  writer.end('Goodbye\n');
});
```

One important caveat is that if the Readable stream emits an error during
processing, the Writable destination *is not closed* automatically. If an
error occurs, it will be necessary to *manually* close each stream in order
to prevent memory leaks.

*Note*: The [`process.stderr`][] and [`process.stdout`][] Writable streams are
never closed until the Node.js process exits, regardless of the specified
options.

##### readable.read([size])
<!-- YAML
added: v0.9.4
-->

* `size` {number} Optional argument to specify how much data to read.
* Returns: {string|Buffer|null}

The `readable.read()` method pulls some data out of the internal buffer and
returns it. If no data is available to be read, `null` is returned. By default,
the data will be returned as a `Buffer` object unless an encoding has been
specified using the `readable.setEncoding()` method or the stream is operating
in object mode.

The optional `size` argument specifies a specific number of bytes to read. If
`size` bytes are not available to be read, `null` will be returned *unless*
the stream has ended, in which case all of the data remaining in the internal
buffer will be returned.

If the `size` argument is not specified, all of the data contained in the
internal buffer will be returned.

The `readable.read()` method should only be called on Readable streams operating
in paused mode. In flowing mode, `readable.read()` is called automatically until
the internal buffer is fully drained.

```js
const readable = getReadableStreamSomehow();
readable.on('readable', () => {
  let chunk;
  while (null !== (chunk = readable.read())) {
    console.log(`Received ${chunk.length} bytes of data.`);
  }
});
```

In general, it is recommended that developers avoid the use of the `'readable'`
event and the `readable.read()` method in favor of using either
`readable.pipe()` or the `'data'` event.

A Readable stream in object mode will always return a single item from
a call to [`readable.read(size)`][stream-read], regardless of the value of the
`size` argument.

*Note*: If the `readable.read()` method returns a chunk of data, a `'data'`
event will also be emitted.

*Note*: Calling [`stream.read([size])`][stream-read] after the [`'end'`][]
event has been emitted will return `null`. No runtime error will be raised.

##### readable.resume()
<!-- YAML
added: v0.9.4
-->

* Returns: `this`

The `readable.resume()` method causes an explicitly paused Readable stream to
resume emitting [`'data'`][] events, switching the stream into flowing mode.

The `readable.resume()` method can be used to fully consume the data from a
stream without actually processing any of that data, as illustrated in the
following example:

```js
getReadableStreamSomehow()
  .resume()
  .on('end', () => {
    console.log('Reached the end, but did not read anything.');
  });
```

##### readable.setEncoding(encoding)
<!-- YAML
added: v0.9.4
-->

* `encoding` {string} The encoding to use.
* Returns: `this`

The `readable.setEncoding()` method sets the character encoding for
data read from the Readable stream.

By default, no encoding is assigned and stream data will be returned as
`Buffer` objects. Setting an encoding causes the stream data
to be returned as strings of the specified encoding rather than as `Buffer`
objects. For instance, calling `readable.setEncoding('utf8')` will cause the
output data to be interpreted as UTF-8 data, and passed as strings. Calling
`readable.setEncoding('hex')` will cause the data to be encoded in hexadecimal
string format.

The Readable stream will properly handle multi-byte characters delivered through
the stream that would otherwise become improperly decoded if simply pulled from
the stream as `Buffer` objects.

```js
const readable = getReadableStreamSomehow();
readable.setEncoding('utf8');
readable.on('data', (chunk) => {
  assert.equal(typeof chunk, 'string');
  console.log('got %d characters of string data', chunk.length);
});
```

##### readable.unpipe([destination])
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} Optional specific stream to unpipe

The `readable.unpipe()` method detaches a Writable stream previously attached
using the [`stream.pipe()`][] method.

If the `destination` is not specified, then *all* pipes are detached.

If the `destination` is specified, but no pipe is set up for it, then
the method does nothing.