# Stream

> Stability: 2 - Stable

A stream is an abstract interface for working with streaming data in Node.js.
The `stream` module provides a base API that makes it easy to build objects
that implement the stream interface.

There are many stream objects provided by Node.js. For instance, a
[request to an HTTP server][http-incoming-message] and [`process.stdout`][]
are both stream instances.

Streams can be readable, writable, or both. All streams are instances of
[`EventEmitter`][].

The `stream` module can be accessed using:

```js
const stream = require('stream');
```

While it is important for all Node.js users to understand how streams work,
the `stream` module itself is most useful for developers that are creating new
types of stream instances. Developers who are primarily *consuming* stream
objects will rarely (if ever) need to use the `stream` module directly.

## Organization of this Document

This document is divided into two primary sections with a third section for
additional notes. The first section explains the elements of the stream API that
are required to *use* streams within an application. The second section explains
the elements of the API that are required to *implement* new types of streams.

## Types of Streams

There are four fundamental stream types within Node.js:

* [Readable][] - streams from which data can be read (for example
  [`fs.createReadStream()`][]).
* [Writable][] - streams to which data can be written (for example
  [`fs.createWriteStream()`][]).
* [Duplex][] - streams that are both Readable and Writable (for example
  [`net.Socket`][]).
* [Transform][] - Duplex streams that can modify or transform the data as it
  is written and read (for example [`zlib.createDeflate()`][]).

### Object Mode

All streams created by Node.js APIs operate exclusively on strings and `Buffer`
objects. It is possible, however, for stream implementations to work with other
types of JavaScript values (with the exception of `null`, which serves a special
purpose within streams). Such streams are considered to operate in "object
mode".

Stream instances are switched into object mode using the `objectMode` option
when the stream is created. Attempting to switch an existing stream into
object mode is not safe.

### Buffering

<!--type=misc-->

Both [Writable][] and [Readable][] streams will store data in an internal
buffer that can be retrieved using `writable._writableState.getBuffer()` or
`readable._readableState.buffer`, respectively.

The amount of data potentially buffered depends on the `highWaterMark` option
passed into the stream's constructor. For normal streams, the `highWaterMark`
option specifies a total number of bytes. For streams operating in object mode,
the `highWaterMark` specifies a total number of objects.

Data is buffered in Readable streams when the implementation calls
[`stream.push(chunk)`][stream-push]. If the consumer of the Stream does not
call [`stream.read()`][stream-read], the data will sit in the internal
queue until it is consumed.

Once the total size of the internal read buffer reaches the threshold specified
by `highWaterMark`, the stream will temporarily stop reading data from the
underlying resource until the data currently buffered can be consumed (that is,
the stream will stop calling the internal `readable._read()` method that is
used to fill the read buffer).

Data is buffered in Writable streams when the
[`writable.write(chunk)`][stream-write] method is called repeatedly. While the
total size of the internal write buffer is below the threshold set by
`highWaterMark`, calls to `writable.write()` will return `true`. Once
the size of the internal buffer reaches or exceeds the `highWaterMark`, `false`
will be returned.

A key goal of the `stream` API, particularly the [`stream.pipe()`][] method,
is to limit the buffering of data to acceptable levels such that sources and
destinations of differing speeds will not overwhelm the available memory.

Because [Duplex][] and [Transform][] streams are both Readable and Writable,
each maintains *two* separate internal buffers used for reading and writing,
allowing each side to operate independently of the other while maintaining an
appropriate and efficient flow of data. For example, [`net.Socket`][] instances
are [Duplex][] streams whose Readable side allows consumption of data received
*from* the socket and whose Writable side allows writing data *to* the socket.
Because data may be written to the socket at a faster or slower rate than data
is received, it is important for each side to operate (and buffer) independently
of the other.

## API for Stream Consumers

<!--type=misc-->

Almost all Node.js applications, no matter how simple, use streams in some
manner. The following is an example of using streams in a Node.js application
that implements an HTTP server:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  // req is an http.IncomingMessage, which is a Readable Stream
  // res is an http.ServerResponse, which is a Writable Stream

  let body = '';
  // Get the data as utf8 strings.
  // If an encoding is not set, Buffer objects will be received.
  req.setEncoding('utf8');

  // Readable streams emit 'data' events once a listener is added
  req.on('data', (chunk) => {
    body += chunk;
  });

  // the end event indicates that the entire body has been received
  req.on('end', () => {
    try {
      const data = JSON.parse(body);
      // write back something interesting to the user:
      res.write(typeof data);
      res.end();
    } catch (er) {
      // uh oh! bad json!
      res.statusCode = 400;
      return res.end(`error: ${er.message}`);
    }
  });
});

server.listen(1337);

// $ curl localhost:1337 -d '{}'
// object
// $ curl localhost:1337 -d '"foo"'
// string
// $ curl localhost:1337 -d 'not json'
// error: Unexpected token o
```

[Writable][] streams (such as `res` in the example) expose methods such as
`write()` and `end()` that are used to write data onto the stream.

[Readable][] streams use the [`EventEmitter`][] API for notifying application
code when data is available to be read off the stream. That available data can
be read from the stream in multiple ways.

Both [Writable][] and [Readable][] streams use the [`EventEmitter`][] API in
various ways to communicate the current state of the stream.

[Duplex][] and [Transform][] streams are both [Writable][] and [Readable][].

Applications that are either writing data to or consuming data from a stream
are not required to implement the stream interfaces directly and will generally
have no reason to call `require('stream')`.

Developers wishing to implement new types of streams should refer to the
section [API for Stream Implementers][].

### Writable Streams

Writable streams are an abstraction for a *destination* to which data is
written.

Examples of [Writable][] streams include:

* [HTTP requests, on the client][]
* [HTTP responses, on the server][]
* [fs write streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdin][]
* [`process.stdout`][], [`process.stderr`][]

*Note*: Some of these examples are actually [Duplex][] streams that implement
the [Writable][] interface.

All [Writable][] streams implement the interface defined by the
`stream.Writable` class.

While specific instances of [Writable][] streams may differ in various ways,
all Writable streams follow the same fundamental usage pattern as illustrated
in the example below:

```js
const myStream = getWritableStreamSomehow();
myStream.write('some data');
myStream.write('some more data');
myStream.end('done writing data');
```

#### Class: stream.Writable
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: 'close'
<!-- YAML
added: v0.9.4
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
that no more events will be emitted, and no further computation will occur.

Not all Writable streams will emit the `'close'` event.

##### Event: 'drain'
<!-- YAML
added: v0.9.4
-->

If a call to [`stream.write(chunk)`][stream-write] returns `false`, the
`'drain'` event will be emitted when it is appropriate to resume writing data
to the stream.

```js
// Write the data to the supplied writable stream one million times.
// Be attentive to back-pressure.
function writeOneMillionTimes(writer, data, encoding, callback) {
  let i = 1000000;
  write();
  function write() {
    let ok = true;
    do {
      i--;
      if (i === 0) {
        // last time!
        writer.write(data, encoding, callback);
      } else {
        // see if we should continue, or wait
        // don't pass the callback, because we're not done yet.
        ok = writer.write(data, encoding);
      }
    } while (i > 0 && ok);
    if (i > 0) {
      // had to stop early!
      // write some more once it drains
      writer.once('drain', write);
    }
  }
}
```

##### Event: 'error'
<!-- YAML
added: v0.9.4
-->

* {Error}

The `'error'` event is emitted if an error occurred while writing or piping
data. The listener callback is passed a single `Error` argument when called.

*Note*: The stream is not closed when the `'error'` event is emitted.

##### Event: 'finish'
<!-- YAML
added: v0.9.4
-->

The `'finish'` event is emitted after the [`stream.end()`][stream-end] method
has been called, and all data has been flushed to the underlying system.

```js
const writer = getWritableStreamSomehow();
for (let i = 0; i < 100; i++) {
  writer.write(`hello, #${i}!\n`);
}
writer.end('This is the end\n');
writer.on('finish', () => {
  console.error('All writes are now complete.');
});
```

##### Event: 'pipe'
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} source stream that is piping to this writable

The `'pipe'` event is emitted when the [`stream.pipe()`][] method is called on
a readable stream, adding this writable to its set of destinations.

```js
const assert = require('assert');

const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('pipe', (src) => {
  console.error('something is piping into the writer');
  assert.equal(src, reader);
});
reader.pipe(writer);
```

##### Event: 'unpipe'
<!-- YAML
added: v0.9.4
-->

* `src` {[Readable][] Stream} The source stream that
  [unpiped][`stream.unpipe()`] this writable

The `'unpipe'` event is emitted when the [`stream.unpipe()`][] method is called
on a [Readable][] stream, removing this [Writable][] from its set of
destinations.

```js
const assert = require('assert');

const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('unpipe', (src) => {
  console.error('Something has stopped piping into the writer.');
  assert.equal(src, reader);
});
reader.pipe(writer);
reader.unpipe(writer);
```

##### writable.cork()
<!-- YAML
added: v0.11.2
-->

The `writable.cork()` method forces all written data to be buffered in memory.
The buffered data will be flushed when either the [`stream.uncork()`][] or
[`stream.end()`][stream-end] methods are called.

The primary intent of `writable.cork()` is to accommodate a situation in which
many small chunks of data are written to a stream in rapid succession; rather
than forwarding each chunk to the underlying destination immediately (which
would have an adverse impact on performance), the chunks are buffered until
the stream is uncorked. In such situations, implementations that implement the
`writable._writev()` method can perform buffered writes in a more optimized
manner.

See also: [`writable.uncork()`][].

##### writable.end([chunk][, encoding][, callback])
<!-- YAML
added: v0.9.4
-->

* `chunk` {string|Buffer|any} Optional data to write. For streams not operating
  in object mode, `chunk` must be a string or a `Buffer`. For object mode
  streams, `chunk` may be any JavaScript value other than `null`.
* `encoding` {string} The encoding, if `chunk` is a String
* `callback` {Function} Optional callback for when the stream is finished

Calling the `writable.end()` method signals that no more data will be written
to the [Writable][]. The optional `chunk` and `encoding` arguments allow one
final additional chunk of data to be written immediately before closing the
stream. If provided, the optional `callback` function is attached as a listener
for the [`'finish'`][] event.

Calling the [`stream.write()`][stream-write] method after calling
[`stream.end()`][stream-end] will raise an error.

```js
// write 'hello, ' and then end with 'world!'
const fs = require('fs');
const file = fs.createWriteStream('example.txt');
file.write('hello, ');
file.end('world!');
// writing more now is not allowed!
```

##### writable.setDefaultEncoding(encoding)
<!-- YAML
added: v0.11.15
changes:
  - version: v6.1.0
    pr-url: https://github.com/nodejs/node/pull/5040
    description: This method now returns a reference to `writable`.
-->

* `encoding` {string} The new default encoding
* Returns: `this`

The `writable.setDefaultEncoding()` method sets the default `encoding` for a
[Writable][] stream.

##### writable.uncork()
<!-- YAML
added: v0.11.2
-->

The `writable.uncork()` method flushes all data buffered since
[`stream.cork()`][] was called.

When using [`writable.cork()`][] and `writable.uncork()` to manage the buffering
of writes to a stream, it is recommended that calls to `writable.uncork()` be
deferred using `process.nextTick()`. Doing so allows batching of all
`writable.write()` calls that occur within a given Node.js event loop phase.

```js
stream.cork();
stream.write('some ');
stream.write('data ');
process.nextTick(() => stream.uncork());
```

If the [`writable.cork()`][] method is called multiple times on a stream, the
same number of calls to `writable.uncork()` must be made to flush the buffered
data.

```js
stream.cork();
stream.write('some ');
stream.cork();
stream.write('data ');
process.nextTick(() => {
  stream.uncork();
  // The data will not be flushed until uncork() is called a second time.
  stream.uncork();
});
```

See also: [`writable.cork()`][].

##### writable.write(chunk[, encoding][, callback])
<!-- YAML
added: v0.9.4
changes:
  - version: v6.0.0
    pr-url: https://github.com/nodejs/node/pull/6170
    description: Passing `null` as the `chunk` parameter will always be
                 considered invalid now, even in object mode.
-->

* `chunk` {string|Buffer} The data to write
* `encoding` {string} The encoding, if `chunk` is a String
* `callback` {Function} Callback for when this chunk of data is flushed
* Returns: {boolean} `false` if the stream wishes for the calling code to
  wait for the `'drain'` event to be emitted before continuing to write
  additional data; otherwise `true`.

The `writable.write()` method writes some data to the stream, and calls the
supplied `callback` once the data has been fully handled. If an error
occurs, the `callback` *may or may not* be called with the error as its
first argument. To reliably detect write errors, add a listener for the
`'error'` event.

The return value is `true` if the internal buffer, after admitting `chunk`, is
smaller than the `highWaterMark` configured when the stream was created.
If `false` is returned, further attempts to write data to the stream should
stop until the [`'drain'`][] event is emitted.

While a stream is not draining, calls to `write()` will buffer `chunk`, and
return `false`. Once all currently buffered chunks are drained (accepted for
delivery by the operating system), the `'drain'` event will be emitted.
It is recommended that once `write()` returns `false`, no more chunks be written
until the `'drain'` event is emitted. While calling `write()` on a stream that
is not draining is allowed, Node.js will buffer all written chunks until
maximum memory usage occurs, at which point it will abort unconditionally.
Even before it aborts, high memory usage will cause poor garbage collector
performance and high RSS (which is not typically released back to the system,
even after the memory is no longer required). Since TCP sockets may never
drain if the remote peer does not read the data, writing to a socket that is
not draining may lead to a remotely exploitable vulnerability.

Writing data while the stream is not draining is particularly
problematic for a [Transform][], because the `Transform` streams are paused
by default until they are piped or a `'data'` or `'readable'` event handler
is added.

If the data to be written can be generated or fetched on demand, it is
recommended to encapsulate the logic into a [Readable][] and use
[`stream.pipe()`][]. However, if calling `write()` is preferred, it is
possible to respect backpressure and avoid memory issues using the
[`'drain'`][] event:

```js
function write(data, cb) {
  if (!stream.write(data)) {
    stream.once('drain', cb);
  } else {
    process.nextTick(cb);
  }
}

// Wait for cb to be called before doing any other write.
write('hello', () => {
  console.log('write completed, do more writes now');
});
```

A Writable stream in object mode will always ignore the `encoding` argument.

### Readable Streams

Readable streams are an abstraction for a *source* from which data is
consumed.

Examples of Readable streams include:

* [HTTP responses, on the client][http-incoming-message]
* [HTTP requests, on the server][http-incoming-message]
* [fs read streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdout and stderr][]
* [`process.stdin`][]

All [Readable][] streams implement the interface defined by the
`stream.Readable` class.

#### Two Modes

Readable streams effectively operate in one of two modes: flowing and paused.

When in flowing mode, data is read from the underlying system automatically
and provided to an application as quickly as possible using events via the
[`EventEmitter`][] interface.

In paused mode, the [`stream.read()`][stream-read] method must be called
explicitly to read chunks of data from the stream.

All [Readable][] streams begin in paused mode but can be switched to flowing
mode in one of the following ways:

* Adding a [`'data'`][] event handler.
* Calling the [`stream.resume()`][stream-resume] method.
* Calling the [`stream.pipe()`][] method to send the data to a [Writable][].

The Readable can switch back to paused mode using one of the following:

* If there are no pipe destinations, by calling the
  [`stream.pause()`][stream-pause] method.
* If there are pipe destinations, by removing any [`'data'`][] event
  handlers, and removing all pipe destinations by calling the
  [`stream.unpipe()`][] method.

The important concept to remember is that a Readable will not generate data
until a mechanism for either consuming or ignoring that data is provided. If
the consuming mechanism is disabled or taken away, the Readable will *attempt*
to stop generating the data.

*Note*: For backwards compatibility reasons, removing [`'data'`][] event
handlers will **not** automatically pause the stream. Also, if there are piped
destinations, then calling [`stream.pause()`][stream-pause] will not guarantee
that the stream will *remain* paused once those destinations drain and ask for
more data.

*Note*: If a [Readable][] is switched into flowing mode and there are no
consumers available to handle the data, that data will be lost. This can occur,
for instance, when the `readable.resume()` method is called without a listener
attached to the `'data'` event, or when a `'data'` event handler is removed
from the stream.

#### Three States

The "two modes" of operation for a Readable stream are a simplified abstraction
for the more complicated internal state management that is happening within the
Readable stream implementation.

Specifically, at any given point in time, every Readable is in one of three
possible states:

* `readable._readableState.flowing = null`
* `readable._readableState.flowing = false`
* `readable._readableState.flowing = true`

When `readable._readableState.flowing` is `null`, no mechanism for consuming the
stream's data is provided, so the stream will not generate its data.

Attaching a listener for the `'data'` event, calling the `readable.pipe()`
method, or calling the `readable.resume()` method will switch
`readable._readableState.flowing` to `true`, causing the Readable to begin
actively emitting events as data is generated.

Calling `readable.pause()`, `readable.unpipe()`, or receiving "back pressure"
will cause the `readable._readableState.flowing` to be set as `false`,
temporarily halting the flowing of events but *not* halting the generation of
data.

While `readable._readableState.flowing` is `false`, data may be accumulating
within the stream's internal buffer.

#### Choose One

The Readable stream API evolved across multiple Node.js versions and provides
multiple methods of consuming stream data. In general, developers should choose
*one* of the methods of consuming data and *should never* use multiple methods
to consume data from a single stream.

Use of the `readable.pipe()` method is recommended for most users as it has been
implemented to provide the easiest way of consuming stream data. Developers that
require more fine-grained control over the transfer and generation of data can
use the [`EventEmitter`][] and `readable.pause()`/`readable.resume()` APIs.

#### Class: stream.Readable
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: 'close'
<!-- YAML
added: v0.9.4
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
that no more events will be emitted, and no further computation will occur.

Not all [Readable][] streams will emit the `'close'` event.

##### Event: 'data'
<!-- YAML
added: v0.9.4
-->

* `chunk` {Buffer|string|any} The chunk of data. For streams that are not
  operating in object mode, the chunk will be either a string or `Buffer`.
  For streams that are in object mode, the chunk can be any JavaScript value
  other than `null`.

The `'data'` event is emitted whenever the stream is relinquishing ownership of
a chunk of data to a consumer. This may occur whenever the stream is switched
into flowing mode by calling `readable.pipe()`, `readable.resume()`, or by
attaching a listener callback to the `'data'` event. The `'data'` event will
also be emitted whenever the `readable.read()` method is called and a chunk of
data is available to be returned.

Attaching a `'data'` event listener to a stream that has not been explicitly
paused will switch the stream into flowing mode. Data will then be passed as
soon as it is available.

The listener callback will be passed the chunk of data as a string if a default
encoding has been specified for the stream using the
`readable.setEncoding()` method; otherwise the data will be passed as a
`Buffer`.

```js
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});
```

##### Event: 'end'
<!-- YAML
added: v0.9.4
-->

The `'end'` event is emitted when there is no more data to be consumed from
the stream.

*Note*: The `'end'` event **will not be emitted** unless the data is
completely consumed. This can be accomplished by switching the stream into
flowing mode, or by calling [`stream.read()`][stream-read] repeatedly until
all data has been consumed.

```js
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});
readable.on('end', () => {
  console.log('There will be no more data.');
});
```

##### Event: 'error'
<!-- YAML
added: v0.9.4
-->

* {Error}

The `'error'` event may be emitted by a Readable implementation at any time.
Typically, this may occur if the underlying stream is unable to generate data
due to an underlying internal failure, or when a stream implementation attempts
to push an invalid chunk of data.

The listener callback will be passed a single `Error` object.

##### Event: 'readable'
<!-- YAML
added: v0.9.4
-->

The `'readable'` event is emitted when there is data available to be read from
the stream. In some cases, attaching a listener for the `'readable'` event will
cause some amount of data to be read into an internal buffer.

```js
const readable = getReadableStreamSomehow();
readable.on('readable', () => {
  // there is some data to read now
});
```

The `'readable'` event will also be emitted once the end of the stream data
has been reached but before the `'end'` event is emitted.

Effectively, the `'readable'` event indicates that the stream has new
information: either new data is available or the end of the stream has been
reached. In the former case, [`stream.read()`][stream-read] will return the
available data. In the latter case, [`stream.read()`][stream-read] will return
`null`. For instance, in the following example, `foo.txt` is an empty file:

```js
const fs = require('fs');
const rr = fs.createReadStream('foo.txt');
rr.on('readable', () => {
  console.log('readable:', rr.read());
});
rr.on('end', () => {
  console.log('end');
});
```

The output of running this script is:

```txt
$ node test.js
readable: null
end
```

*Note*: In general, the `readable.pipe()` and `'data'` event mechanisms are
preferred over the use of the `'readable'` event.

##### readable.isPaused()
<!-- YAML
added: v0.11.14
-->

* Returns: {boolean}

The `readable.isPaused()` method returns the current operating state of the
Readable. This is used primarily by the mechanism that underlies the
`readable.pipe()` method. In most typical cases, there will be no reason to
use this method directly.

```js
const readable = new stream.Readable();

readable.isPaused(); // === false
readable.pause();
readable.isPaused(); // === true
readable.resume();
readable.isPaused(); // === false
```

##### readable.pause()
<!-- YAML
added: v0.9.4
-->

* Returns: `this`

The `readable.pause()` method will cause a stream in flowing mode to stop
emitting [`'data'`][] events, switching out of flowing mode. Any data that
becomes available will remain in the internal buffer.

```js
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
  readable.pause();
  console.log('There will be no additional data for 1 second.');
  setTimeout(() => {
    console.log('Now data will start flowing again.');
    readable.resume();
  }, 1000);
});
```

##### readable.pipe(destination[, options])
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} The destination for writing data
* `options` {Object} Pipe options
  * `end` {boolean} End the writer when the reader ends. Defaults to `true`.

The `readable.pipe()` method attaches a [Writable][] stream to the `readable`,
causing it to switch automatically into flowing mode and push all of its data
to the attached [Writable][]. The flow of data will be automatically managed so
that the destination Writable stream is not overwhelmed by a faster Readable
stream.

The following example pipes all of the data from the `readable` into a file
named `file.txt`:

```js
const fs = require('fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt'
readable.pipe(writable);
```

It is possible to attach multiple Writable streams to a single Readable stream.

The `readable.pipe()` method returns a reference to the *destination* stream,
making it possible to set up chains of piped streams:

```js
const fs = require('fs');
const zlib = require('zlib');
const r = fs.createReadStream('file.txt');
const z = zlib.createGzip();
const w = fs.createWriteStream('file.txt.gz');
r.pipe(z).pipe(w);
```

By default, [`stream.end()`][stream-end] is called on the destination Writable
stream when the source Readable stream emits [`'end'`][], so that the
destination is no longer writable. To disable this default behavior, the `end`
option can be passed as `false`, causing the destination stream to remain open,
as illustrated in the following example:

```js
reader.pipe(writer, { end: false });
reader.on('end', () => {
  writer.end('Goodbye\n');
});
```

One important caveat is that if the Readable stream emits an error during
processing, the Writable destination *is not closed* automatically. If an
error occurs, it will be necessary to *manually* close each stream in order
to prevent memory leaks.

*Note*: The [`process.stderr`][] and [`process.stdout`][] Writable streams are
never closed until the Node.js process exits, regardless of the specified
options.

##### readable.read([size])
<!-- YAML
added: v0.9.4
-->

* `size` {number} Optional argument to specify how much data to read.
* Returns: {string|Buffer|null}

The `readable.read()` method pulls some data out of the internal buffer and
returns it. If no data is available to be read, `null` is returned. By default,
the data will be returned as a `Buffer` object unless an encoding has been
specified using the `readable.setEncoding()` method or the stream is operating
in object mode.

The optional `size` argument specifies a specific number of bytes to read. If
`size` bytes are not available to be read, `null` will be returned *unless*
the stream has ended, in which case all of the data remaining in the internal
buffer will be returned (*even if it exceeds `size` bytes*).

If the `size` argument is not specified, all of the data contained in the
internal buffer will be returned.

The `readable.read()` method should only be called on Readable streams operating
in paused mode. In flowing mode, `readable.read()` is called automatically until
the internal buffer is fully drained.

```js
const readable = getReadableStreamSomehow();
readable.on('readable', () => {
  let chunk;
  while (null !== (chunk = readable.read())) {
    console.log(`Received ${chunk.length} bytes of data.`);
  }
});
```

In general, it is recommended that developers avoid the use of the `'readable'`
event and the `readable.read()` method in favor of using either
`readable.pipe()` or the `'data'` event.

A Readable stream in object mode will always return a single item from
a call to [`readable.read(size)`][stream-read], regardless of the value of the
`size` argument.

*Note*: If the `readable.read()` method returns a chunk of data, a `'data'`
event will also be emitted.

*Note*: Calling [`stream.read([size])`][stream-read] after the [`'end'`][]
event has been emitted will return `null`. No runtime error will be raised.

##### readable.resume()
<!-- YAML
added: v0.9.4
-->

* Returns: `this`

The `readable.resume()` method causes an explicitly paused Readable stream to
resume emitting [`'data'`][] events, switching the stream into flowing mode.

The `readable.resume()` method can be used to fully consume the data from a
stream without actually processing any of that data, as illustrated in the
following example:

```js
getReadableStreamSomehow()
  .resume()
  .on('end', () => {
    console.log('Reached the end, but did not read anything.');
  });
```

##### readable.setEncoding(encoding)
<!-- YAML
added: v0.9.4
-->

* `encoding` {string} The encoding to use.
* Returns: `this`

The `readable.setEncoding()` method sets the default character encoding for
data read from the Readable stream.

Setting an encoding causes the stream data to be returned as strings of the
specified encoding rather than as `Buffer` objects. For instance, calling
`readable.setEncoding('utf8')` will cause the output data to be interpreted as
UTF-8 data and passed as strings. Calling `readable.setEncoding('hex')` will
cause the data to be encoded in hexadecimal string format.

The Readable stream will properly handle multi-byte characters delivered through
the stream that would otherwise become improperly decoded if simply pulled from
the stream as `Buffer` objects.

Encoding can be disabled by calling `readable.setEncoding(null)`. This approach
is useful when working with binary data or with large multi-byte strings spread
out over multiple chunks.

```js
const readable = getReadableStreamSomehow();
readable.setEncoding('utf8');
readable.on('data', (chunk) => {
  assert.equal(typeof chunk, 'string');
  console.log('got %d characters of string data', chunk.length);
});
```

##### readable.unpipe([destination])
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} Optional specific stream to unpipe

The `readable.unpipe()` method detaches a Writable stream previously attached
using the [`stream.pipe()`][] method.

If the `destination` is not specified, then *all* pipes are detached.

If the `destination` is specified, but no pipe is set up for it, then
the method does nothing.

```js
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt',
// but only for the first second.
readable.pipe(writable);
setTimeout(() => {
  console.log('Stop writing to file.txt');
  readable.unpipe(writable);
  console.log('Manually close the file stream');
  writable.end();
}, 1000);
```

##### readable.unshift(chunk)
<!-- YAML
added: v0.9.11
-->

* `chunk` {Buffer|string|any} Chunk of data to unshift onto the read queue

The `readable.unshift()` method pushes a chunk of data back into the internal
buffer. This is useful in certain situations where a stream is being consumed by
code that needs to "un-consume" some amount of data that it has optimistically
pulled out of the source, so that the data can be passed on to some other party.

*Note*: The `stream.unshift(chunk)` method cannot be called after the
[`'end'`][] event has been emitted or a runtime error will be thrown.

Developers using `stream.unshift()` should often consider switching to
use of a [Transform][] stream instead. See the [API for Stream Implementers][]
section for more information.

```js
// Pull off a header delimited by \n\n
// use unshift() if we get too much