readable-stream

Node.js core streams for userland

npm install readable-stream

This package is a mirror of the streams implementations in Node.js 18.19.0.

Full documentation may be found on the Node.js website.

If you want to guarantee a stable streams base, regardless of what version of Node you (or the users of your libraries) are using, use readable-stream only and avoid the "stream" module in Node core; for background, see this blog post.

As of version 2.0.0 readable-stream uses semantic versioning.

Version 4.x.x

v4.x.x of readable-stream is a cut from Node 18. This version supports Node 12, 14, 16 and 18, as well as evergreen browsers. The breaking changes introduced by v4 are composed of the combined breaking changes in:

This also includes many new features.

Version 3.x.x

v3.x.x of readable-stream is a cut from Node 10. This version supports Node 6, 8, and 10, as well as evergreen browsers, IE 11 and latest Safari. The breaking changes introduced by v3 are composed of the combined breaking changes in Node v9 and Node v10, as follows:

  1. Error codes: nodejs/node#13310, nodejs/node#13291, nodejs/node#16589, nodejs/node#15042, nodejs/node#15665, #344
  2. 'readable' have precedence over flowing nodejs/node#18994
  3. make virtual methods errors consistent nodejs/node#18813
  4. updated streams error handling nodejs/node#18438
  5. writable.end should return this. nodejs/node#18780
  6. readable continues to read when push('') nodejs/node#18211
  7. add custom inspect to BufferList nodejs/node#17907
  8. always defer 'readable' with nextTick nodejs/node#17979

Version 2.x.x

v2.x.x of readable-stream is a cut of the stream module from Node 8 (there have been no semver-major changes from Node 4 to 8). This version supports all Node.js versions from 0.8, as well as evergreen browsers and IE 10 & 11.

Usage

You can swap your require('stream') with require('readable-stream') without any changes, if you are just using one of the main classes and functions.

const {
  Readable,
  Writable,
  Transform,
  Duplex,
  pipeline,
  finished
} = require('readable-stream')

Note that require('stream') will return Stream, while require('readable-stream') will return Readable. We discourage using the default export directly; instead, use one of the named properties as shown in the example above.

Usage In Browsers

You will need a bundler like browserify, webpack, parcel or similar. Polyfills are no longer required since version 4.2.0.

Streams Working Group

readable-stream is maintained by the Streams Working Group, which oversees the development and maintenance of the Streams API within Node.js. The responsibilities of the Streams Working Group include:

  • Addressing stream issues on the Node.js issue tracker.
  • Authoring and editing stream documentation within the Node.js project.
  • Reviewing changes to stream subclasses within the Node.js project.
  • Redirecting changes to streams from the Node.js project to this project.
  • Assisting in the implementation of stream providers within Node.js.
  • Recommending versions of readable-stream to be included in Node.js.
  • Messaging about the future of streams to give the community advance notice of changes.

readable-stream's Issues

unpipe and on("data")

Imagine the following

var stream = Nice()

stream.on("data", function () {
    stream.close()
})

function Nice() {
    var readable = createReadStream(...)
        , writable = createWriteStream(...)

    readable.pipe(writable)

    readable.close = function () {
        readable.unpipe(writable)
        readable.emit("end")
    }

    return readable
}

If a user turns my streams2 module into "classic stream" mode by adding a data listener, should the new unpipe method still work as expected?

Problems with node v0.8.x + readable-stream v1.1.x + level-sublevel (latest)

I'm not exactly sure if this is a readable-stream issue or not, but bear with me.

levelup recently upgraded to readable-stream v1.1 and while everything seems hunky-dory on node v0.10.22, suddenly level-sublevel has started failing on node v0.8.26

I've narrowed it down to this line here: https://github.com/dominictarr/level-sublevel/blob/c3f2a5e554a422d5887e7f43842dfd821042020f/sub.js#L165

On node v0.10.22 a truthy val comes through so the if branch is executed, as it also is on readable-stream v1.0.x (regardless of node version) - but in node v0.8.26 w/ readable-stream 1.1.x there are no truthy vals at all.

Now unfortunately I don't know enough about streams to know whether this is a readable-stream issue, or whether it's just level-sublevel being a bit too tightly coupled with the old streams implementation.

But I thought I'd post this just in case.

Also issues filed here and here

Potential crash in flow() function

Hey Isaac, I was reading through readable.js just to get an idea of what is to come in the future of readable streams.

I think there may be a crashing bug in the flow function, but maybe I am misinterpreting the code.

The only time that the function flow is called, it is bound to null or the GLOBAL object.
(Lines: #251 in pipe(); #265, #299, in flow()).

In flow you call functions on this, and I think that this should be switched to src.

  if (this.listeners('data').length)
    emitDataEvents(this);

should become

  if (src.listeners('data').length )
    emitDataEvents(src);

Sorry if this is wrong, I haven't run any tests against it, just something that caught my eye.

PassThrough streams (but more importantly Transform streams) never emit an "end" event

test case:

var fs = require('fs');
var assert = require('assert');
var PassThrough = require('readable-stream/passthrough');

var p = new PassThrough();
var f = fs.createReadStream(__filename);
var pEnd = false;
var fEnd = false;
f.on('end', function () {
  console.error('f "end" event');
  fEnd = true;
});
p.on('end', function () {
  console.error('p "end" event');
  pEnd = true;
});
process.on('exit', function () {
  assert(fEnd, 'no fs.ReadStream "end" event');
  assert(pEnd, 'no Passthrough "end" event');
});
f.pipe(p);

Suggestion: Event names should be symmetric

If the event that signals you can read more data is 'readable', perhaps the event that signals you can write more data should be 'writable'. 'drain' doesn't sound like the counterpart for 'readable' and will therefore be harder to remember.

is 'end' event necessary?

Here is an idea: remove the end event; instead, just set this.ended = true

and then, in the pipe loop

function flow() {
  var chunk;
  var dest;
  // keep writing as long as there is a destination and data to read
  while ((dest = this._dest) && (chunk = this.read())) {
    var written = dest.write(chunk);
    if (false === written && this._dest) {
      // destination is saturated; resume once it drains
      this._dest.once('drain', flow.bind(this));
      return;
    }
  }
  // source is done: end the destination unless { end: false } was requested
  if (this.ended && dest && this._pipeOpt.end !== false)
    return dest.end();
  this.once('readable', flow);
}

I feel this maybe comes out a little bit clearer, and is more consistent with the vibe of this style.
you may still want an end event, but maybe propagate the end of the stream like this.

read/pause "deadlock"

It's possible for a series of reads to not consume the entire buffer, but for the last read to ask for more than what is left. This causes the last read to return null (signaling to wait for another 'readable' event) and the wrapped stream to pause (preventing another 'readable' event).

Interestingly, the program seems to exit early (with a 0 no less) once this happens. I haven't figured that part out yet.

This gist should demonstrate the issue:

https://gist.github.com/4316203

It seems like not pausing if needReadable === true could be the right thing to do, but I'm not very familiar with this code, so I don't know what other implications there are. I tried it with the test, and with the code that prompted this investigation, and they both work as I expect now. My change is here:

https://github.com/dannycoates/readable-stream/commit/e7482f1697136cea4e693978c87aef9202c9787a

Thanks 😸

Async _read() documentation problem?

I'm a bit confused by the manner in which we'd overwrite the _read method on the Readable to be async.

From _stream_readable.js line 478

// abstract method.  to be overridden in specific implementation classes.
// call cb(er, data) where data is <= n in length.
// for virtual (non-string, non-buffer) streams, "length" is somewhat
// arbitrary, and perhaps not very meaningful.
Readable.prototype._read = function(n) {
  this.emit('error', new Error('not implemented'));
};

But _read only receives n; there is no reference to a cb. Is there an option I might need to set somewhere to trigger an async mode, or am I just completely missing something? I'm finding it hard to come up with any info on async read streams.
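
For reference, here is a minimal sketch of how an asynchronous source is usually written against the later _read()/push() API: no callback is passed to _read; it only signals demand, and data is handed over with push() whenever it becomes ready (the setTimeout stands in for any async operation):

var Readable = require('readable-stream').Readable;

var r = new Readable();
var left = 3;

r._read = function (n) {
  // _read() only signals demand; satisfy it asynchronously via push()
  setTimeout(function () {
    r.push(left-- > 0 ? 'some data\n' : null); // push(null) ends the stream
  }, 100);
};

r.pipe(process.stdout);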

function api

var read = new Readable(function _read() { ... })
    , write = new Writable(function _write() { ... })
    , transform = new Transform(function _transform() { ... })

Or alternatively

var read = Stream.createReadable(_read)
    , write = Stream.createWritable(_write)
    , transform = Stream.createTransform(_transform)

These APIs are more in line with node core.
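
For what it's worth, later versions of Node and readable-stream added something close to this via constructor options (the "simplified construction" API); a minimal sketch:

var Readable = require('readable-stream').Readable;
var Writable = require('readable-stream').Writable;

var read = new Readable({
  read: function (n) { this.push('hi\n'); this.push(null); }
});

var write = new Writable({
  write: function (chunk, enc, cb) { console.log(String(chunk)); cb(); }
});

read.pipe(write);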

Export other types of stream

The readable-stream npm module currently only exports the Readable object; Duplex, Transform, Writable, PassThrough and the like are not available.

Many consumers of one stream

It's more a question than an issue.

I have the impression that one stream (as it is in the current/old v0.8 API) can be consumed by two separate pieces of listener logic, even ones not aware of each other (just by attaching listeners to the data and end events), and that makes sense to me (but maybe I'm missing something).

The new API is based on a different, pull-based model, and I think it would no longer be possible for one stream to be consumed by many; am I right? It seems a bit limiting; at least, I have use cases on my side where two independent parts of the logic are fed with the same stream.

What am I missing?

pipe broken

src.pipe(dest)

If dest emits close, error, or finish, the pipe is not cleaned up.

Meaning the src will continue writing to a broken stream.
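
For later readers: this class of problem is what the pipeline helper exported by current readable-stream (and shown in the README usage example above) is meant to solve; it tears the whole chain down on 'error'/'close' and reports the failure. A minimal sketch, with illustrative file names:

var fs = require('fs');
var pipeline = require('readable-stream').pipeline;

pipeline(
  fs.createReadStream('in.txt'),
  fs.createWriteStream('out.txt'),
  function (err) {
    // called exactly once; if either side errors or closes early, the other
    // stream is destroyed instead of being left writing into a broken pipe
    if (err) console.error('pipeline failed:', err);
    else console.log('done');
  }
);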

reading is too verbose

0.8

stream.on("data", function (chunk) { ... })

0.10

consume(stream, function (chunk) { ... })

function consume(stream, consumer) {
    flow()

    stream.on("readable", flow)

    function flow() {
        var chunk = stream.read()
        while (chunk !== null) {
            consumer(chunk)
            chunk = stream.read()
        }
    }
}

There is too much boilerplate when it comes to just consuming all of the data.

The alternative is to use pipe, which is still verbose:

var WriteStream = require("write-stream")

stream.pipe(WriteStream(function (chunk) { ... }))

Not to mention you have to create an entire new writable stream and invoke the pipe algorithm which is heavy.

Improved api

stream.read(function (chunk) { ... })

When .read() is called with a callback instead of a bytes count it should run the consume function above or something similar to it.

The main use-case is to make streams2 readable streams friendly to use.

For example

var http = require("http")

http.createServer(function (req, res) {
    var body = ""

    req.read(function (chunk) {
        body += chunk
    })

    req.on("end", function () {
        var data = JSON.parse(body)
        // do stuff
    })
})
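
Retrospective note: later stream versions ease this considerably. 'data' listeners keep working on streams2/3 objects, and, since Node 10, readable streams are async iterable, so the boilerplate above collapses roughly to the sketch below:

async function collect(stream) {
  let body = '';
  for await (const chunk of stream) {
    body += chunk;
  }
  return body;
}

With that, the request-body example becomes const body = await collect(req) inside an async handler.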

push or pull?

The core idea of this module is to make streaming a pull operation, rather than a push operation.
However, the current implementation of pipe (specifically, the internal function flow) mungs it back into a push operation.

Consider the metaphor of physical pipes carrying water. Naively you can get the water to move by increasing the pressure at one end, like a garden hose, or by decreasing the pressure at the other end, like with a drinking straw.

You could consider the old stream api to be a "SquirtStream" -- it is the pressure at the readable end that starts pushing data through the pipeline -- and in spirit, I think the new interface wants to be a "SuckStream": it is the reader that pulls data through.

Currently, ReadableStream#pipe is in a weird middle ground.

When piped, each segment is forced to accept data until it returns false.
i.e. piping causes each individual segment to start pulling.
This is not completely unreasonable, after all, that is how your esophagus and your veins work.
So, this is like a "SwallowStream".

So what I propose is to allow the writable stream to implement flow (one could default to dest.flow || flow, of course).
Then it would be possible to construct lazy pipelines where source.read() is never called until dest.read() is called:

readable.pipe(through).pipe(through2)

readable.read() is never called until through2.read() is called.

Currently, one chunk is read from readable, but through.write(data) === false.
Then that chunk is drained when through2 reads, so through reads again.
So readable has read two chunks, when actually neither of those reads was necessary yet.

Indeed, no reads should be performed until through2.pipe(writable).

But what if it could work more like this:

Through.prototype.read = function () {
  return this._source.read()
}

Also, it would enable writable streams to make use of the read(bytes) api, and read only 4 bytes, for example,
while keeping the composeable api of a pipable stream!

I'm not sure exactly how this would best be implemented, but I do feel that a SuckStream is conceptually much simpler than a SwallowStream.

Multiple pipes

As I understand the code and the main principle behind .read(bytes), a ReadStream can only be piped to a single location. This makes the new API limiting compared to the old one.

Example:

var fork = require('child_process').fork;
var fs = require('fs');

var c = fork('child.js', { silent: true });

c.stdout.pipe(process.stdout); // no problem
c.stdout.pipe(fs.createWriteStream('./log.txt')); // the old pipe is broken

flow on nextTick

If pipe called flow on process.nextTick, then we could pipe a single source to multiple destinations without worrying about whether the first pipe will empty the current buffer of that readable stream (i.e. run its while loop and consume all the chunks of data through source.read).

pipe is destructive

The previous pipe added a non-destructive event listener.

The current pipe destructively pulls out a lot of chunks from a readable stream.

This is a significant change of semantics and will probably break things.

This can again be mitigated in part by making pipe call flow on process.nextTick.

Pipe - end race condition

Consider the following

var readableThing = createReadable()

readableThing.pipe(someWritable)

readableThing.pushOntoInternalBuffer(one)
readableThing.pushOntoInternalBuffer(two)
readableThing.emit("readable")
readableThing.end()

Here we have a readable stream piping into a writable. Imagine the writable returned false so that the pipe has to wait on drain.

Then the readable thing has ended so it emits end and calls dest.end() in pipe.

Later the writable destination emits drain, the pipe flow picks back up, and it starts writing the second thing (two) to the destination.

This means we have a write onto an ended writable stream.

pipe followed by on data

stream.pipe(otherStream)

stream.on("data", callback)

In this situation we have a while read loop in the first pipe.

Then because we added a data listener we have a second while read loop in the on readable callback.

This sounds like the on("data") hack should use pipe internally to have the correct multiple-destination semantics.

transform end

var fs = require('fs')
var Transform = require('readable-stream').Transform

var t = new Transform()
t._transform = function (chunk, output, cb) {
    // pretend chunk is a filename
    fs.readFile(chunk, cb)
}
thing.pipe(t).pipe(otherThing)

It would be nice if end had the (err, data) signature so that it's compatible with any callback.

The motivation is to make it easy to use asynchronous callbacks as your transformation.

_flowing is never set to true

Either get rid of _flowing or set it to true when the stream is actually flowing.

Currently multiple pipes in the same tick set up multiple calls to flow on the next tick.

All tests fail

While almost all tests passed in v1.0.17, no tests pass on the latest version.

The tests that existed in 1.0.17, don't pass because of b88f50c, where the EE.listenerCount method definition was removed and the include paths modified.

More tests were added in bf2f69a, where setImmediate isn't defined and require('stream') is used instead of a local require call.

Is any fix on the roadmap?

readable cleanup

I have issues with there not being automatic readable stream cleanup, such as a method or event I can hook into when the writable that a readable is piped to ends.

Steps to reproduce:

leaks.js:

var http = require('http');
var Readable = require('readable-stream').Readable;

http.createServer(function(req, res){

  var stream = Readable();

  stream._read = function(){
    console.log('_read', Date.now());
    setTimeout(function(){
      stream.push(Date.now()+'\n');
    }, 100);
  };

  stream.pipe(res);

}).listen(3000);

fixed.js:

var http = require('http');
var Readable = require('readable-stream').Readable;

http.createServer(function(req, res){

  var stream = Readable();

  stream._read = function(){
    console.log('_read', Date.now());
    if (stream.closed) return stream.push(null);
    setTimeout(function(){
      stream.push(Date.now()+'\n');
    }, 100);
  };

  stream.close = function(){
    console.log('close');
    stream.closed = true;
  };

  req.on('close', function(){
    stream.close();
  });

  stream.pipe(res);

}).listen(3000);

  1. Run node leaks.js
  2. Issue a request to it, for example via curl http://localhost:3000
  3. Watch leaks.js repeatedly print _read <Date> to the console while the request is active
  4. Kill the request by killing curl
  5. leaks.js still keeps printing _read <Date>. The readable stream is still being read although the request ended

Fixed in fixed.js by implementing and calling a custom closing mechanism. The fact that we have to call the .close() manually means that if you just fs.createReadStream("/dev/random").pipe(res) you create a leak too.
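
Later stream versions formalise this teardown: the finished() helper (exported by readable-stream, as in the README usage example) observes the response ending or the client disconnecting, and destroy() stops the source. A sketch, reusing the stream and res variables from leaks.js above:

var finished = require('readable-stream').finished;

stream.pipe(res);

// when the response finishes, errors, or the client disconnects,
// stop producing data and release the source
finished(res, function (err) {
  stream.destroy();
});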

warning: possible EventEmitter memory leak detected

I am trying to write what is essentially a tool that concats multiple streams into a single readable stream. I tried to use a PassThrough stream for this, where I simply inputStreamN.pipe(passthroughStream, {end:false}) the input streams one after the other into the PassThrough stream.

This works as intended; however, after a few input streams are processed, I keep getting EventEmitter memory leak warnings.

As far as I can tell this is caused in the Readable class which, when unpiped, is not removing listeners for anything but the 'drain' event.
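
For anyone hitting the same warning: one pattern that keeps the listener count bounded is to pipe the inputs strictly one at a time and unpipe each input when it ends, only then attaching the next. A minimal sketch (names are illustrative, error handling omitted):

var PassThrough = require('readable-stream').PassThrough;

function concat(inputs) {
  var out = new PassThrough();

  function next() {
    var src = inputs.shift();
    if (!src) return out.end();      // no more inputs: finish the output
    src.pipe(out, { end: false });   // don't end `out` when this input ends
    src.on('end', function () {
      src.unpipe(out);               // detach the listeners pipe() added
      next();                        // move on to the following input
    });
  }

  next();
  return out;
}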

Transform stream: Write data to socket

Hello,

I am working on a WebSocket implementation (mainly for learning purposes) whose core component handles receiving and sending WebSocket frames. To decode and encode frames I now want to write a Stream which is attached to the socket object that is provided with an upgrade event of an httpServer.

The reading (decoding) part so far is actually straightforward: pipe the data from the socket object into my decoder stream and you get the ready-to-use data in string form.

What currently breaks my head is the writing part, where I have to encode a string into a byte frame with some headers. The task itself is also quite easy; the main issue I am fighting with is that I cannot find the right API to do this. I cannot easily just pipe the write events to the connection socket; I always have to keep a reference to the socket object in the stream so I can write directly onto that. This does not feel right at all.

Why can't I handle outgoing writes in the same way as incoming (data events)?
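
One conventional way to get this shape without holding a raw socket reference inside the parser is to split it into two Transform streams, a decoder for inbound frames and an encoder for outbound ones, and wire the socket to both; a rough sketch, where decodeFrame() and encodeFrame() are hypothetical stand-ins for the frame logic:

var Transform = require('readable-stream').Transform;

// decoder: socket bytes in -> decoded messages out
var decoder = new Transform();
decoder._transform = function (chunk, enc, cb) {
  cb(null, decodeFrame(chunk));   // decodeFrame() is the user's frame parser (hypothetical)
};

// encoder: outgoing messages in -> framed bytes out
var encoder = new Transform();
encoder._transform = function (chunk, enc, cb) {
  cb(null, encodeFrame(chunk));   // encodeFrame() is the user's frame builder (hypothetical)
};

socket.pipe(decoder);   // inbound traffic flows through the decoder
encoder.pipe(socket);   // everything written to the encoder ends up on the socket

decoder.on('data', function (message) {
  encoder.write('hello world');
});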

Example how I would like to use a stream:

// overwrite this method to transform incoming data (data which getting read by listeners)
WebSocketParser._transformIncoming = function(chunk, outputFn, callback) {
    var frame = new Frame(chunk);

    var data = frame.getDecodedPayload();

    if (frame.getOpcode() == 0x1) {
        data = data.toString();
    }

    outputFn(data);
    callback();
};

// overwrite this method to transform outgoing data (data which is sent back to the client)
WebSocketParser._transformOutgoing = function(data, outputFn, callback) {
    var frame = new Frame();

    frame.setFinal(true);
    frame.setMasked(false);

    if (typeof data == 'string') {
        frame.setPayload(new Buffer(data));
    } else {
        frame.setPayload(data);
    }

    outputFn(frame.toBuffer());
    callback();
};

var parser = new WebSocketParser();

parser.on('data', function(message) {
    console.log('received message: ', message);

    parser.write('hello world');
});

httpServer.on('upgrade', function(req, socket, head) {
    upgradeHandler(req, socket, function(socket) {
        // this will pipe all operations read and write
        socket.pipe(parser);
    });
});

Would this be an improvement or am I just misunderstanding streams?

Current implementation:
https://github.com/bodokaiser/websockets/blob/master/lib/stream.js

Kind Regards,
Bodo Kaiser

Keep in sync with current Node stable release

Would it be possible to have some kind of auto (or each) sync mechanism so that this lib could be kept in sync with the current Node stable release version of the same?

I've been using 'readable-stream' in my libs so I can maintain 0.8 compatibility, but I've just been alerted to the fact that it's not kept up to date (I didn't know this), which means an ugly require('stream').X check, and anyone on 0.8 misses out on any fixes & updates.

Readable doesn't "look like" old-style streams

If I understand correctly, the goal is that Readable should be completely backward-compatible with existing code that expects an "old-style" readable-stream.

The problem is that: as currently implemented myReadable.readable !== true, until after a data event is added (or pause() / resume() is called).

In contrast, old-style readable-streams set stream.readable = true from the beginning, in order to advertise that the object implements the readable-stream interface; and they set stream.readable = false when the end event arrives.

Some existing code uses the stream.readable property to verify that a given instance is, in fact, a non-ended readable stream before attempting to add a data listener to it. In other words, the following code is not unheard-of:

function doSomething(stream) {
  if (!stream.readable) {
    throw new Error("The provided object is not a readable stream");
  }
  stream.on('data', function(chunk) { ... });
}

Of course, code like this won't work with an instance of the new Readable, because stream.readable == undefined until after the data listener is added; but the check is done, and the exception thrown, before the listener is ever added.

here is an example of this "in the wild" (aheckmann/gm)

`read` should allow returning falsy values

var s = new Readable()
    , list = [0,1,2,3]

s.read = function(bytes) {
    var chunk = list.shift()
    if (chunk === undefined) {
        s.emit("end")
        return null
    }
    return chunk
}

s.pipe(stringer).pipe(process.stdout)

Won't work, as s returns 0 as one of the read return values. The flow of readable-stream should check explicitly for the value null and not for any falsy value.
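
In the _read()/push() model that the API eventually settled on, null is the only end-of-stream sentinel, so a falsy chunk like 0 is unambiguous as long as the stream is in object mode. A sketch:

var Readable = require('readable-stream').Readable;

var list = [0, 1, 2, 3];
var s = new Readable({ objectMode: true });

s._read = function () {
  // push(null) is the only end-of-stream signal; 0 is a perfectly valid chunk
  this.push(list.length ? list.shift() : null);
};

s.on('data', function (chunk) {
  console.log('chunk:', chunk);   // logs 0, 1, 2, 3
});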

crash: TypeError: Cannot set property 'decodeStrings' of undefined

var assert = require('assert');
var http = require('http');
var net = require('net');
var stream = require('readable-stream');

var server = http.createServer(function (req, res) {
  res.end('ok');
}).listen(0, 'localhost', function () {
  var client = net.connect(server.address().port);
  client.write(
    "POST / HTTP/1.1\r\n" +
    "Content-Length: 70\r\n" +
    "Content-Type: multipart/form-data; boundary=foo\r\n\r\n");
  client.end();
});

$ node test.js
net.js:177
  this._writableState.decodeStrings = false;
                                    ^
TypeError: Cannot set property 'decodeStrings' of undefined
    at new Socket (net.js:177:37)
    at Object.exports.connect.exports.createConnection (net.js:93:11)
    at Server.<anonymous> (/home/andy/tmp/npmtest/testmocha.js:9:20)
    at Server.g (events.js:175:14)
    at Server.EventEmitter.emit (events.js:92:17)
    at net.js:1052:10
    at process._tickCallback (node.js:415:13)
$ node -v
v0.10.18

See also mochajs/mocha#969

This bug was introduced in version 1.1.8. It does not exist in 1.1.7.

Standardized `close`

close on a readable stream means "I do not want to read anymore from you. please terminate"

close should be followed by emptying the entire internal buffer and then emitting "end"

This gives users control: they know they can tell a stream to clean up its internal buffer and any other resources. It also means a user can tell the stream to release any related connections or file descriptors.
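
What eventually landed in core along these lines is destroy(): readable.destroy() releases underlying resources via the _destroy hook and emits 'close', although it discards the internal buffer rather than flushing it as 'end'. A sketch:

var Readable = require('readable-stream').Readable;

var r = new Readable({
  read: function () {},
  destroy: function (err, cb) {
    // release file descriptors, connections, etc. here
    cb(err);
  }
});

r.on('close', function () { console.log('resources released'); });
r.destroy();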

piping into a finished writer

var dest = Writable()

source.pipe(dest)

// dest finishes

// Later
source2.pipe(dest)

Currently it will call write which throws a write after end error.

Are there any valid edge cases where this can happen?

Is there any way to know whether a dest has finished, so that you can avoid piping into it?
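
In later stream versions there are direct answers: the 'finish' event (and writable.writableFinished) tells you a writer has ended, and the finished() helper from readable-stream calls back once it does. A guard can look roughly like this sketch (dest and source2 as in the snippet above):

var finished = require('readable-stream').finished;

var destDone = false;
finished(dest, function (err) {
  destDone = true;   // dest has finished (or errored); don't pipe into it again
});

// later
if (!destDone) {
  source2.pipe(dest);
}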

NPM

0.0.4 needs to be on npm.

Please publish it.

resume / pause

stream.pause()

Fails if the stream is a Readable stream.

Checksum error ?

Hi,

I have a checksum error while installing yeoman

0 info it worked if it ends with ok
1 verbose cli [ 'C:\\Program Files\\nodejs\\\\node.exe',
1 verbose cli   'C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js',
1 verbose cli   'install',
1 verbose cli   'readable-stream' ]
2 info using npm@1.3.24
3 info using node@v0.10.25
4 verbose node symlink C:\Program Files\nodejs\\node.exe
5 verbose cache add [ 'readable-stream', null ]
6 verbose cache add name=undefined spec="readable-stream" args=["readable-stream",null]
7 verbose parsed url { protocol: null,
7 verbose parsed url   slashes: null,
7 verbose parsed url   auth: null,
7 verbose parsed url   host: null,
7 verbose parsed url   port: null,
7 verbose parsed url   hostname: null,
7 verbose parsed url   hash: null,
7 verbose parsed url   search: null,
7 verbose parsed url   query: null,
7 verbose parsed url   pathname: 'readable-stream',
7 verbose parsed url   path: 'readable-stream',
7 verbose parsed url   href: 'readable-stream' }
8 silly lockFile c476f42d-readable-stream readable-stream
9 verbose lock readable-stream C:\Users\s586883\AppData\Roaming\npm-cache\c476f42d-readable-stream.lock
10 silly lockFile c476f42d-readable-stream readable-stream
11 silly lockFile c476f42d-readable-stream readable-stream
12 verbose addNamed [ 'readable-stream', '' ]
13 verbose addNamed [ null, '*' ]
14 silly lockFile acade489-readable-stream readable-stream@
15 verbose lock readable-stream@ C:\Users\s586883\AppData\Roaming\npm-cache\acade489-readable-stream.lock
16 silly addNameRange { name: 'readable-stream', range: '*', hasData: false }
17 verbose url raw readable-stream
18 verbose url resolving [ 'http://registry.npmjs.org/', './readable-stream' ]
19 verbose url resolved http://registry.npmjs.org/readable-stream
20 info trying registry request attempt 1 at 18:02:12
21 verbose etag "DAE27EQLX3F3INP2VQNGUIHRZ"
22 http GET http://registry.npmjs.org/readable-stream
23 http 304 http://registry.npmjs.org/readable-stream
24 silly registry.get cb [ 304,
24 silly registry.get   { date: 'Wed, 05 Feb 2014 17:02:13 GMT',
24 silly registry.get     'last-modified': 'Wed, 05 Feb 2014 17:02:13 GMT',
24 silly registry.get     'cache-control': 'max-age=1',
24 silly registry.get     etag: '"DAE27EQLX3F3INP2VQNGUIHRZ"',
24 silly registry.get     'x-served-by': 'cache-fra1229-FRA',
24 silly registry.get     'x-cache': 'HIT',
24 silly registry.get     'x-cache-hits': '1',
24 silly registry.get     'x-timer': 'S1391619733.433038235,VS0,VE93',
24 silly registry.get     vary: 'Accept',
24 silly registry.get     via: '1.1 varnish, 1.1 prafaavadm1.axa-fr.intraxa:8080 (IronPort-WSA/7.1.3-013), 1.1 prafapxyadm1.siege.axa-fr.intraxa:8080 (IronPort-WSA/7.1.3-013)',
24 silly registry.get     connection: 'keep-alive',
24 silly registry.get     'proxy-connection': 'keep-alive' } ]
25 verbose etag readable-stream from cache
26 silly addNameRange number 2 { name: 'readable-stream', range: '*', hasData: true }
27 silly addNameRange versions [ 'readable-stream',
27 silly addNameRange   [ '0.0.1',
27 silly addNameRange     '0.0.2',
27 silly addNameRange     '0.0.3',
27 silly addNameRange     '0.0.4',
27 silly addNameRange     '0.1.0',
27 silly addNameRange     '0.2.0',
27 silly addNameRange     '0.3.0',
27 silly addNameRange     '0.3.1',
27 silly addNameRange     '1.0.0',
27 silly addNameRange     '1.0.1',
27 silly addNameRange     '1.0.2',
27 silly addNameRange     '1.0.15',
27 silly addNameRange     '1.0.17',
27 silly addNameRange     '1.1.7',
27 silly addNameRange     '1.1.8',
27 silly addNameRange     '1.1.9',
27 silly addNameRange     '1.0.24',
27 silly addNameRange     '1.1.10',
27 silly addNameRange     '1.0.25',
27 silly addNameRange     '1.0.25-1' ] ]
28 verbose addNamed [ 'readable-stream', '1.1.10' ]
29 verbose addNamed [ '1.1.10', '1.1.10' ]
30 silly lockFile 987e367e-readable-stream-1-1-10 readable-stream@1.1.10
31 verbose lock readable-stream@1.1.10 C:\Users\s586883\AppData\Roaming\npm-cache\987e367e-readable-stream-1-1-10.lock
32 silly lockFile 7cfb0470-tream-readable-stream-1-1-10-tgz http://registry.npmjs.eu/readable-stream/-/readable-stream-1.1.10.tgz
33 verbose lock http://registry.npmjs.eu/readable-stream/-/readable-stream-1.1.10.tgz C:\Users\s586883\AppData\Roaming\npm-cache\7cfb0470-tream-readable-stream-1-1-10-tgz.lock
34 verbose addRemoteTarball [ 'http://registry.npmjs.eu/readable-stream/-/readable-stream-1.1.10.tgz',
34 verbose addRemoteTarball   'd4dc2e5319e9c90d1e71c69390ef62cd90827f65' ]
35 info retry fetch attempt 1 at 18:02:13
36 verbose fetch to= C:\Users\s586883\AppData\Local\Temp\npm-3480-ZN9xeuP2\1391619733277-0.5848892508074641\tmp.tgz
37 http GET http://registry.npmjs.eu/readable-stream/-/readable-stream-1.1.10.tgz
38 http 200 http://registry.npmjs.eu/readable-stream/-/readable-stream-1.1.10.tgz
39 silly lockFile 7cfb0470-tream-readable-stream-1-1-10-tgz http://registry.npmjs.eu/readable-stream/-/readable-stream-1.1.10.tgz
40 silly lockFile 7cfb0470-tream-readable-stream-1-1-10-tgz http://registry.npmjs.eu/readable-stream/-/readable-stream-1.1.10.tgz
41 silly lockFile 987e367e-readable-stream-1-1-10 readable-stream@1.1.10
42 silly lockFile 987e367e-readable-stream-1-1-10 readable-stream@1.1.10
43 silly lockFile acade489-readable-stream readable-stream@
44 silly lockFile acade489-readable-stream readable-stream@
45 error Error: shasum check failed for C:\Users\s586883\AppData\Local\Temp\npm-3480-ZN9xeuP2\1391619733277-0.5848892508074641\tmp.tgz
45 error Expected: d4dc2e5319e9c90d1e71c69390ef62cd90827f65
45 error Actual:   d40a009ef90e795283653b2fb8e601a26d83113b
45 error From:     http://registry.npmjs.eu/readable-stream/-/readable-stream-1.1.10.tgz
45 error     at C:\Program Files\nodejs\node_modules\npm\node_modules\sha\index.js:38:8
45 error     at ReadStream.<anonymous> (C:\Program Files\nodejs\node_modules\npm\node_modules\sha\index.js:85:7)
45 error     at ReadStream.EventEmitter.emit (events.js:117:20)
45 error     at _stream_readable.js:920:16
45 error     at process._tickCallback (node.js:415:13)
46 error If you need help, you may report this *entire* log,
46 error including the npm and node versions, at:
46 error     <http://github.com/isaacs/npm/issues>
47 error System Windows_NT 6.1.7601
48 error command "C:\\Program Files\\nodejs\\\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js" "install" "readable-stream"
49 error cwd Z:\jhipster
50 error node -v v0.10.25
51 error npm -v 1.3.24
52 verbose exit [ 1, true ]

transformation streams ending early

source
    .pipe(Transform(function (chunk, value, callback) {
        if (someCondition) {
            // I decide I don't need to read the rest of the stream
            this.end()
        }

        ...
    }))
    .pipe(dest)

This has a few issues.

One, you can't call end multiple times safely in Writable. This is easier to handle because you can just add an if (someCondition && !ended) guard.

Two, the source will eventually call end anyway, which is very ugly to handle. You could do source.unpipe(this), but that requires the Transform stream to have access to the source, which can't be done cleanly from a module without passing the source in; that feels silly because multiple sources can pipe through it, so how do you know which one to end?

It could listen to the pipe event and keep track itself and unpipe everything.

Three, if a destination stream unpipes itself, it needs to keep track of its drain status and emit drain to avoid the flow loop being stuck forever.

Four, should we somehow make this use case easier, or is it a userland module?

Making fs.createReadStream work in 0.8 and 0.10

I have some code that works in 0.10. I create a stream with fs.createReadStream and then assign handlers for 'readable', 'end' and 'error' events.

I was under the impression that readable-stream would let me make my code be backwards compatible with 0.8, but I am unsure how I use it.

Here is a small example

var fs = require('fs');

var stream = fs.createReadStream('test.txt');

stream.on('readable', function () {
    console.log('The stream became readable');

    var chunk = stream.read(10);

    while (chunk !== null) {
        console.log('Read the next 10 bytes of the file. They are "' + chunk.toString() + '"');
        chunk = stream.read(10);
    }
});

stream.on('error', function () {
    console.log('There was an error');
});

stream.on('end', function () {
    console.log('The stream ended');
});

This code works in 0.10 but not 0.8, as expected. If I replace

var fs = require('fs');

with

var fs = require('readable-stream/fs');

I get the same error in both 0.8 and 0.10:

/my/project/path/node_modules/readable-stream/fs.js:1510
    cb(null, b);
    ^
TypeError: undefined is not a function
    at onread (/my/project/path/node_modules/readable-stream/fs.js:1510:5)
    at Object.wrapper [as oncomplete] (/my/project/path/node_modules/readable-stream/fs.js:420:17)

Is there any documentation that explains what I should be doing here?
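
The usual answer for this situation is Readable#wrap(), which exists precisely to adapt an old 0.8-style stream to the streams2 interface. A sketch:

var fs = require('fs');
var Readable = require('readable-stream').Readable;

// wrap the old-style (0.8) fs stream in a streams2 Readable
var stream = new Readable().wrap(fs.createReadStream('test.txt'));

stream.on('readable', function () {
  var chunk;
  while ((chunk = stream.read(10)) !== null) {
    console.log('Read up to 10 bytes: "' + chunk.toString() + '"');
  }
});

stream.on('end', function () {
  console.log('The stream ended');
});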

emitting end inside read

var s = new Readable()
s.read = function () {
    if (thing) { return value }
    this.emit("end")
}

Are streams allowed to emit end inside the read function?

Does pipe handle this elegantly?

Reading until terminator makes for a bad day

Doing something equivalent to reading a null terminated string is horrifyingly hard to do right now if you want to read after it. Otherwise replays must be used or intermediate streams for buffering etc.

Imagine a structure with a null terminated string followed by a single byte:

Currently (holy crap allocations / problems):

var str = new Buffer(0);
var c;
//
// Must read one at a time so that read(x) works after the null
// without doing screwy replays
//
while ((c = rstream.read(1)) !== null && c[0] !== 0x00) {
  str = Buffer.concat([str, c]);
}
//
// inclusive i guess?
//
str = Buffer.concat([str, new Buffer([0])]);
//
// Now read(x) still works right without replays!
//
rstream.read(1);

A nice to have would be:

rstream.readUntil(0x00);
rstream.read(1);
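
One way to build something like readUntil today is with readable.unshift(), which exists for exactly this "read a header up to a terminator, put the rest back" case. A rough sketch, assuming Buffer chunks and omitting error handling:

// read from `rstream` until `byte` is seen (inclusive); put any excess back
function readUntil(rstream, byte, callback) {
  var parts = [];

  function onReadable() {
    var chunk;
    while ((chunk = rstream.read()) !== null) {
      for (var i = 0; i < chunk.length; i++) {
        if (chunk[i] === byte) {
          parts.push(chunk.slice(0, i + 1));        // keep up to and including the terminator
          var rest = chunk.slice(i + 1);
          if (rest.length) rstream.unshift(rest);   // put the remainder back for later read()s
          rstream.removeListener('readable', onReadable);
          return callback(null, Buffer.concat(parts));
        }
      }
      parts.push(chunk);                            // no terminator yet, keep accumulating
    }
  }

  rstream.on('readable', onReadable);
  onReadable();
}

// usage: readUntil(rstream, 0x00, function (err, str) { var next = rstream.read(1); });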

Usecase: How to create readable stream for redis subscribe?

I was looking at the new readable stream documentation and it said that we have to implement the _read() method in order to implement a readable stream.

But I have a problem.

I want to create a readable stream whose source is redis subscribe(). How can I implement a readable stream for this with the new API?

Do I need to buffer incoming data from redis?
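
A common shape for this kind of push-based source is an object-mode Readable whose _read is a no-op and which push()es each message as it arrives. Sketched below against the classic node_redis subscribe API; the exact client calls are assumptions, so adapt them to your redis client:

var redis = require('redis');   // assumed: classic node_redis API
var Readable = require('readable-stream').Readable;

function subscribeStream(channel) {
  var client = redis.createClient();
  var stream = new Readable({ objectMode: true });

  // the source is push-based, so _read has nothing to do
  stream._read = function () {};

  client.subscribe(channel);
  client.on('message', function (ch, message) {
    // note: push() returning false means the consumer is falling behind;
    // redis pub/sub has no backpressure, so messages buffer in memory
    stream.push(message);
  });

  return stream;
}

subscribeStream('news').on('data', function (message) {
  console.log('got', message);
});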
