readArray has a problem with stream items? #52

Closed
Janpot opened this Issue Feb 17, 2014 · 9 comments

Comments

@Janpot

Janpot commented Feb 17, 2014

Trying to work with a stream of streams:

es.readArray([
  es.readArray([1, 2, 3]),
  es.readArray([4, 5, 6]),
  es.readArray([7, 8])
]).on('data', function (stream) {
  stream.pipe(process.stdout);
});

I would expect this to print 1 through 8 but I get nothing. Could this be an issue with the library or am I thinking about this wrong?

@dominictarr

Owner

dominictarr commented Feb 17, 2014

oh, readArray is not intended to be more than a simple way to create a stream from an array for testing. What you are looking for is a stream concatenator, or stream flattener.

try this one: https://www.npmjs.org/package/stream-stream
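
Roughly, that would look something like this (a sketch going off the stream-stream README, not run here; I've switched the numbers to strings so process.stdout can write them):

var es = require('event-stream');
var ss = require('stream-stream');

// a writable stream of streams: each stream written to it is drained
// in order and its output concatenated into one readable stream
var combined = ss();

combined.write(es.readArray(['1', '2', '3']));
combined.write(es.readArray(['4', '5', '6']));
combined.write(es.readArray(['7', '8']));
combined.end();

combined.pipe(process.stdout); // 12345678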

@Janpot

Janpot commented Feb 17, 2014

Well, now that you bring it up, this code is part of the unit tests I'm writing for my stream flattener.

@Janpot

Janpot commented Feb 18, 2014

So I came to your library to save myself some time and avoid writing low-level streaming stuff myself. Instead I ended up spending hours chasing weird bugs once I started adding more and more edge-case tests to my application.

Take for instance:

var es = require('event-stream');

var stream = es.through();

['a', 'b', 'c'].forEach(function (chunk) {
  stream.queue(chunk);
});
stream.queue(null);

console.log('stream still readable: %s', stream.readable);

stream.pipe(es.wait(function (error, result) {
  if (error) {
    console.log('error: %s', error.message);
  } else {
    console.log('result: %s', result);
  }
}));

I expect this to print:

stream still readable: true
result: abc

instead I get:

stream still readable: false

I ended up implementing the low-level streaming stuff myself and finally have it working. I came to love Node streams, but this whole streams1, streams2 and streams3 stuff is so confusing.

@dominictarr

Owner

dominictarr commented Feb 19, 2014

you need to use stream.pause() and then stream.resume() after you pipe it.
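
Applied to your snippet above, only the ordering of the calls changes (a minimal sketch):

var es = require('event-stream');

var stream = es.through();
stream.pause(); // buffer anything queued from here on

['a', 'b', 'c'].forEach(function (chunk) {
  stream.queue(chunk);
});
stream.queue(null);

stream.pipe(es.wait(function (error, result) {
  if (error) {
    console.log('error: %s', error.message);
  } else {
    console.log('result: %s', result); // result: abc
  }
}));

stream.resume(); // flush the buffered chunks into the pipe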

@dominictarr

Owner

dominictarr commented Feb 19, 2014

or pipe it before you write the data to it.
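
In other words, something like this:

var es = require('event-stream');

var stream = es.through();

// attach the destination first, then write
stream.pipe(es.wait(function (error, result) {
  console.log('result: %s', result); // result: abc
}));

['a', 'b', 'c'].forEach(function (chunk) {
  stream.queue(chunk);
});
stream.queue(null);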

@Janpot

Janpot commented Feb 19, 2014

That was indeed my first quick fix. But I'm handing a duplex stream over to the users of my library as the argument of a function, so I have no control over when they start writing to it or when they start reading from it. With what you propose, I'd have to require them to call stream.resume() whenever they're ready to use it. Instead, I wrote a proper stream that starts paused and only starts streaming when _read is called for the first time.
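
Roughly like this (a stripped-down streams2 sketch, not my actual code, just to show that _read only fires once a consumer attaches):

var Readable = require('stream').Readable;

var chunks = ['a', 'b', 'c'];
var readable = new Readable();

readable._read = function () {
  // _read is only invoked once somebody pipes or reads, so it doesn't
  // matter when the consumer shows up
  var chunk = chunks.shift();
  this.push(chunk !== undefined ? chunk : null); // push(null) ends the stream
};

// the consumer can attach whenever it likes
readable.pipe(process.stdout); // abc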

By the way, your documentation states the following.

through takes care of pause/resume logic if you use this.queue(data) instead of this.emit('data', data).

According to that there is no reason to use pause() or resume(). It seems the bug here is that the stream is prematurely ended when queue(null) is called before .pipe.

@dominictarr

Owner

dominictarr commented Feb 19, 2014

Hmm, generally you don't write to a stream before it's piped. through is a very popular library that is 2 years old and no one has complained about this before. The simplest way would be to just overwrite pipe on your particular stream so it starts off paused and resumes when pipe is called. Or you can just use streams2, which it sounds like you are doing now.
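
Something along these lines (an untested sketch of the pipe-override idea):

var es = require('event-stream');

function pausedThrough() {
  var stream = es.through();
  stream.pause(); // buffer queued data until someone pipes
  var pipe = stream.pipe;
  stream.pipe = function (dest, opts) {
    var out = pipe.call(this, dest, opts);
    this.resume(); // flush the buffer now that a destination is attached
    return out;
  };
  return stream;
}

var stream = pausedThrough();
['a', 'b', 'c'].forEach(function (chunk) { stream.queue(chunk); });
stream.queue(null);
stream.pipe(process.stdout); // abc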

It sounds like you are actually creating a readable stream (does the user write to your stream, or just read from it?). I wrote a different base class for creating that, called from. through was not intended for this use case, but from was.
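
For example, something like this (a sketch based on the from README; the getChunk(count, next) signature is taken from there, not verified against the current version):

var from = require('from');

// from calls getChunk repeatedly; count is how many times it has run
var stream = from(function getChunk(count, next) {
  this.emit('data', 'block ' + count + '\n');
  if (count >= 2) this.emit('end');
  next();
});

stream.pipe(process.stdout);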

@Janpot

Janpot commented Feb 19, 2014

It's a duplex stream. I'm writing a gulp plugin that parses HTML files and extracts build blocks that can be replaced by user-provided content. The user receives these blocks in a callback; they are represented as duplex streams. Read from one to get the current lines in the block, write to it to replace the content with your own (https://github.com/Janpot/gulp-htmlbuild/blob/streaming-api/lib/builder.js). But I have it working now.

So if I get this right, then this library is not streams2 compliant? Even though the 'through' README claims to be.

@dominictarr

Owner

dominictarr commented Feb 19, 2014

well, it's a correct streams1 stream; it's streams2 that is backwards compatible.

@right9ctrl closed this Oct 13, 2018
