readArray has a problem with stream items? #52
Comments
oh, readArray is not intended to be more than a simple way to create a stream from an array for testing. What you are looking for is a stream concatenator, or stream flattener. Try this one: https://www.npmjs.org/package/stream-stream
Janpot commented Feb 17, 2014
Well, now that you bring it up, this code is part of the unit tests I'm writing for my stream flattener.
Janpot commented Feb 18, 2014
So I came to your library to save myself some time, so I wouldn't have to build low-level streaming stuff. Instead I spent hours chasing weird bugs once I started adding more and more edge-case tests to my application. Take for instance:

```js
var es = require('event-stream');

var stream = es.through();
['a', 'b', 'c'].forEach(function (chunk) {
  stream.queue(chunk);
});
stream.queue(null);

console.log('stream still readable: %s', stream.readable);

stream.pipe(es.wait(function (error, result) {
  if (error) {
    console.log('error: %s', error.message);
  } else {
    console.log('result: %s', result);
  }
}));
```

I expect this to print:

instead I get:

I ended up implementing the low-level streaming stuff myself and finally have it working. I came to love node streams, but this whole
you need to use
or pipe it before you write the data to it.
Janpot commented Feb 19, 2014
That was indeed my first quick fix. But I'm handing a duplex stream over to the users of my library as the argument of a function. This means I have no control over when they start writing to it or when they start reading from it. With what you propose, it means I have to require them to call

By the way, your documentation states the following.

According to that, there is no reason to use
Hmm, generally you don't write to a stream before it's piped. through is a very popular library that is 2 years old, and no one has complained about this before. The simplest way would be to just overwrite pipe on your particular stream: start off paused and resume when you call pipe. Or you can just use streams2, which it sounds like you are doing now. It sounds like you are actually creating a readable stream (does the user write to your stream, or just read from it?). I wrote a different base class for creating that: from
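A sketch of that suggestion (the helper name is mine, not part of through or event-stream): wrap a classic streams1-style stream so it starts paused and resumes itself the first time pipe() is called, which keeps data queued before piping from being dropped.

```js
// Hypothetical helper: start the stream paused, and resume it once
// pipe() attaches a destination.
function resumeOnPipe(stream) {
  stream.pause();
  var originalPipe = stream.pipe;
  stream.pipe = function (dest, opts) {
    var piped = originalPipe.call(this, dest, opts);
    this.resume(); // flush anything buffered while paused
    return piped;
  };
  return stream;
}
```

With this wrapper, something like `resumeOnPipe(es.through())` could be written to before it is piped without losing data, assuming the underlying stream buffers while paused (through does).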
Janpot commented Feb 19, 2014
It's a duplex stream. I'm writing a gulp plugin that parses html files and extracts build blocks that can be replaced by user-provided content. The user receives these blocks in a callback; they are represented as duplex streams. Read from one to get the current lines in the block; write to it to replace the content with your own (https://github.com/Janpot/gulp-htmlbuild/blob/streaming-api/lib/builder.js). But I have it working now. So if I get this right, this library is not streams2 compliant? Even though the 'through' README claims to be.
well, it's a correct streams1 stream; it's streams2 that is backwards compatible.
Janpot commented Feb 17, 2014

Trying to work with a stream of streams:

I would expect this to print 1 through 8, but I get nothing. Could this be an issue with the library, or am I thinking about this wrong?