[stream] support piping multiple streams into a single stream #93
Ah, interesting. I'm curious: what is your use case for piping multiple streams into a single stream? Also, do you mind editing the title to "[stream] support piping multiple streams into a single stream"?
For me this is a pretty common case when forking the output from a parser into separate pipelines (one per 'type' of record) and then saving to a database using a db-client singleton sink.
Another use case - piping data from multiple clients into one stream and then piping the stream back into each client, e.g. for chat rooms. |
this only works for object-mode streams, and only if the order of objects is not significant
+1 for making this explicit in the docs. Are we sure this change won't break existing code, though? Also, what vkurchatkin said.
Summoning @caineio |
This is interesting because it is going to be much more prevalent in situations where you have a stream hanging around that is potentially never ended. Streams work best when they are created, handle a specific flow of data, and are then disposed of. But I can see a use case for persistent streams with multiple sources of input for aggregation, in addition to multiple outputs (which does already work). Should this discussion be moved to https://github.com/isaacs/readable-stream?

The complexity of this issue is considerable, however, particularly if one readable stream is pushing data while the other is expected to be pulled via '.read()'.
Nope! Here is fine.
It's definitely worth looking into what happens to
hmm, interesting... because when a readable stream ends (via
Another interesting edge case is that we have the functionality to have a readable stream which can be ended but does not end its consumers. For instance, if we want to say 'this readable is staying open; it will continue to get data (and buffer it up), but everything consuming it right now must stop'. We can apply back-pressure, but that isn't really the same; we would have to know manually when that point was/is and unpipe individually.
imho the core streams should remain as basic as possible. for merging several streams into one there are already a few modules on npm, and merging might have different strategies, like unordered merge, merge with a function or merge in pipe-order etc. |
The main problem is that streams have been created for Node's particular use cases, and each of them has different constraints.

I think, in general, @trevnorris's idea of building low-level APIs wrapping the C/C++ libraries (libuv, V8, http-parser) means that we can start building abstractions on a simpler base, now that we know what iojs is supposed to do. The previous development has always seemed very organic, but with the lessons learnt from where we are now, better abstractions can be built on a simpler base, inevitably making the high-level APIs simpler as a result.

It is always tricky to rewrite the underlying foundations of software without compromising backwards compatibility, but I believe we are in a better place than ever, with the most talented individuals to accomplish the task!
@missinglink https://github.com/teambition/merge2 merges multiple streams into a single stream, in sequence or in parallel. I use it in gulp.
In my humble opinion: all streams are instances of EventEmitter, and as you know, with this implementation it is very hard to make everything work well. In order to solve this time-line problem fundamentally, you must employ the Functional Reactive Programming (FRP) paradigm. What is (functional) reactive programming? — and so on; read the link above. Node/io.js is for JavaScript, and JavaScript is a functional language.

Easy. An FRP stream can be operated on in any way by functional programming. Node/io.js is a back-end technology, and currently its fundamental data structure is the stream. On the other hand, on the front end, Facebook currently commits to FRP-based projects extremely actively: React, Immutable. As I said, Node/io.js is for JavaScript, and JavaScript is a functional language. Also, I already have some conceptual FRP code of my own. It's very powerful, with a very simple idea. Regards.
In the following example, sink should wait for all piped streams to unpipe() before calling _flush.

As it currently stands, the code outputs:

$ node test1.js
a
done

If you remove the // applyFix(); comment you get:

$ node test1.js
a
b
done

ref: #89