Best pattern to prepend and append element to a series while streaming? #56

giacecco commented Apr 16, 2014

I am using event-stream and request to convert a >2 GB file of records from one JSON 'schema' to another, to bulk upload to CouchDB. The code looks like:

The problem is that, for the above to work, I need to wrap all of the output in another JSON object, so that every original item becomes a record in an array called 'docs'. In other words, I need to prepend { "docs": [ and append ] }, but I have no idea how to do that. I know I could load the whole file into memory, but that is not good practice for something this big.

How can I achieve that with event-stream? Thanks,

Giacecco
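For illustration only, a pipeline along these lines could look roughly like the sketch below; the input file name, the record mapping, and the CouchDB URL are all hypothetical, and the request body it produces is still a bare sequence of objects rather than the { "docs": [ ... ] } envelope that _bulk_docs expects:

```js
var fs = require('fs');
var es = require('event-stream');
var request = require('request');

fs.createReadStream('records.ldjson')   // hypothetical >2 GB input, one JSON record per line
  .pipe(es.split())                     // split the stream into lines
  .pipe(es.parse())                     // JSON.parse each line
  .pipe(es.map(function (oldDoc, callback) {
    // hypothetical conversion from the old 'schema' to the new one
    callback(null, JSON.stringify({ _id: oldDoc.id, payload: oldDoc }) + '\n');
  }))
  // the body streamed here is not yet wrapped in { "docs": [ ... ] },
  // which is exactly the problem described above
  .pipe(request.post({
    url: 'http://localhost:5984/mydb/_bulk_docs',
    headers: { 'content-type': 'application/json' }
  }));
```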
Comments
dominictarr commented:

It would be easy with through: just make a through stream that re-emits all the input, but emits one special thing first and one last. You may also want to look at the code for
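A minimal sketch of that suggestion, assuming the through module; the wrapDocs helper, the file names, and writing to a local docs.json are illustrative rather than code from this thread. The wrapper queues the opening { "docs": [ before the first record, a comma between records, and the closing ] } when the input ends:

```js
var fs = require('fs');
var es = require('event-stream');
var through = require('through');

// Illustrative wrapper: prepend '{ "docs": [', separate records with commas,
// and append '] }' once the upstream source has ended.
function wrapDocs() {
  var first = true;
  return through(
    function write(record) {
      this.queue(first ? '{ "docs": [\n' : ',\n');
      first = false;
      this.queue(JSON.stringify(record));
    },
    function end() {
      // also covers the empty-input case, emitting '{ "docs": [ ] }'
      this.queue(first ? '{ "docs": [\n] }\n' : '\n] }\n');
      this.queue(null); // signal end of stream
    }
  );
}

// Hypothetical usage: wrap the converted records before writing or uploading them.
fs.createReadStream('records.ldjson')
  .pipe(es.split())
  .pipe(es.parse())
  .pipe(wrapDocs())
  .pipe(fs.createWriteStream('docs.json'));
```

Emitting the separators from inside write and end keeps memory flat, since only one record is held at a time rather than the whole >2 GB file.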
giacecco commented Apr 16, 2014
Thanks, I have solved the problem this way instead:

... but I can't get myself to like this. Then I bumped into other issues caused by the size of the file and had to focus on other stuff :-( G.
giacecco closed this Apr 16, 2014
dominictarr commented:

That works.
dominictarr reopened this Apr 17, 2014
dominictarr closed this Apr 17, 2014