How can I read N records at a time? #102
I think this would be fixed when I find the time to move to the new stream API introduced by Node 0.10.x. That isn't for tomorrow. The correct method for now is to use the transform function which, when called with a callback argument and the "parallel" option set to 1, should run sequentially. Here's what it looks like. This hasn't been tested, but it should work. Reopen the issue if I'm wrong. |
The example doesn't have a log() statement for the records, so as it stands it doesn't really work. I've tried various combinations, but I still can't get it to read records on demand. Here's a gist. Node makes really basic things, like reading lines from a file, super complicated. In any other language, you can simply say:
Why can't we have a simple API like that? |
@wdavidw: Looks like the solution above isn't working, but I don't have issue management privileges, so I can't reopen it. Can you please check? |
I will take the time to answer in the next few days, with a tested example |
Dan, I don't understand what you mean by log in "the example doesn't have a log() statement". I've tested my example, replacing process.nextTick with setTimeout; it works as expected. You could easily replace the setTimeout with an HTTP request, for example. csv()
.from("toto.csv", { columns: true })
.transform(function(record, index, callback){
console.log(record);
setTimeout(function(){
callback();
}, 1000);
}, {parallel: 1})
.on("end", function () {
console.log("done");
});
Relative to your fetch function, remember that other languages are not asynchronous; that changes a few things, since in that case you don't need to throttle/backpressure your iteration. Anyway, this wasn't the Node.js way until version 0.10.x. There is now a "read" function with the new stream API which does just that, but the csv parser is still based on the earlier stream specification. |
Is this being maintained? This still doesn't appear to be working and the issue is still open. I'm doing something similar but it's not working with pipes or based on the stream example. |
This issue should be closed. You'll need to either write your own implementation of stream.Transform or use an existing one (I provide stream-transform if you wish to try it). |
What's the point of having a .transform if you aren't even supporting it? It might as well be removed as functionality if it doesn't do what we expect, or what you intend by the examples. |
The code from this issue is using a legacy API that is no longer supported. Look at the docs, the tests, or this sample to see how to use transform. |
Try select-csv: It converts .csv files into an array and even into lines.
|
I'm trying to parse the records in a CSV file one by one (those records are sent to the client for display, on request). How should that be done?
I tried using pause() on the object returned by csv(), but it seemed that the buffers were large and reading individual records wasn't possible. An example would be great. A next(N) method would also be very useful. Here's what I've tried: