is buffering working while reading large csv-file (on Windows?) #16

Open
mkiessling89 opened this issue May 8, 2017 · 0 comments

Hi,
how is the buffer mechanism intended to work? Is it meant in terms of chunks that are buffered while reading continues, or is it actually the maximum file size that can be read?

I noticed a few things:
a) reading stopped partway through a larger file (>30,000 lines), without any hint that EOF was not reached. This was using "a-csv" via the npm module "read-csv-json" --> ... --> a-csv (on Windows 7).

b) the CLI tool "a-csv" seems to ignore the "-l" parameter; changing it does not make a difference.

c) after increasing the "default buffer" size to 256+1024 I was able to read my large file, but of course even larger files might fail again. Hence the question of how the "buffer" is intended to be used (see the sketch below).
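
For reference, here is a minimal sketch of what I would expect chunk-based buffering to look like. This is not a-csv's actual implementation, and all names and the default chunk size in it are my own assumptions: the file is streamed in fixed-size chunks and only the trailing partial line is carried over between chunks, so the buffer limits the length of a single line rather than the total file size.

```ts
import { createReadStream } from "fs";

// Hypothetical chunked CSV reader: memory use is bounded by the chunk size
// plus one partial line, regardless of how large the file is.
function readCsvChunked(
  path: string,
  onRow: (fields: string[]) => void,
  onEnd: () => void,
  chunkSize = 64 * 1024 // assumed default; a-csv's real default may differ
): void {
  let leftover = "";
  const stream = createReadStream(path, { encoding: "utf8", highWaterMark: chunkSize });

  stream.on("data", (chunk) => {
    // Prepend whatever was left from the previous chunk, then split into lines.
    const lines = (leftover + chunk.toString()).split(/\r?\n/);
    // The last element may be an incomplete line; keep it for the next chunk.
    leftover = lines.pop() ?? "";
    for (const line of lines) {
      // Naive field split; real CSV parsing also has to handle quoted commas.
      if (line.length > 0) onRow(line.split(","));
    }
  });

  stream.on("end", () => {
    if (leftover.length > 0) onRow(leftover.split(","));
    onEnd(); // EOF is reached no matter the file size
  });
}

// Usage: count rows of a large file without loading it into memory at once.
let count = 0;
readCsvChunked("large.csv", () => { count++; }, () => console.log(`${count} rows`));
```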

Thank you very much for your great tool!
Best Regards
Michael
