CSV appender for writing big files #18
Please provide a complete and executable example (main method).
Okay, there is no need. I have found my problem: I was using appendLine to append more than one line at once. In my example, array[] has these contents:
In the end, the array could have 500 "lines of lines", when, if I'm not wrong, appendLine is meant to append only a single line, something like:
Is there a way to append N lines without closing the file? We need to process one big file line by line, and my idea was to write out this buffer (an array of 500 lines) to avoid holding the whole file in memory. Thanks
You can use appendLine() or a combination of one (or more) appendField() calls followed by endLine() to append data to the output. The output (e.g. the file) is closed when the CsvAppender itself is closed, either by calling close() explicitly or implicitly via try-with-resources.
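The two styles that answer describes can be sketched with a small stand-in appender built on plain java.io. Note that SimpleCsvAppender and its method bodies here are illustrative, not FastCSV's actual implementation; only the method names appendField/endLine/appendLine mirror the API discussed above, and this sketch does no quoting or escaping of fields.

```java
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;

// Hypothetical stand-in illustrating the appendLine vs. appendField/endLine
// semantics described in the answer above; NOT FastCSV's real class.
class SimpleCsvAppender implements AutoCloseable {
    private final Writer out;
    private boolean firstFieldInLine = true;

    SimpleCsvAppender(Writer out) { this.out = out; }

    // Append a single field to the current record.
    void appendField(String field) throws IOException {
        if (!firstFieldInLine) out.write(',');
        out.write(field);
        firstFieldInLine = false;
    }

    // Terminate the current record with a line break.
    void endLine() throws IOException {
        out.write('\n');
        firstFieldInLine = true;
    }

    // Convenience: append the fields of exactly ONE record, then end it.
    void appendLine(String... fields) throws IOException {
        for (String f : fields) appendField(f);
        endLine();
    }

    @Override public void close() throws IOException { out.close(); }
}

public class AppenderDemo {
    public static void main(String[] args) throws IOException {
        StringWriter sw = new StringWriter();
        // try-with-resources closes the appender (and its output) at the end,
        // so any number of records can be appended before that.
        try (SimpleCsvAppender appender = new SimpleCsvAppender(sw)) {
            appender.appendLine("a", "b", "c"); // one whole record per call
            appender.appendField("1");          // or field by field...
            appender.appendField("2");
            appender.endLine();                 // ...then end the record
        }
        System.out.print(sw); // a,b,c / 1,2
    }
}
```

Calling appendLine once per record (not once for the whole buffer) is the key point: each call ends exactly one line, matching the mistake diagnosed in the comment above.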
It happens to me too. The strange thing is that it happens when I append a field with a Double, but not with an Integer.
I want to create a big CSV file using the CSV appender. I'm using this code:
where writeBuffer is a List<List>. This buffer can hold more than 500 lines.
When the processing finishes, the resulting file has only 148 lines, and the last one is incomplete.
I have also tried flushing every 100 lines, but then the following lines are not written.
Am I perhaps using the library in an incorrect way?
Thanks in advance.