A 19kb record throws CsvRecordTooLargeException #217
Comments
The default buffer size is 16k. You can set the buffer-size options to a larger value; this will allow the reader to potentially process records up to 2GB.
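A sketch of how that might look in code (the property names `BufferSize`/`MaxBufferLength` and the `Create` overload are assumptions inferred from the thread, not verified against the library):

```csharp
using Sylvan.Data.Csv;

// Assumed option names — verify against the Sylvan.Data.Csv docs.
// Raising the maximum buffer length lets oversized records be read
// instead of throwing CsvRecordTooLargeException.
var options = new CsvDataReaderOptions
{
    MaxBufferLength = 0x1000000 // 16MB cap; can in principle grow toward int.MaxValue (~2GB)
};

using var csv = CsvDataReader.Create("export.csv", options); // illustrative file name
while (csv.Read())
{
    var html = csv.GetString(1); // multi-megabyte HTML column
}
```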
@MarkPflug I don't know where else to thank you. A client requested a data export of their database; certain columns are megabytes in size, with pasted HTML-formatted text (hence this issue). I've been importing/exporting CSV files since... SQL Server 7? and I swear the usual tools are either getting worse, or I simply could not produce the output I wanted. Your library successfully handles large text fields that may contain commas, line breaks, or quoted text, and it doesn't unnecessarily quote all of the non-string columns! I tried PowerShell's Export-Csv, bcp, CsvHelper, FileHelpers, SSIS, SQL exports, and on and on. From the fastest-net-csv-parsers list I started going through libraries one by one. I didn't need a parser, just a writer. I spent two days trying to get this simple task done, and when I switched to your library it was done in about two hours. I've seen your projects for years but never tried them. Glad I did! Thank you so much for this library.
@picasso566 Thanks for the kind words! Glad you've found the library useful.
Hi,
A record in my CSV is failing with CsvRecordTooLargeException
Putting the record in question in its own file shows that the record is only 19kb. If I reduce the column data in this record to under 9kb, the error disappears. My code is straightforward.
Am I running into a buffer limit per record, or per column within a record? I'm not sure how to debug this further and would appreciate any assistance.
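For context, a minimal sketch of the kind of read loop involved (the file name and column index are illustrative, and the `CsvDataReader.Create` shape is assumed rather than taken from this thread):

```csharp
using Sylvan.Data.Csv;

// Hypothetical repro: the second column holds a ~19kb value.
// With the default 16k buffer, reading this record may throw
// CsvRecordTooLargeException.
using var csv = CsvDataReader.Create("record.csv"); // illustrative file name
while (csv.Read())
{
    var value = csv.GetString(1); // large text column
}
```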