[BUG] CSV Streams severely limited in capacity #784

Open

Komefumi opened this issue Nov 26, 2022 · 2 comments

@Komefumi

Describe the bug
A CSV stream hits a hard limit on how much data it can hold. This becomes a problem when the stream needs to buffer a large amount of string content.

For instance, consider running

await csvStream.write(stringContent);

in a loop that generates the full content of a CSV file, which might well be a few megabytes in size.

The cutoff hits very early.
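
To make that concrete, here is a minimal sketch of the pattern in question (the use of fast-csv's format() and the row shape are assumptions for illustration):

const { format } = require('fast-csv');

const csvStream = format({ headers: true });

// Nothing consumes csvStream here, so its internal buffer fills up
// and the stream stops pulling data once the high-water mark is reached.
for (let i = 0; i < 100000; i++) {
  // write() returns a boolean (the standard backpressure signal),
  // not a promise, so awaiting it does not wait for the buffer to drain.
  csvStream.write({ id: i, value: 'some long string content' });
}
csvStream.end();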

In these situations, the following has been shown to work:

let csvContent = '';
// Attaching a 'data' listener switches the stream into flowing mode,
// so its internal buffer drains as rows are written.
csvStream.on('data', (row) => {
  csvContent += row;
});

Having a limit on the CSV stream is understandable, but I couldn't find anything about it in the documentation.
I think it would be sufficient to mention this caveat and suggest a solution (such as the one above) for these circumstances.
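
A pipe-based variant of the same idea should also work (this is plain Node.js stream wiring, not anything fast-csv-specific): attach a destination before writing, so the stream drains as it goes. The file name and row shape below are made up for illustration:

const fs = require('fs');
const { format } = require('fast-csv');

const csvStream = format({ headers: true });

// Piping attaches a consumer immediately, so rows flow out to the
// file as they are written instead of piling up in the stream's buffer.
csvStream.pipe(fs.createWriteStream('output.csv'));

for (let i = 0; i < 100000; i++) {
  csvStream.write({ id: i, value: 'row content' });
}
csvStream.end();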

@pahund

pahund commented Aug 9, 2024

@zxramozx @doug-martin @dustinsmith1024

I can reproduce this issue. It seems it is not possible to create CSV files bigger than 64 kilobytes.

Please take a look at this demo script to reproduce: https://github.com/pahund/fast-csv-bug
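
If this is ordinary stream backpressure at work (which the 64-kilobyte figure suggests, though I haven't verified it against fast-csv internals), the standard remedy is to honor write()'s return value and wait for 'drain'. A sketch of that remedy, where writeRows is a hypothetical helper rather than part of fast-csv:

const fs = require('fs');
const { once } = require('events');
const { format } = require('fast-csv');

async function writeRows(rows, path) {
  const csvStream = format({ headers: true });
  csvStream.pipe(fs.createWriteStream(path));

  for (const row of rows) {
    // write() returns false once the internal buffer is full;
    // waiting for 'drain' keeps memory usage bounded.
    if (!csvStream.write(row)) {
      await once(csvStream, 'drain');
    }
  }
  csvStream.end();
  // 'finish' fires once all writes have been flushed through the stream.
  await once(csvStream, 'finish');
}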
