[encoding/csv] Make the columns option more permissive while parsing #3225
Comments
According to the spec https://csv-spec.org/ (4th point)
I don't expect JSON.parse to parse invalid JSON, and the same goes for the CSV parser. In my opinion, the user should preprocess the file to make it valid CSV before trying to parse it.
I agree with @sigmaSd. Throwing an error when trying to parse an invalid string is the correct and expected behavior.
Yeah, I'm with @sigmaSd and @timreichen. This isn't good and proper behavior when parsing files; if any data is invalid, the package should throw an error to warn the user that the data is invalid. This handling should be done before any parsing.
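The preprocessing approach suggested above could be sketched like this: pad short rows out to the expected column count before handing the data to a strict parser. This is a minimal illustration in plain TypeScript; `padRows` is a hypothetical helper, not part of the std library.

```typescript
// Pad short TSV rows with empty fields so every row has the same
// number of columns. Run this BEFORE a strict CSV/TSV parser.
function padRows(tsv: string, columnCount: number): string {
  return tsv
    .split("\n")
    .map((line) => {
      const cells = line.split("\t");
      while (cells.length < columnCount) cells.push("");
      return cells.join("\t");
    })
    .join("\n");
}

const input = "a\tb\tc\n1\t2"; // second row is missing a field
console.log(padRows(input, 3)); // "a\tb\tc\n1\t2\t"
```

After padding, every row has the expected field count, so a strict parser accepts the data without the parser itself having to be made permissive.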
Shouldn't a valid empty field parse to an empty string, rather than
I agree with the others here. The implementation shouldn't be retrofitted to work with incorrect data. Instead, the data needs to be correct. Thank you @sigmaSd, @timreichen and @luk3skyw4lker.
Describe the bug
If the `columns` option is given to the parser (or equivalently `skipFirstRow: true`) and a line doesn't contain the expected number of columns, an error is thrown and the whole process aborts. Sometimes files don't contain all the columns, especially without quotes and with a TSV separator, and it's OK and good practice to reduce the file size this way.
Steps to Reproduce
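The failure mode described above can be reproduced with a toy strict parser. This is an illustrative sketch only, not the actual std implementation; the `parseTsv` helper and its error message are hypothetical stand-ins for the std parser's behavior with the `columns` option.

```typescript
// Toy strict TSV parser: like a parser given an explicit column
// list, it throws when a row's field count doesn't match.
function parseTsv(
  input: string,
  columns: string[],
): Record<string, string>[] {
  return input.split("\n").map((line, i) => {
    const cells = line.split("\t");
    if (cells.length !== columns.length) {
      throw new Error(`wrong number of fields on line ${i + 1}`);
    }
    return Object.fromEntries(columns.map((c, j) => [c, cells[j]]));
  });
}

// A row with a missing trailing column aborts the whole parse:
try {
  parseTsv("1\t2\t3\n4\t5", ["a", "b", "c"]);
} catch (e) {
  console.log((e as Error).message); // "wrong number of fields on line 2"
}
```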
Expected behavior
Put `undefined` in the missing columns and don't throw!
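The lenient behavior requested here could look like the following sketch: missing trailing fields map to `undefined` instead of causing an error. This is a hypothetical alternative written in plain TypeScript, not the std library's actual behavior or API.

```typescript
// Lenient variant: indexing past the end of the split row yields
// `undefined`, so short rows parse without throwing.
function parseTsvLenient(
  input: string,
  columns: string[],
): Record<string, string | undefined>[] {
  return input.split("\n").map((line) => {
    const cells = line.split("\t");
    return Object.fromEntries(
      columns.map((c, j) => [c, cells[j]]), // cells[j] is undefined past the end
    );
  });
}

console.log(parseTsvLenient("1\t2\t3\n4\t5", ["a", "b", "c"]));
// [{ a: "1", b: "2", c: "3" }, { a: "4", b: "5", c: undefined }]
```

Whether this should be the default, an opt-in flag, or left to user preprocessing is exactly the trade-off debated in the comments above.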