I have a massive CSV file (54 GB, ~2.7 billion records) that I'm trying to process in chunks using `chunksize`. The CSV reader stopped before hitting the end of the file. Upon further investigation, I found that it stopped after reading 2147483648 rows, which is 2**31. I assume there's a signed 32-bit integer overflow happening somewhere inside the Cython code for `read_csv`.
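A minimal sketch of the chunked-reading pattern described above, assuming a hypothetical helper name `count_rows_in_chunks` (not from the original report). Summing the chunk lengths gives the total rows actually delivered; if the parser's internal row counter overflows a signed 32-bit integer, this total would silently cap out near 2**31 instead of matching the file's true row count.

```python
import pandas as pd

def count_rows_in_chunks(path, chunksize=1_000_000):
    """Read a CSV in chunks and return the total number of data rows seen.

    If the reader stops early (e.g. due to a 32-bit row-counter overflow),
    the returned total will be smaller than the file's true row count.
    """
    total = 0
    for chunk in pd.read_csv(path, chunksize=chunksize):
        total += len(chunk)
    return total
```

Comparing this total against an independent line count (e.g. `wc -l` minus the header) is one way to confirm the reader stopped early rather than the file simply ending.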
The text was updated successfully, but these errors were encountered: