I am getting an error when reading a file larger than ~2 GB:

```
fs.js:319
    throw new ERR_FS_FILE_TOO_LARGE(size);
    ^
RangeError [ERR_FS_FILE_TOO_LARGE]: File size (6369042087) is greater than possible Buffer: 2147483647 bytes
```

After some research, I see this is a Buffer limitation in Node. However, I do not have this issue with tippecanoe, so I am assuming it reads the file as a stream, along the lines of https://stackoverflow.com/a/44994896. I am doing clustering on a national dataset, so splitting the file seems less than ideal, as I would assume it would break the clustering, unless I am missing something. Thank you.