large files not converting #64
I am having the same issue when using streams. I also found where the problem is: it happens for the last chunk, which happens to be smaller than the last stored JSON part.
@m-prof are you using the stream feature from jsonexport?
I am just using the basic command listed on the main page (http://kauegimenes.github.io/jsonexport/). For reference, I attach a sample file.
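For context, the conversion being discussed has roughly this shape. This is a generic stdlib sketch of flattening an array of flat JSON objects into CSV, not jsonexport's implementation; `toCsv` is a hypothetical helper named here for illustration:

```javascript
// Generic sketch (not jsonexport's code): turn an array of flat JSON
// objects into a CSV string, quoting values that contain commas,
// quotes, or newlines.
function toCsv(rows) {
  // Union of all keys across rows becomes the header row.
  const headers = [...new Set(rows.flatMap((r) => Object.keys(r)))];
  const escape = (v) => {
    const s = v === undefined || v === null ? '' : String(v);
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const lines = [headers.join(',')];
  for (const row of rows) {
    lines.push(headers.map((h) => escape(row[h])).join(','));
  }
  return lines.join('\n');
}

module.exports = { toCsv };
```

The in-memory version above has no chunking, which is why it works for small files; streaming versions that process the input in pieces are where the last-chunk problem can appear.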
@m-prof can you try again using the branch
Thanks! But I am still getting an empty file. You can confirm with the sample file I sent.
@m-prof I did test with the sample you provided and got a CSV with 322 lines. Try this:

```
git checkout bugfix/stream-memory-limit
./bin/jsonexport.js sample.json output.csv
```
OK, good, it is something on my end! Are you doing this within Windows 10? I am not sure how to execute those commands you mentioned within Win 10 (e.g. if try
@m-prof can you try using
@m-prof feel free to reopen this issue if this fix does not work for you. |
Thanks so much kaue! This works now for me, for all files I checked. I have a speed question which I am posting in a new thread: #75
I have a series of JSON files which are only about 100 KB (or greater) in size, but when I try to convert them I get an empty file. Smaller files convert fine.
This is from within a Windows 10 environment.