
FATAL ERROR: JS Allocation failed - process out of memory #4

Closed
max-mapper opened this issue Dec 22, 2013 · 12 comments

Comments

@max-mapper (Collaborator)

When doing

tar.pack('folder-a').pipe(tar.extract('folder-b'))

where the contents of folder-a are over roughly 1 GB, I get:

FATAL ERROR: JS Allocation failed - process out of memory
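
(For reference, a self-contained version of the snippet above, assuming tar-fs is the module in use, as confirmed in the comments below; the folder names are placeholders.)

var tar = require('tar-fs')

// Pack folder-a into a tar stream and unpack it straight into folder-b.
tar.pack('folder-a').pipe(tar.extract('folder-b'))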

@mafintosh (Owner)

The API seems to indicate that this was using tar-fs and not tar-stream, right?

@max-mapper (Collaborator, Author)

Sorry, yeah, wrong repo.

It's worth noting that this equivalent code doesn't run out of memory and completes, but it takes longer than I think tar-stream would take if it wasn't leaking:

var tar = require('tar')        // node-tar
var fstream = require('fstream')

// Read the source directory with fstream, pack it with node-tar,
// and extract it into the destination directory.
var reader = fstream.Reader({type: 'Directory', path: '/Users/max/Downloads/audiobooks/'})
var pack = tar.Pack()
reader.pipe(pack).pipe(tar.Extract({path: '/Users/max/Downloads/audiobooks2'}))

@mafintosh (Owner)

Which versions of tar-stream and tar-fs are you using? Also which version of node?

@max-mapper (Collaborator, Author)

tar-fs@0.1.1, tar-stream@0.2.1, node@0.10.23

@mafintosh (Owner)

Using the latest versions I just copied a 3.7 GB directory using tar-fs in 7.5 seconds on my MacBook with no apparent leak. Using node-tar and the snippet you linked, it still hasn't finished (it has been running for 5+ minutes now).
I'll try to see if I can reproduce the issue...

@mafintosh (Owner)

Do you have any symlinks in the folder you are trying to copy?

@max-mapper (Collaborator, Author)

Hmmm, no symlinks. I first tried copying a 10 GB folder full of MP3s, then a subfolder of about 1 GB of MP3s. Maybe it is related to the quantity of files?


@mafintosh (Owner)

Did the folder contain a lot of sub folders?

@max-mapper (Collaborator, Author)

One of the folders that crashes every time: 338 documents, 27 folders, 4,324,597,805 bytes (4.33 GB on disk).

@mafintosh (Owner)

I think I've found (and solved) the issue. It was a bug triggered by a long filename (>100 characters) that caused it to explode. It should be fixed now. Could you upgrade tar-fs to 0.1.2 and try again?
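
(For anyone hitting the same crash, here is a rough sketch of how you could scan a directory for entry names longer than the 100 characters mentioned above; the helper name and the root path are only for illustration.)

var fs = require('fs')
var path = require('path')

// Recursively collect relative paths longer than 100 characters,
// the case that triggered the bug described above.
function findLongNames (root) {
  var long = []
  ;(function walk (dir) {
    fs.readdirSync(dir).forEach(function (name) {
      var abs = path.join(dir, name)
      var rel = path.relative(root, abs)
      if (rel.length > 100) long.push(rel)
      if (fs.statSync(abs).isDirectory()) walk(abs)
    })
  })(root)
  return long
}

console.log(findLongNames('/Users/max/Downloads/audiobooks'))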

@max-mapper (Collaborator, Author)

Nice, it copies 485 documents, 25 folders, 9,279,350,569 bytes (9.28 GB on disk) in 38,423 ms, and 338 documents, 27 folders, 4,324,597,805 bytes (4.33 GB on disk) in 17,659 ms.
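
(A rough sketch of how timings like these could be produced, assuming tar-fs 0.1.2 or later; the paths are placeholders, and the 'finish' event comes from the extract side being a standard writable stream.)

var tar = require('tar-fs')

var start = Date.now()

// Copy the directory and report the elapsed time once extraction finishes.
tar.pack('/Users/max/Downloads/audiobooks')
  .pipe(tar.extract('/Users/max/Downloads/audiobooks2'))
  .on('finish', function () {
    console.log('copied in ' + (Date.now() - start) + 'ms')
  })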

@mafintosh (Owner)

Nice! How long does it take to do it with node-tar?
