FATAL ERROR: JS Allocation failed - process out of memory #4
The API seems to indicate that this was using tar-fs and not tar-stream, right?
Sorry, yeah, wrong repo. It's worth noting that this equivalent code doesn't run out of memory and completes, but takes longer than I'd expect:

```javascript
var tar = require('tar')
var fstream = require('fstream')

var reader = fstream.Reader({type: "Directory", path: '/Users/max/Downloads/audiobooks/'})
var pack = tar.Pack()

reader.pipe(pack).pipe(tar.Extract({path: '/Users/max/Downloads/audiobooks2'}))
```
Which versions of tar-stream and tar-fs are you using? Also, which version of node?
Using the latest versions I just copied a 3.7 GB directory using tar-fs in 7.5 seconds on my MacBook with no apparent leak. Using node-tar and the snippet you linked, it still hasn't finished (been running for 5+ min now).
Do you have any symlinks in the folder you are trying to copy?
Hmmm, no symlinks. I first tried copying a 10 GB folder full of mp3s, then a

On Sunday, December 22, 2013, Mathias Buus wrote:
Did the folder contain a lot of subfolders?
One of the folders that crashes every time: 338 documents, 27 folders, 4,324,597,805 bytes (4.33 GB on disk).
I think I've found (and solved) the issue. It was due to a bug triggered by a long filename (>100 characters) that caused it to explode. It should be fixed now. Could you upgrade tar-fs to 0.1.2 and try again?
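The 100-character threshold isn't arbitrary: the classic ustar tar header reserves exactly 100 bytes for the entry name, and longer names have to be carried in a GNU longname or PAX extension entry instead, which is the kind of code path a packer bug can hide in. A minimal stdlib-only illustration of that limit (hypothetical helper and paths, not tar-fs code):

```javascript
// The ustar format uses fixed 512-byte headers; bytes 0-99 hold the entry
// name. Names longer than 100 bytes need an extension record, so they
// exercise different packing logic than short names do.
const HEADER_SIZE = 512
const NAME_FIELD = 100

// Hypothetical helper: does this name fit in a plain ustar header?
function fitsInUstarName(name) {
  return Buffer.byteLength(name) <= NAME_FIELD
}

const shortName = 'audiobooks/track01.mp3'
const longName = 'audiobooks/' + 'x'.repeat(120) + '.mp3'

console.log(fitsInUstarName(shortName)) // true
console.log(fitsInUstarName(longName))  // false
```

Note that the limit is in bytes, not characters, so multi-byte UTF-8 names can hit it sooner than their character count suggests.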
Nice, it copies now.
Nice! How long does it take with node-tar?
When doing

where the contents of `folder-a` is over roughly 1 GB, I get:

FATAL ERROR: JS Allocation failed - process out of memory