heap out of memory with large file #485
That sounds abnormal. I take it you are using compression, given the size differences. Can you test it without compression?
Against an uncompressed ~65 MB file I get the same error.
Just in case it's my code...
That's really bizarre. I'll try to take a look soon.
Any update on this?
Hi Ben, I reproduced your problem. The issue is actually the conversion to a JS string at the end. Building strings in JS has always been a problem with large amounts of data: strings are immutable and can't be pre-allocated, so building one character by character allocates a new string for every character you add. The new Encoding API might solve this (I haven't checked), but we don't use it at the moment.

Fortunately, this usually doesn't matter for files, because we usually don't want to convert them to strings anyway. Use the 'format: "binary"' option and you'll get a Uint8Array instead of a string, and you should have no problem. I just tested it with a 300 MB file.
@idris I looked at the Uint8Array-to-string conversion function and was able to optimize it a bit for large strings. V8/Node seems to have a limit of 250 MB or so on string size, but under that it should work for you. You do pay a bit of a performance penalty for the string conversion, but it's not that bad (8 s for a 200 MB file on my laptop). I'm going to commit and deploy this fix as version 2.5.2.
@bartbutler I'm running into the same problem. I need it in a string because I'm then parsing the contents of the file. Is there an easy way to use a
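The thread doesn't show the actual optimization, but a common technique for fast Uint8Array-to-string conversion is chunked `String.fromCharCode.apply`, which trades one string allocation per character for one per chunk. A hedged sketch (the helper name `uint8ToString` is hypothetical, not OpenPGP.js API):

```javascript
// Hypothetical sketch of chunked Uint8Array-to-string conversion for
// Latin-1/byte data. Passing many code units to String.fromCharCode at
// once avoids allocating a new string per character; chunking keeps the
// argument count below engine limits on apply().
function uint8ToString(bytes, chunkSize = 0x8000) {
  const parts = [];
  for (let i = 0; i < bytes.length; i += chunkSize) {
    parts.push(String.fromCharCode.apply(null, bytes.subarray(i, i + chunkSize)));
  }
  return parts.join('');
}

console.log(uint8ToString(new Uint8Array([72, 105]))); // "Hi"
```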
@bartbutler this seems to have worked. Thanks!
@idris @bartbutler Can anyone point me to the piece of code that solved this issue? In 2020 I am still facing the same issue with the exact same piece of code @idris shared.
@irfaan008 OpenPGP.js has a streaming API now, which should consume much less memory:

```js
(async () => {
  const readableStream = fs.createReadStream('encrypted.gpg');
  const writableStream = fs.createWriteStream('decrypted.txt');
  const decrypted = await openpgp.decrypt({
    message: await openpgp.message.read(readableStream),
    publicKeys: publicKeyObj,
    privateKeys: privateKeyObj,
    format: 'binary'
  });
  const plaintext = decrypted.data;
  plaintext.pipe(writableStream);
})();
```

Edit: also, 4.10.2 seems to have increased memory usage quite a bit. I'll release a fix for 4.10.3.
I am trying to decrypt a file that is ~5 MB encrypted and ~65 MB decrypted from Node.js. If I do not increase Node's memory limit to 2048 MB, the result is:
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
Is there a way to use a stream to avoid such a high memory requirement?