Streaming decryption fails on files larger than 2GiB #1012
Comments
Hey 👋 This is a limitation of Chrome / V8; it has a max Uint8Array size of 2GiB. It's possible that you could work around this by using streaming decryption: https://github.com/openpgpjs/openpgpjs/releases/tag/v4.0.0.
We found a workaround by decrypting the larger files in bash, but it's still not a clean fix. I tried switching to streaming both the encrypted input file and the decrypted output stream, but it still yields the same error.
@ama4tf For the streaming solution, please see https://github.com/openpgpjs/openpgpjs/releases/tag/v4.0.0, specifically "Streaming CFB". I think it should be possible to stream files larger than 2GB, although I haven't tried it.
I tried Streaming CFB and that did yield a different kind of error, so still not successful, but maybe progress. The traceback on the error looks like this: `at Object.concat (\node_modules\openpgp\dist\openpgp.js:24175:23)`
OK yeah, this is a bug in our streaming implementation that we should fix.
Hey @lekhacman, yes of course, if you want to take a stab at it, feel free. AFAIK, the root cause of this error is this line: openpgpjs/src/packet/packet.js, line 272 in 25118c3.
There is a long comment above it explaining what it does and why. Under normal circumstances, I don't think this line should be reading 2GB even if the message is larger than 2GB. The first step to fixing this is probably to find out why it does so. I would recommend working from the v5 branch, and if you do find a fix, sending a PR there. Otherwise this may lead to a lot of rebasing work in the future. Good luck!
@twiss thank you for the hint :) |
@lekhacman Did you make any strides while working on this? Thank you for your time.
@fluxquantum I'm still reading and trying to understand the structure of the library, which is difficult.
@lekhacman Really appreciate the quick follow-up. May I know which library you are using to perform the streaming encryption/decryption? To share some more context: in my case there's additional complexity, because I have a Java app performing the encryption and JavaScript performing the decryption. I'm hoping the Java solution can be compatible with the JavaScript approach. Thanks again for the guidance.
Hi, I have the same problem using v5 and streams. Here is an example of code which fails with an error:

```js
const streamRead = fs.createReadStream('bigFile.pgp');
const privateKey = await openpgp.readPrivateKey({ armoredKey: '...' });
const message = await openpgp.readMessage({ binaryMessage: streamRead });
const streamDecrypted = await openpgp.decrypt({
  message,
  decryptionKeys: privateKey,
  format: 'binary'
});
```

"bigFile.pgp" is a random 10GB file that has been encrypted with ECC/curve25519, but the problem seems to be the same with RSA and not related to the encryption type. During decryption we can see the Node.js process eating more and more RAM. I made some debugging guesses and it seems that this call eats the memory: line 131 in 6da1c53.

The thing is, I don't understand how the stream is handled here. Is it? Note that encryption is working as expected :) Hope this information will help.
Hi @Bacto,
Have you already tried passing […]?
@larabr I did it and... it works!! Thank you so much :) |
Closing, assuming this is fixed then. @ama4tf let us know if you still have issues when using the […].
Running decrypt on large files fails due to an array overrun. This was generated by calling openpgp.decrypt with an encrypted file stream 2.1GB in length, a private key, and the binary format. The failure message is below:

`Error decrypting message: Invalid typed array length: 2322369312`
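The reported length lines up with the 2 GiB `Uint8Array` ceiling mentioned earlier in the thread: the failing allocation is larger than 2^31 bytes, so a single typed array for the whole plaintext cannot be created. A quick sanity check:

```javascript
// Compare the failing allocation from the error message against the
// 2 GiB (2**31 bytes) limit cited earlier in this thread.
const failingLength = 2322369312; // from "Invalid typed array length"
const twoGiB = 2 ** 31;           // 2147483648 bytes
console.log(failingLength > twoGiB); // true: the 2.1GB file overflows the limit
```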