No way to read Deflate data in chunks. #269
Comments
https://github.com/nodeca/pako/blob/master/lib/deflate.js#L277 - you can override.
I guess it's just a matter of there not being an obvious way to achieve this without overriding default behavior, and whether that's something people think is worth adding to the base project.
I think it's better to keep things as is, to keep the API as simple as possible.
Sounds good, I'll just leave a link to our fork repo here in case anyone finds this issue in the future and wants an example of how to achieve this.
Unless I'm missing something, because Deflate.chunks is marked internal, there is no way to read all the compressed data that has been written/flushed so far and then clear the chunks array (which is needed when handling large objects that can't all be in memory at once). Currently you have to keep everything in memory and read from Deflate.result at the end.
My organization forked pako and added a flushResultBuffer() method to Deflate, which returns all the current chunks and then clears the chunks array. Curious whether that's something it'd make sense to PR back into pako proper, or if there's a way to read partial data that I've missed.
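The fork's method is described but not shown in this thread, so the following is only a hypothetical sketch of what such a flushResultBuffer() might look like. The class name, internals, and behavior here are assumptions modeled on the description above, not pako's public API; a minimal stand-in class is used so the sketch is self-contained:

```javascript
// Hypothetical sketch (NOT pako's API) of a flushResultBuffer() as
// described above: return everything compressed so far, then clear
// the internal chunks array so memory can be reclaimed between reads.
class DeflateLike {
  constructor() {
    this.chunks = []; // mirrors pako's internal chunk accumulator
  }
  onData(chunk) {
    // pako's default onData stores each chunk for later concatenation
    this.chunks.push(chunk);
  }
  flushResultBuffer() {
    const out = this.chunks; // hand back all chunks produced so far
    this.chunks = [];        // reset so future data starts fresh
    return out;
  }
}

const d = new DeflateLike();
d.onData(Uint8Array.of(1, 2, 3));
const flushed = d.flushResultBuffer();
console.log(flushed.length, d.chunks.length); // prints '1 0'
```

The design trade-off discussed in the thread is visible here: the method is trivial, but exposing it makes the internal chunks array part of the public contract, which is what the maintainer preferred to avoid.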