
Improve performance #22

Closed
bodgit opened this issue May 2, 2022 · 0 comments · Fixed by #23
Assignees: bodgit
Labels: enhancement (New feature or request)

Comments

bodgit (Owner) commented May 2, 2022

Currently, every time a file within the archive is opened, a new copy of the decompressed stream is created and read from the beginning until the start of the file in question is reached. This incurs a performance hit that grows the deeper into the archive you go: to read the 10th file you have to read and discard the first 9 files, to read the 100th file you have to read past the first 99, and so on.

A performance improvement would be to keep the decompressed stream reader around and reuse it for any subsequent file that is further along in the archive (the stream can't seek backwards).

@bodgit bodgit added the enhancement New feature or request label May 2, 2022
@bodgit bodgit self-assigned this May 2, 2022