Extreme memory consumption when restarting torrent #441
@ngjermundshaug Thanks for the kind words, and sorry for not replying sooner. This issue was fixed on Jan 9, 2016 in 664eb30, released in v0.72.1. Previously, we'd try hashing every piece in the torrent in parallel, which is usually ~1000 pieces per torrent. Now, the concurrency is limited to the number of cores on the machine. Please let me know if this improves the situation!
This thread has been automatically locked because it has not had recent activity. To discuss further, please open a new issue.
Hi
First off - awesome library! Love it!
One challenge though: when I stop and restart the download of a torrent, the memory consumption goes haywire. I've tested this with a torrent containing 2172 files - 9 GB total, 1 MB piece size.
It loads a whopping 9 GB into memory. I'm guessing this is related to rechecking all the pieces. I'm running this in NW.js, so the UI freezes while this is happening. After about 10 seconds, the memory consumption returns to normal and the UI is responsive again.
Same thing happens when I restart the program and resume the download of a torrent that is not finished - which has to be rechecked.
How should I get around this issue?
Can you make rechecking pieces/hashing more memory friendly?