Reduce memory usage #248
Comments
This is not supposed to happen. In node, webtorrent uses your system's tmp folder to store pieces on disk and only reads them back into memory when a peer requests them. After that, they should be garbage collected. The browser is another story: right now we store the file entirely in memory in the browser because I haven't investigated what storage to use yet (it's not a top priority at the moment).

Could you send a PR with your changes? We can definitely do a better job of setting references to null.

FWIW, I've streamed video files up to 5 GB without problems. What version of node/iojs are you using? Platform? Version of webtorrent?
For the record: I observed increasing memory usage too. Maybe it's the storage layer not freeing pieces after validation.
I use . In the original webtorrent I cannot see any memory being freed; the node instance keeps getting bigger and bigger.
Perhaps
I think what the project needs is a clean abstraction interface for file operations. I have looked at fs-storage.js under the lib directory, but it seems rather complicated; _onPieceDone deals with a piece object directly, and so on. What is needed is a simple C-level interface for open, read, write, seek and similar operations. Then one can simply implement various backends to store the files. Perhaps such an interface already exists, but I couldn't find it. I think https://github.com/filerjs/filer is great for handling storage on the browser side, but a proper file IO interface is needed first. Instead of fs-storage, another module would work out which piece lands at which offset in which file and call the appropriate functions of the file IO interface.
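To make the suggestion concrete, here is one possible shape for such an interface, sketched in plain JavaScript. Everything here (FileBackend, MemoryBackend, the method signatures) is hypothetical; nothing like it exists in webtorrent today. A filer.js-backed browser backend or a node fs backend would implement the same five methods.

```javascript
// Abstract "C level" file IO interface: backends override these.
class FileBackend {
  open (name, cb) { cb(new Error('not implemented')) }
  read (fd, offset, length, cb) { cb(new Error('not implemented')) }
  write (fd, offset, buf, cb) { cb(new Error('not implemented')) }
  close (fd, cb) { cb(new Error('not implemented')) }
}

// Minimal in-memory implementation, handy for tests.
class MemoryBackend extends FileBackend {
  constructor () { super(); this.files = new Map() }
  open (name, cb) {
    if (!this.files.has(name)) this.files.set(name, Buffer.alloc(0))
    cb(null, name) // use the name itself as the "fd"
  }
  write (fd, offset, buf, cb) {
    const old = this.files.get(fd)
    // Grow the backing buffer if the write extends past the end.
    const grown = Buffer.alloc(Math.max(old.length, offset + buf.length))
    old.copy(grown)
    buf.copy(grown, offset)
    this.files.set(fd, grown)
    cb(null)
  }
  read (fd, offset, length, cb) {
    cb(null, this.files.get(fd).subarray(offset, offset + length))
  }
  close (fd, cb) { cb(null) }
}
```

A piece-mapping module on top of this would translate (pieceIndex, offset) into (file, offset) pairs and call the backend, as the comment suggests.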
One should simply be able to configure the storage backend at runtime, via a setStorageBackend call on the WebTorrent object. MemoryStorageBackend and BrowserStorageBackend could then be implemented and set.
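As a sketch of the proposed shape only: setStorageBackend and both backend classes are hypothetical, not existing webtorrent APIs.

```javascript
// Hypothetical backends; real implementations would hold piece data.
class MemoryStorageBackend { /* keeps pieces in RAM */ }
class BrowserStorageBackend { /* e.g. IndexedDB behind the scenes */ }

// Stand-in for the WebTorrent client with the proposed setter.
class WebTorrentLike {
  setStorageBackend (backend) { this._storage = backend }
}

const client = new WebTorrentLike()
// Pick a backend depending on the environment at runtime:
client.setStorageBackend(
  typeof window === 'undefined'
    ? new MemoryStorageBackend()
    : new BrowserStorageBackend()
)
```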
Any news regarding this? I too need to plug in a different storage backend. |
Tolga HOŞGÖR wrote:
Would this imply changing storage backends for running torrents? Such
@astro I think your fix is the right idea, but it causes the file data to get corrupted somehow and the tests fail, so we can't merge it as-is. I'm going to take another look at this soon.
Yeah, it has been half a year... |
Not necessarily. Maybe it should be a constructor option, to avoid confusion. That operation would be really hard to implement; even libtorrent-rasterbar sometimes behaves oddly while moving the storage directory. There would also be various problems with interruption while changing the storage, since it is running in a web page. Such a feature would cause all kinds of headaches and should not be part of webtorrent.

The user should manually move the data if that operation is required, though that would imply removing/re-adding the torrent and rechecking. Maybe an advanced option could be added to manually provide the resume data on torrent addition, if it doesn't already exist. Then the user could retrieve the resume data before deleting and moving, and use it when adding the torrent back with a new path.
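Purely as a hypothetical illustration of that last idea: neither getResumeData nor a resumeData option exists in webtorrent. This only shows what capturing the verified-piece state before removal, and supplying it again on re-add, could look like.

```javascript
// Capture enough state to skip rechecking later (hypothetical shape).
function getResumeData (torrent) {
  return { infoHash: torrent.infoHash, bitfield: torrent.bitfield }
}

// Re-add under a new path, handing the saved state back to the client.
// A real implementation would skip pieces already marked verified.
function addWithResume (client, magnet, path, resume) {
  return client.add(magnet, { path, resumeData: resume })
}
```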
This thread has been automatically locked because it has not had recent activity. To discuss further, please open a new issue.
Hello, I started using webtorrent in a Node.js app and quickly realized that webtorrent keeps downloaded data in main memory. I understand this is partly inherent to torrents (pieces arrive out of order, etc.), but it causes problems with big files. A 2 GB file, say, gets loaded into main memory, causing heavy memory usage and in fact crashing node, which has a memory limit of about 1 GB. This can be worked around by passing extra flags to let node use more memory, but I don't think that is a real solution. I think the proper fix is to free the buffers once a piece is verified and saved to disk. I already made some changes in my local project: I set the piece's buffer and block properties to null and then force V8's garbage collector to run. This (maybe not very good) solution worked, and I managed to download a 3 GB file (before, I couldn't download more than 1 GB).

PS. It still uses a lot of memory, but within node's working limits. Am I missing a reference to another buffer?
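The workaround described above could look roughly like this. The field names on the piece object are illustrative; `--expose-gc` is the real node flag that makes `global.gc` available.

```javascript
// After a piece is verified and flushed to disk, drop the references
// to its data so V8 can reclaim them; optionally nudge the collector.
function releasePiece (piece) {
  piece.buffer = null // the full piece data
  piece.blocks = null // the individual blocks that made up the piece
  if (typeof global.gc === 'function') {
    global.gc() // only available when started as: node --expose-gc app.js
  }
}
```

Nulling the references is what actually matters; the explicit `global.gc()` call just makes the reclaim happen sooner rather than at V8's discretion.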