Memory leak? WebTorrent API eats up 6gb RAM, then crashes with "Out of memory" #479
Comments
I think it's because of browser capabilities: the library loads the entire file into memory before seeding. Since your RAM is less than the file size, it will not work.
@blairanderson |
@blairanderson is there a way to avoid that? torrent-stream (peerflix) does.
I guess you can use https://github.com/feross/webtorrent-hybrid.
I thought the only difference with hybrid was WebRTC support? How would
Maybe look at #248 (and no, webtorrent-hybrid wouldn't help for this issue).
@tsoernes Any changes since the implementation of peer destroying?
No changes.
I don't know how peer destroying, whatever that is, would help. It crashes when loading the torrent, before finding peers.
@tsoernes I'm taking a look at this issue now. Is there any chance you could re-run with the latest version of webtorrent (0.72.1) and enable debug logs? You can do that by setting the `DEBUG` environment variable (e.g. `DEBUG=webtorrent:*`).
Thanks!
I can't reproduce the issue with the latest version of webtorrent or the version you were using. I'm using your sample code and a torrent with really large files (25 GB). @tsoernes Your debug logs would really help here.
We can't reproduce this issue without debug logs. Closing.
I tried to download a 4 GB file and got this error:

```
Started saving English(SDH).srt
RangeError: Invalid typed array length
```

Debug logs:

```
Error: write EPIPE
```
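That "Invalid typed array length" message is what V8 throws when code requests a typed array larger than the engine can allocate, which fits the whole-file-buffering theory discussed above. A minimal, webtorrent-free demonstration:

```javascript
// Asking V8 for a typed array beyond its size limit throws a RangeError
// synchronously instead of returning a partial buffer. 2^53 - 1 elements
// is far beyond any engine's limit.
let message = '';
try {
  new Uint8Array(Number.MAX_SAFE_INTEGER);
} catch (err) {
  message = `${err.name}: ${err.message}`;
}
console.log(message); // e.g. "RangeError: Invalid typed array length"
```

So when a piece or file buffer larger than available memory is requested, the allocation itself fails with exactly this error.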
@ali-khabbaz Do not post links or references to copyrighted material that you do not own. Do not use WebTorrent for copyright infringement.
This thread has been automatically locked because it has not had recent activity. To discuss further, please open a new issue.
Original issue:
Hi.
(WebTorrent v0.62.3, Node v5.0.0)
I'm running the following code:
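The code block itself did not survive in this copy of the thread. A hypothetical sketch consistent with the report, using webtorrent's documented `client.add` and `file.createReadStream` API (the magnet URI is a placeholder, and this is not the reporter's exact snippet):

```javascript
const fs = require('fs');
const WebTorrent = require('webtorrent');

const client = new WebTorrent();
const magnetURI = 'magnet:?xt=urn:btih:...'; // placeholder, not the reporter's torrent

client.add(magnetURI, (torrent) => {
  torrent.files.forEach((file) => {
    // Stream each file to disk as pieces arrive.
    file.createReadStream().pipe(fs.createWriteStream(file.path));
  });
});
```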
When the code is called, the V8 engine eats up more than 6 GB of RAM (in a couple of seconds) and then crashes:

```
FATAL ERROR: node::smalloc::Alloc(v8::Handle<v8::Object>, size_t, v8::ExternalArrayType) Out Of Memory
```

After upgrading from Node v0.14 to v5.0.0, the error changed to this:
The torrent has 8GB of files.

I tried with a smaller torrent. It runs without error, but uses a lot of memory. WebTorrent is the only thing running in the code:
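To put a number on "a lot of memory", Node's built-in `process.memoryUsage()` can be logged alongside the download (a generic illustration, not tied to webtorrent):

```javascript
// Sample memory before and after allocating a 100 MiB buffer; the
// `external` figure tracks Buffer allocations outside the V8 heap.
const before = process.memoryUsage();
const big = Buffer.alloc(100 * 1024 * 1024);
const after = process.memoryUsage();

const toMiB = (n) => (n / 1024 / 1024).toFixed(1);
console.log(`rss: ${toMiB(after.rss)} MiB`);
console.log(`heapUsed: ${toMiB(after.heapUsed)} MiB`);
console.log(`external: ${toMiB(before.external)} -> ${toMiB(after.external)} MiB`);
```

Logging these figures periodically while a torrent loads would show whether the growth is in the V8 heap or in external Buffers, which helps pin down where the leak is.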