Threads may accumulate when Node's garbage collector hasn't run #614
I have noticed the same thing; I don't have enough knowledge about the back end. My test:
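A minimal sketch of such a test (not the original snippet), assuming the wrtc package:

```js
// Repeatedly open and close a connection while logging memory usage.
const { RTCPeerConnection } = require('wrtc');

setInterval(() => {
  const pc = new RTCPeerConnection();
  pc.createDataChannel('test'); // hypothetical channel, for illustration
  pc.close();
  console.log('rss MB:', (process.memoryUsage().rss / 1048576).toFixed(1));
}, 500);
```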
node-webrtc is good and fulfills our requirements, but this issue stopped me from using it further, and in parallel our team is looking for another open-source solution; they have started working on Kurento server. @markandrus, please help us fix this. I completed my application, which I had been working on for the last 30 days. |
@gev Is this useful for us? |
@savsofts have you closed the connection on the server side when the user disconnects? |
I checked with the REST API (provided by the examples); the connection closed automatically after some time, 4-5 seconds. |
I've noticed that connections left unclosed on the server side keep all 7 or 8 child processes alive. Could you try closing the peer connections on the server after disconnecting? Example:
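A minimal sketch of such a handler, assuming you watch iceConnectionState on the server-side connection:

```js
// Close the server-side connection once ICE reports failure or
// disconnect, so its native threads can be released.
const { RTCPeerConnection } = require('wrtc');

const pc = new RTCPeerConnection();
pc.oniceconnectionstatechange = () => {
  if (pc.iceConnectionState === 'failed' ||
      pc.iceConnectionState === 'disconnected') {
    pc.close();
  }
};
```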
|
Yes, I captured the state change in console.log, and I am getting 'failed' and 'disconnected' statuses. |
Could you close the connection on 'failed' or 'disconnected' status and check the memory usage? |
@savsofts @markandrus this simple test shows the child-process leakage:
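A plausible reconstruction of such a test: create and immediately close a peer connection on a timer, then watch the process's thread count from another terminal (e.g. with pstree -p):

```js
const { RTCPeerConnection } = require('wrtc');

console.log('pid:', process.pid); // point pstree -p at this pid
setInterval(() => {
  const pc = new RTCPeerConnection();
  pc.close();
}, 1000);
```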
|
One version of the code adds 4 child processes every interval, and a reduced version adds only 2:
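Hedged reconstructions of the two variants (the exact snippets are not shown above; the split is a guess based on the analysis further down, where negotiation spins up additional transport threads):

```js
// Variant A (hypothetical): negotiating before closing starts the
// ICE/DTLS transports, leaving more threads behind per interval.
const { RTCPeerConnection } = require('wrtc');

setInterval(async () => {
  const pc = new RTCPeerConnection();
  pc.createDataChannel('probe');
  await pc.setLocalDescription(await pc.createOffer());
  pc.close();
}, 1000);

// Variant B (hypothetical): construct and close only, leaving fewer.
// setInterval(() => new RTCPeerConnection().close(), 1000);
```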
|
Why do you claim there is a process leak? I do not believe the webrtc.org code spawns any processes, nor does node-webrtc itself. Moreover, running your example and then, in another terminal, listing the children of the node process shows no child processes. Therefore, there should be no child processes being spawned. |
@markandrus run pstree -p on the node process, or use a similar thread viewer.
It's threads. I think they are created for the ICE and DTLS/SCTP/SRTP transports. |
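One way to observe this from inside the process, assuming Linux (each thread is an entry under /proc/self/task):

```js
const fs = require('fs');
const { RTCPeerConnection } = require('wrtc');

// On Linux, every thread of this process appears under /proc/self/task.
const threads = () => fs.readdirSync('/proc/self/task').length;

console.log('threads before:', threads());
const pc = new RTCPeerConnection();
console.log('threads after open:', threads());
pc.close();
console.log('threads after close (GC has not run yet):', threads());
```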
OK, you are referring to a thread leak and not a process leak. Indeed, you can observe that the number of threads accumulates (until a garbage collection) in your example. I will update the issue title accordingly.

Analysis

In node-webrtc, we create our worker and signaling threads here and here in the PeerConnectionFactory, respectively. They both get stopped in its destructor here. We also create a thread for TestAudioDeviceModuleImpl, which has a 1:1 mapping with each PeerConnectionFactory. A PeerConnectionFactory (and its TestAudioDeviceModuleImpl) exists so long as at least one non-stopped RTCPeerConnection exists.

In your example, you create an RTCPeerConnection and immediately close it (totally fine), which means you are creating a PeerConnectionFactory on demand each time and throwing it away (also totally fine). If Node ran garbage collection continuously, these would continuously be cleaned up, as I'll show below. Instead, garbage collection does not run continuously, and so the threads appear to accumulate.

I made a small change locally to name these threads, and then re-ran the test, attaching a debugger to inspect the thread names.
These threads are accumulating because Node has not yet run a garbage collection. If I extend the example to explicitly run a garbage collection, the threads get cleaned up:
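A sketch of the extended example, assuming the test is run with node --expose-gc so that global.gc() is available:

```js
// Force a collection after closing each connection so the
// PeerConnectionFactory (and its threads) is destroyed promptly.
const { RTCPeerConnection } = require('wrtc');

setInterval(() => {
  const pc = new RTCPeerConnection();
  pc.close();
  global.gc();
}, 1000);
```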
Questions/Notes
|
And for this code
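(A guess at the shape of the crashing code, since the snippet is not shown: several connections opened concurrently, then closed all at once.)

```js
const { RTCPeerConnection } = require('wrtc');

const pcs = [];
for (let i = 0; i < 10; i++) {
  pcs.push(new RTCPeerConnection());
}
// Closing them all at the same time reportedly triggered the crash.
pcs.forEach((pc) => pc.close());
```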
(when closing the connections at the same time) I get a segmentation fault! |
@markandrus can I help? |
@gev @markandrus pstree -p shows 648 node threads with 120 concurrent broadcast streams. |
gc.collect() after closing the RTCPeerConnection does the trick. Otherwise, in my case, I end up with hundreds of threads and increasing CPU usage over time.
This may be the correct solution. |
@markandrus Hi, we have also run into the memory leak.
On the server side I get an offer without any media tracks and with two data channels (a sketch of this setup follows below).
For the first RTCPeerConnection, Node.js creates 8 child processes.
For every subsequent RTCPeerConnection, Node.js creates 7 child processes.
After closing an RTCPeerConnection, Node.js kills only 4 (for the first) or 3 (for every subsequent) child processes.
So every open/close cycle leaves 4 child processes behind!
Is that OK?
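An illustrative sketch of this server-side setup (handleOffer and the logging are hypothetical, not from the original report):

```js
const { RTCPeerConnection } = require('wrtc');

// Answer an incoming offer that carries two data channels and no media.
async function handleOffer(offer) {
  const pc = new RTCPeerConnection();
  pc.ondatachannel = ({ channel }) => {
    console.log('data channel from client:', channel.label);
  };
  await pc.setRemoteDescription(offer);
  await pc.setLocalDescription(await pc.createAnswer());
  return pc; // send pc.localDescription back; call pc.close() when done
}
```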