Possible memory leak with class Error #4910
Comments
NB: We didn't observe such a difference when running similar code in Chrome DevTools (but maybe Chrome's measuring tools are not the same as heapdump's?)
I don't really see anything out of the ordinary here.
@bnoordhuis An array containing 1000 refs to the same immutable string literal is on the order of one kilobyte, so it should have the same problem.
In memory, yes, but the presentation in the snapshot is, in principle, much simpler: it's just the same edge many times over. That said, I just ran the example and I ended up with a 1.3 GB snapshot...
Yes, the closures keep the objects alive, but I don't see how that is a bug. How would you expect it to work otherwise? The GC can't free the `err` objects while the closures are still reachable.
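A minimal sketch of the retention mechanism being described (illustrative code, not from the issue): as long as a closure that captured an Error is reachable, the Error is too.

```js
// Illustrative sketch: a callback that closes over `err` keeps it reachable,
// so the GC cannot collect the Error while the callback itself is alive.
function makeHandler() {
  var err = new Error("captured for later"); // allocated per "request"
  return function onDone() {
    return err.message; // onDone closes over err
  };
}

var onDone = makeHandler();
// `err` is still reachable here, purely through the closure:
console.log(onDone()); // prints "captured for later"
```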
@bnoordhuis For the difference between the size in RAM and the representation in the snapshot, that's true, but we opened the snapshot each time in Chrome DevTools, which shows the actual size on disk. It was different from 1.7 and 6.4, but of the same order of magnitude, so I don't think that's the problem. And for the GC part: the closure is not reachable after the setTimeout is executed! That's why we made this convoluted example with async. This code shouldn't have any non-collectible objects in memory when we call heapdump the second time; otherwise all the async non-blocking server apps written in Node.js would have big problems...
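The claim being made can be sketched like this (illustrative code with a hypothetical `handleRequest` and a tiny timeout, not the issue's test case): after the timer callback fires, the event loop drops its reference to the callback, which was the only path to `err`, so `err` becomes collectible.

```js
// Illustrative sketch: `err` is referenced only by the setTimeout callback.
// Once the callback has run, the timer releases it, so nothing keeps `err`
// reachable any longer and the GC is free to collect it.
function handleRequest(done) {
  var err = new Error("captured for the stack");
  setTimeout(function () {
    // last use of `err`; after this returns, the callback is dropped
    done(err.message);
  }, 10);
}

handleRequest(function (msg) {
  console.log(msg); // prints "captured for the stack"
});
```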
@bnoordhuis If I change the test to
I modified the test case to not use async (but kept the setTimeout) so all 50,000 elements are retained; async was one moving part too many for clear analysis.
Nope, that's not it. This code should be used to get unique strings:

```js
var t = 0;
function BigObj() {
  this.message = "hello";
  this.x = [];
  for (var i = 0; i < 1000; ++i) {
    this.x.push("mlkj" + (t++));
  }
}
```
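As a quick self-contained check (mine, not from the thread), the counter-based version really does produce all-distinct strings, which is what prevents the heap snapshot from sharing one string node among all the objects:

```js
// Quick check that every BigObj gets 1000 strings that are distinct across
// all instances, thanks to the shared counter `t`.
var t = 0;
function BigObj() {
  this.message = "hello";
  this.x = [];
  for (var i = 0; i < 1000; ++i) {
    this.x.push("mlkj" + (t++));
  }
}

var a = new BigObj();
var b = new BigObj();
console.log(new Set(a.x.concat(b.x)).size); // prints 2000 (no duplicates)
```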
I played around with it some more and, as far as I can determine, the issue is simply retention through closures.
@bnoordhuis "seq" don't have any references to the functions in the first exemple, why would not cleaning "seq" prevent the gc to collect stuff ? Could you post your code please ? |
Here you go:

```js
var seq = [];
for (var i = 0; i < 50000; i++) seq[i] = i;

function BigObj() {
  this.message = "hello";
  this.x = [];
  for (var i = 0; i < 1000; ++i) this.x.push("mlkj");
}

seq = seq.map(function() {
  return new BigObj();
  // return new Error();
});

// process.seq = seq; // retain
seq = [];
require("../../node-heapdump").writeSnapshot();
```

If you uncomment the assignment to `process.seq`, the objects are retained.
@bnoordhuis Your map just creates an array of BigObj. So yeah, if you keep a reference to this array it can't be GCed; otherwise it can.

@ChALkeR The strings are really beside the point; BigObj could be an array of 1000 nulls and it would be enough to show the bug. 50,000 arrays of 1000 nulls are already big enough to show whether they were GCed or not (as you can see in bnoordhuis's example, when they are not GCed it's already BIG: 1.3 GB).

@bnoordhuis If you absolutely want to get rid of the async.series, you should write the following, and pray that in 10 minutes all the 50,000 setTimeouts will be finished; but it's really not serious for testing, and I think using async doesn't make the code too strange to be accepted here (it's also something that everyone really uses often now).
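The code referred to by "the following" is missing from the thread; a hypothetical, scaled-down reconstruction (with the heapdump call left as a comment, since it needs a third-party module) might look like this:

```js
// Hypothetical reconstruction of the async-free variant (the original
// snippet is missing above). Every timer callback allocates an Error;
// once all timers have fired, the second heap snapshot would be taken.
var pending = 50; // 50,000 in the scenario described; scaled down here
var fired = 0;

for (var i = 0; i < pending; i++) {
  setTimeout(function () {
    var err = new Error("capture the stack"); // per-task Error, as in the wrapper
    fired++;
    if (fired === pending) {
      // the real test would call require("heapdump").writeSnapshot() here
      console.log("all timers fired:", fired);
    }
  }, 0);
}
```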
For me, your script produces snapshots in both cases that are:
Maybe try running with
@bnoordhuis With which script did you measure those numbers, please? The first script given by sgeraud, or the last one with the simple for-loop that I gave as an example?
The last one. |
@bnoordhuis Yes, I gave it as an example of how you could modify your first example to be closer to ours, but I didn't run it; as I said, it's really not practical to test with this, putting a temporisation on the second dump and all. The first script (the one inside the issue description) is the one that we have tested on many configurations and that reproduces the bug every time.

I don't know what the process usually is to report bugs here, and I wouldn't want to presume anything, but I truly feel that the first script is a really standard Node.js use case. All our production code is done like this: get the request in a route, start a series of asynchronous calls (in parallel or series depending on what we are doing), then respond to the client, and all this on an always-on Express server. And one of our wrappers creates a `new Error` to capture the stack in case something goes wrong, and it is (supposedly) GCed when the async.series/parallel is completed. And we have a memory leak!
Okay, can you come up with a minimal test case that demonstrates the memory leak (if that's what it is) without depending on third-party modules?
This example shows the memory leak:
With BigObj:
With Error:
Thanks, I can reproduce now, but I'm not sure it's a solvable issue. First off, the problem lies with V8, not Node.js; it's not an issue we have direct control over. It's possible it's been fixed or alleviated in V8 4.9+ but I haven't tested that. You may want to consider filing a bug over at https://bugs.chromium.org/p/v8/issues/list.
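Part of the explanation above was lost in formatting. One standard V8 mechanism that matters whenever `Error` objects dominate a heap is stack trace capture; it can be bounded with `Error.stackTraceLimit` (a real V8 API, though whether it is the exact knob this comment referred to is an assumption):

```js
// V8 captures up to Error.stackTraceLimit frames for every new Error.
// Lowering it bounds how much per-Error context is kept around.
Error.stackTraceLimit = 2; // the V8 default is 10

var e = new Error("boom");
var lines = e.stack.split("\n");
// lines[0] is "Error: boom"; the rest are at most 2 captured frames
console.log(lines.length - 1 <= 2); // prints true
```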
@bnoordhuis Thanks a lot for all the time you took investigating this issue!
Talk about coincidence: I was going to post that I suspect this is v8:2340 but I see the issue you reported was closed today as a duplicate of that selfsame issue. I'll close this issue. The V8 team thinks it's not a bug so there isn't much we can do. |
Yes indeed. Thanks for your help on this issue @bnoordhuis!
The following works well, but replacing `new BigObj()` by `new Error()` results in a very large heap snapshot, even if `heapdump.writeSnapshot();` triggers garbage collection.

Snapshots for `BigObj`:

Snapshots for `Error`:

Observed with node v0.12.7 and node v5.5.0 on Mac OS X 10.11.2, and node v5.4.1 on Ubuntu 14.04.3.