
memory leak #25644

Closed
NikSmith opened this issue Jul 7, 2015 · 2 comments

Comments


NikSmith commented Jul 7, 2015

Hi. I'm seeing memory usage grow much too high.
node: 0.12.6

example app.js:

var vm = require("vm");
var result = false;
while (true){
result = vm.runInNewContext("1==1",{});
console.log(result);
console.log(parseInt(process.memoryUsage().heapTotal / 1024 / 1024));
}

With each iteration, memory increases by about 1 MB.
I created an issue report at http://code.google.com/p/v8/issues/detail?id=4281, but I was sent here :)

@misterdjules

TL;DR I don't think your sample code shows a memory leak. I would say that this code doesn't let V8's garbage collector run often enough, and that it shows heapTotal instead of heapUsed, which are two very different values. heapUsed is more representative of what JavaScript objects are currently live in memory, heapTotal is more representative of what V8 asks the OS to commit to, and has more to do with how V8 decides to manage its memory footprint.

Now for the longer version. Memory leaks in node are often quite complex to investigate because so much depends on when and how often V8's garbage collector manages to run.

In your example, which allocates memory by running vm.runInNewContext in a tight loop, V8's garbage collector doesn't get many opportunities to run, so it runs only when it really has to, that is, when the V8 heap already holds a lot of objects. If you let this program run long enough, you'll see that heapTotal actually reaches a limit at around 1400 MB and oscillates around it.

Another, more interesting data point is process.memoryUsage().heapUsed, which shows the amount of memory used by JavaScript objects that are still alive. Using your sample code, you'll see that this value is smaller than heapTotal and shrinks much more significantly when the garbage collector kicks in.
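For illustration, a slightly modified version of your loop that prints both values side by side could look like this:

var vm = require("vm");

function toMB(bytes) {
  return Math.round(bytes / 1024 / 1024);
}

while (true) {
  vm.runInNewContext("1==1", {});
  var mem = process.memoryUsage();
  // heapUsed tracks live JavaScript objects, heapTotal tracks what V8 has
  // reserved for its heap, so the two values can diverge quite a bit.
  console.log("heapUsed: " + toMB(mem.heapUsed) + " MB, heapTotal: " + toMB(mem.heapTotal) + " MB");
}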

What's interesting is that the garbage collector's behavior can depend on subtle things. For instance with your sample code, building node with or without snapshot support changes how much and how often the garbage collector can run. I suspect this is because running a script in a new context is faster with snapshots, and leaves some time for the garbage collector to run, but that's a wild guess. Official releases of node are built without snapshots, and thus seem to struggle more to reclaim objects from the heap with your sample code. If you build the source yourself with ./configure && make (thus not disabling snapshots) you'll see that when running your sample code, heapTotal reaches the same value, but heapUsed stays around 300 MB.

To get a behavior that is closer to what we intuitively think is better (having the heap not grow between iterations of a tight loop), it is possible to force V8's garbage collector to run by calling global.gc() and passing --expose-gc on the command line.
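A minimal sketch of that change could look like this (global.gc is only defined when node is started with --expose-gc, hence the guard):

var vm = require("vm");

while (true) {
  vm.runInNewContext("1==1", {});
  // Force a full garbage collection after each iteration; this only works
  // when node is started with --expose-gc.
  if (typeof global.gc === "function") {
    global.gc();
  }
  console.log(Math.round(process.memoryUsage().heapTotal / 1024 / 1024) + " MB heapTotal");
}

and it would be run with ./node --expose-gc test.js.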

Doing that with your sample code, both the heapTotal and heapUsed values still grow steadily, and it seems that this could be due to a bug in how global properties are copied in the contextify module. Rebuilding node with the following patch:

➜  v0.12 git:(v0.12) ✗ git diff
diff --git a/src/node_contextify.cc b/src/node_contextify.cc
index 2e8fd2c..b53f04d 100644
--- a/src/node_contextify.cc
+++ b/src/node_contextify.cc
@@ -171,6 +171,7 @@ class ContextifyContext {
           Local<String> code = FIXED_ONE_BYTE_STRING(env()->isolate(),
               "(function cloneProperty(source, key, target) {\n"
               "  if (key === 'Proxy') return;\n"
+              "  if (key === 'gc') return;\n"
               "  try {\n"
               "    var desc = Object.getOwnPropertyDescriptor(source, key);\n"
               "    if (desc.value === source) desc.value = target;\n"
➜  v0.12 git:(v0.12) ✗

and running your sample code + a call to global.gc() at the end of the loop with ./node --expose-gc test.js gives me almost constant heapUsed and heapTotal values (around 4 MB and 40 MB respectively).

To conclude, I will close this issue as I don't think it shows any issue with node itself, but feel free to comment further if you have any questions.

That said, the fact that the patch above seems to be needed for --expose-gc and global.gc() to work correctly could be a valid bug. @indutny @bnoordhuis @trevnorris Any thoughts about this specifically?

@NikSmith
Author

Hi. Thank you for your detailed answer.
This code was just a simple example.

I tried running it with setInterval and also using global.gc(), but I still saw a memory leak.
Yesterday I tried running vm with Node.js 4.0.0, and it worked fine.
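The setInterval variant was roughly along these lines (the 100 ms delay is just an example value):

var vm = require("vm");

// One vm.runInNewContext call per timer tick instead of a tight loop,
// followed by a forced GC; requires starting node with --expose-gc.
setInterval(function () {
  vm.runInNewContext("1==1", {});
  if (typeof global.gc === "function") {
    global.gc();
  }
  console.log(Math.round(process.memoryUsage().heapUsed / 1024 / 1024) + " MB heapUsed");
}, 100);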
