This repository has been archived by the owner on Apr 22, 2023. It is now read-only.

http client leaking memory in 0.6.7 (not 0.4.9) #2577

Closed
davepacheco opened this issue Jan 18, 2012 · 16 comments


@davepacheco

I've got two Node programs: (1) an end server that services HTTP requests, and (2) a "proxy" that receives HTTP requests and forwards them to the end server. With v0.4.9, both of these behave well under load. Under v0.6.7, the HTTP client uses a lot more memory, as measured by the OS RSS value. After leaving them running overnight, requests that had been taking 40-50ms take upwards of several seconds.

Here's a graph comparing RSS between 0.4.9 and 0.6.7 under the same workload, which was an apachebench run:
https://skitch.com/dapsays/g2mj9/memory-usage

Here's the code I'm using:
https://gist.github.com/55d92fc6cf70e84bd154

As described in that gist, I've found that adding a call to "subresponse.destroy()", which destroys the client HTTP response object in its "end" event handler, brings memory usage much closer to normal (though it still grows a bit).
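
The gist itself isn't reproduced here, but a minimal sketch of the proxy pattern shows where that destroy() call sits (the end-server address, port, and variable names below are placeholders, not the actual code from the gist):

var http = require('http');

// Minimal stand-in for the proxy from the gist: forward each incoming
// request to the end server and relay the response body back.
http.createServer(function(req, res) {
  var subrequest = http.request({
    host: '127.0.0.1',  // end-server address (placeholder)
    port: 8086,         // end-server port (placeholder)
    path: req.url,
    method: req.method
  }, function(subresponse) {
    subresponse.on('data', function(chunk) {
      res.write(chunk);
    });
    subresponse.on('end', function() {
      res.end();
      subresponse.destroy();  // the workaround described above
    });
  });
  // Requests from ab have no body, so the forwarded request can end immediately.
  subrequest.end();
}).listen(8085);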

Without the subresponse.destroy() call, it looks like this:
https://skitch.com/dapsays/g37w5/node.js-0.6.7-memory-usage-without-fix-applied

With subresponse.destroy(), it looks like this:
https://skitch.com/dapsays/g37w6/node.js-memory-usage-with-subres

(Note that the scales are different, too. The second graph has a much smaller y-axis.)

The workload is this:

# ab -c 100 -n 100000 http://10.99.99.24:8085/

And the actual numbers:

  • With subresponse.destroy(): test.js went from 9.16-10.3MB before to 16.0-17.2MB after.
  • Without subresponse.destroy(): test.js went from 9.16-11.4MB before to 61.8-64.1MB after.

I don't have the exact numbers from 0.4.9, but I wasn't seeing any leakage at all.

This looks like #1968, but these data show that it's not likely heap fragmentation since destroying the responses causes the memory to get cleaned up.

@davepacheco
Author

As suggested in IRC, I tried with current 0.6 (07701e7) but got the same results:
https://skitch.com/dapsays/g3687/0.6.8-pre-memory

@davepacheco
Author

I tried setting "agent: false" in the http.request() options and this made a significant difference. On the first 100K requests, it looked like this:

https://skitch.com/dapsays/g36c2/0.6.8-pre-with-agent-false
            before: 8.01MB - 9.16MB
            peak: 49.2MB - 50.4MB
            after: 17.2MB - 18.3MB
            a little later: 12.6 - 13.7MB

after another 100K requests:

https://skitch.com/dapsays/g36xt/0.6.8-pre-with-no-agent-again
            before: 12.6 - 13.7MB
            peak: 50.4 - 51.5MB
            after: 18.3 - 19.5MB

and it didn't come back down any more than that. (I also killed the child server to see if that would cause resources to be cleaned up, but even after a few minutes it hadn't reduced RSS.)

So it looks like the biggest leak is in the agent stuff, but we're still losing 1-2MB per 100K requests. The big leak is significant for our service, which handles 1 request per second per client, so this is a blocker for us moving to node v0.6.
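
For reference, the only change in that experiment was the agent option; a self-contained sketch (host and port are placeholders) looks like this:

var http = require('http');

// Same client request as before, but with agent: false so the request
// bypasses the shared http.Agent and gets a dedicated connection.
var req = http.request({
  host: '127.0.0.1',  // placeholder end-server address
  port: 8086,         // placeholder port
  path: '/',
  agent: false
}, function(res) {
  res.on('data', function() {});  // drain the response
  res.on('end', function() {
    console.log('response complete');
  });
});
req.end();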

@koichik

koichik commented Jan 19, 2012

Results for various versions of Node, using the global Agent, without subresponse.destroy(), on Ubuntu 11.04 (values in MB; the "back" column is the net backend, legacy or libuv):

Node    V8          http    back    before    peak   after
======  ==========  ======  ======  ======  ======  ======
0.4.12  3.1.8.26    http1   legacy    10.0    54.5    15.7
0.5.0   3.1.8.25    http1   legacy    10.0    22.9    16.9
0.5.1   3.4.10      http1   legacy     9.9    30.0    16.5
0.5.2   3.4.14      http1   legacy    10.1    57.9    16.9
0.5.3   3.4.14      http1   legacy     9.7    57.0    15.9
0.5.3   3.4.14      http2   legacy     9.9    83.7    15.4
0.5.3   3.4.14      http2   uv         9.1    81.9    18.9
0.5.4   3.5.4.3     http1   legacy     9.4    57.2    16.2
0.5.4   3.5.4.3     http2   legacy     9.9    86.2    17.9
0.5.4   3.5.4.3     http2   uv         9.4    82.6    18.5
0.5.5   3.5.8       http1   legacy    10.9    56.8    16.2
0.5.5   3.5.8       http2   legacy     9.7    96.8    21.1
0.5.5   3.5.8       http2   uv         9.2    78.5    17.7
0.5.6   3.6.2       http1   legacy     9.9    56.8    16.2
0.5.6   3.6.2       http2   uv         9.7   100.8    20.9
0.5.7   3.6.4       http1   legacy    10.1    56.6    16.0
0.5.7   3.6.4       http2   uv         9.4   104.2    25.8
0.5.8   3.6.4       http1   legacy     9.4    57.8    18.1
0.5.8   3.6.4       http2   uv         8.9    94.9    25.1
0.5.9   3.6.4       http2   uv         9.4    98.0    28.8
0.5.10  3.7.0       http2   uv         8.7   196.7   190.8
0.6.0   3.6.6.6     http2   uv         9.9   112.1    36.9
0.6.1   3.6.6.7     http2   uv         9.4   111.6    32.0
0.6.2   3.6.6.8     http2   uv         9.5   111.6    31.4
0.6.3   3.6.6.8     http2   uv         9.9   111.3    32.4
0.6.4   3.6.6.8     http2   uv         9.5   111.5    32.2
0.6.5   3.6.6.11    http2   uv         9.8    89.4    26.1
0.6.6   3.6.6.14    http2   uv         9.5    97.8    28.0
0.6.7   3.6.6.15    http2   uv         9.9   108.1    32.2
0.7.0   3.8.6       http2   uv        10.0   211.3   188.5

It seems that another problem exists with V8 3.7/3.8.

@koichik

koichik commented Jan 19, 2012

Here is the standalone client with a global Agent:

var http = require('http');
var host = '127.0.0.1';
var port = 8081;
var max = parseInt(process.argv[2] || '100000', 10);
var count = 0;

function makeRequest() {
  ++count;
  var req = http.request({
    host: host,
    port: port,
    path: '/' + count
  }, function(res) {
    res.on('end', function() {
      // Chain the next request as soon as this response finishes,
      // until `max` requests have been issued.
      if (count < max) {
        makeRequest();
      }
    });
  });
  req.end();
}

console.log(count, process.memoryUsage());
// Start 100 request chains in parallel (same concurrency as ab -c 100).
for (var i = 0; i < 100; ++i) {
  makeRequest();
}

// Report the request count and memory usage every 5 seconds.
setInterval(function() {
  console.log(count, process.memoryUsage());
}, 5000);

result (with Node v0.6.7, values in MB):

request   before    peak   after
======= ======== ======= =======
 100000     10.4    75.4    18.4
 200000     10.2    75.6    18.4
 500000     10.3    76.0    18.4
1000000     10.2    76.1    17.5

An Agent does not seem to leak...

@piscisaureus

When running V8 with --heap-stats (which may be available only in the debug build of node), it tells me that there are ~8000 weak references left after processing about 1M requests. I suppose these weak refs refer to buffers. That is a bit surprising.

@piscisaureus

@bnoordhuis will look at it as well

@koichik

koichik commented Jan 20, 2012

The http client leaks http-parser objects.

Measured with the global Agent, without subresponse.destroy(), under ab -c 100 -n 100000, on Ubuntu 11.04:

v0.6.8:             before: 9.5M, peak: 127.2M, after: 48.1M
the patch applied:  before: 9.9M, peak:  54.0M, after: 13.4M

The patch:
diff --git a/lib/http.js b/lib/http.js
index c4e0d54..95d7001 100644
--- a/lib/http.js
+++ b/lib/http.js
@@ -1158,6 +1158,13 @@ ClientRequest.prototype.onSocket = function(socket) {
           socket.destroy();
         }
       }
+      if (parser.incoming && parser.incoming.complete) {
+          parser.finish();
+          parsers.free(parser);
+          parser.incoming = null;
+          parser = null
+          req.parser = null;
+      }
     };

     socket.onend = function() {
@@ -1167,8 +1174,10 @@ ClientRequest.prototype.onSocket = function(socket) {
         req.emit('error', createHangUpError());
         req._hadError = true;
       }
-      parser.finish();
-      parsers.free(parser); // I don't know if this is necessary --Mikeal
+      if (parser) {
+        parser.finish();
+        parsers.free(parser); // I don't know if this is necessary --Mikeal
+      }
       socket.destroy();
     };

@piscisaureus

@mikeal Can you review this patch?

@koichik

koichik commented Jan 20, 2012

Please review koichik/node@9cd1e90.

@mikeal

mikeal commented Jan 20, 2012

looks good, as long as all the tests pass.

@mikeal

mikeal commented Jan 20, 2012

can we make sure this goes into 0.6.x and master?

@isaacs

isaacs commented Jan 20, 2012

@koichik This is awesome data, thank you!

@davepacheco
Author

@koichik's patch works well for me. I ran two sets of 100K requests:
https://skitch.com/dapsays/g4u1j/memory-usage

Memory usage:
start: 9.16 - 9.73MB
peak: 32.8 - 33.6MB
after first 100K: 13.0 - 13.7MB
after second 100K: 13.0 - 13.7MB

There's some loss after the first 100K, but that could well be lazily initialized structures, not per-request leakage.

@isaacs

isaacs commented Jan 21, 2012

Landed on 35fe3eb. Thanks, @koichik!

@defunctzombie

Would the same issue have plagued https connections? For the longest time I saw growing memory usage with my https client requests and was never able to pinpoint it as well as done here.

@mikeal

mikeal commented Jan 21, 2012

yes, this affects https, but it shouldn't affect it any more than it would http.
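
For what it's worth, the https module takes the same request options, so the same pattern (and the agent: false workaround above, until the fix lands) carries over; a minimal sketch with a placeholder host:

var https = require('https');

// Identical shape to the http client; only the module differs.
var req = https.request({
  host: 'example.com',  // placeholder host
  port: 443,
  path: '/',
  agent: false          // optional workaround, as with plain http
}, function(res) {
  res.on('data', function() {});  // drain the response
  res.on('end', function() {
    console.log('response complete');
  });
});
req.end();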
