http client leaking memory in 0.6.7 (not 0.4.9) #2577
Comments
As suggested in IRC, I tried with current 0.6 (07701e7) but got the same results: |
I tried setting "agent: false" in the http.request() options and this made a significant difference. On the first 100K requests, it looked like this:
After another 100K requests:
and it didn't come back down any more than that. (I also killed the child server to see whether that would cause resources to be cleaned up, but even after a few minutes RSS hadn't dropped.) So it looks like the biggest leak is in the Agent code, but we're still losing 1-2MB per 100K requests. The big leak is significant for our service, which handles about one request per second per client, so this is a blocker for us moving to node v0.6.
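For reference, a minimal sketch of the agent: false workaround described above (host and port mirror the test client below and are illustrative, not the exact production code):

var http = require('http');

// Opt this request out of the global Agent's socket pooling.
var req = http.request({
  host: '127.0.0.1',
  port: 8081,
  path: '/',
  agent: false
}, function(res) {
  res.on('end', function() {
    // response fully received
  });
});
req.end();
|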
The results of various versions of Node using the global Agent:
It seems that another problem exists with V8 3.7/3.8. |
Here is the standalone client with a global Agent:

var http = require('http');
var host = '127.0.0.1';
var port = 8081;
var max = parseInt(process.argv[2] || '100000', 10);
var count = 0;
function makeRequest() {
  ++count;
  var req = http.request({
    host: host,
    port: port,
    path: '/' + count
  }, function(res) {
    // When one response finishes, kick off the next request.
    res.on('end', function() {
      if (count < max) {
        makeRequest();
      }
    });
  });
  req.end();
}

console.log(count, process.memoryUsage());

// Keep 100 requests in flight at a time.
for (var i = 0; i < 100; ++i) {
  makeRequest();
}

// Sample memory usage every five seconds.
setInterval(function() {
  console.log(count, process.memoryUsage());
}, 5000);

The result (with Node v0.6.7):
|
When running V8 with --heap-stats (which may be available only in a debug build of node), it reports that there are ~8000 weak references left after processing about 1M requests. I suppose these weak refs refer to buffers. That is a bit surprising.
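As a hedged aside (this is not what the commenter ran), one way to rule out lazily collected garbage when reading such numbers is to start node with --expose-gc and force a collection before each sample; dropped into the test client above, the sampler would look like:

// Run as: node --expose-gc client.js
setInterval(function() {
  if (global.gc) global.gc(); // global.gc is only defined with --expose-gc
  console.log(count, process.memoryUsage());
}, 5000);
|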
@bnoordhuis will look at it as well |
The http client leaks the http-parser when using the global Agent (i.e., without agent: false). A proposed patch:
diff --git a/lib/http.js b/lib/http.js
index c4e0d54..95d7001 100644
--- a/lib/http.js
+++ b/lib/http.js
@@ -1158,6 +1158,13 @@ ClientRequest.prototype.onSocket = function(socket) {
         socket.destroy();
       }
     }
+    if (parser.incoming && parser.incoming.complete) {
+      parser.finish();
+      parsers.free(parser);
+      parser.incoming = null;
+      parser = null;
+      req.parser = null;
+    }
   };

   socket.onend = function() {
@@ -1167,8 +1174,10 @@
       req.emit('error', createHangUpError());
       req._hadError = true;
     }
-    parser.finish();
-    parsers.free(parser); // I don't know if this is necessary --Mikeal
+    if (parser) {
+      parser.finish();
+      parsers.free(parser); // I don't know if this is necessary --Mikeal
+    }
     socket.destroy();
   };
|
@mikeal Can you review this patch? |
Please review koichik/node@9cd1e90. |
looks good, as long as all the tests pass. |
can we make sure this goes in to 0.6.x and master? |
@koichik This is awesome data, thank you! |
@koichik's patch works well for me. I ran two sets of 100K requests. Memory usage: there's some loss after the first 100K, but that could well be lazily initialized structures, not per-request leakage. |
Would the same issue have plagued https connections? For the longest time I saw growing memory usage with my https client requests and was never able to pinpoint it as well as done here. |
yes, this affects https, but it shouldn't affect it any more than it would http.
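As a hedged illustration of that point (the host below is a placeholder): the https client goes through the same Agent and parser machinery, so the same workaround applies.

var https = require('https');

// Same pattern as the http client; agent: false sidesteps the pooled Agent.
var req = https.request({
  host: 'example.com',
  port: 443,
  path: '/',
  agent: false
}, function(res) {
  res.on('end', function() { /* done */ });
});
req.end();
|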
I've got two node programs: (1) an end server that services HTTP requests, and (2) a "proxy" that receives HTTP requests and forwards them to the end server. With v0.4.9, both of these behave well under load. Under v0.6.7, the HTTP client uses a lot more memory, as measured by the OS RSS value. Leaving them running overnight results in requests that had been taking 40-50ms now taking upwards of several seconds.
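A minimal sketch of that setup, with the caveat that the real code is in the gist linked below (ports and payload here are illustrative):

var http = require('http');

// (1) End server.
http.createServer(function(req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('hello\n');
}).listen(8081);

// (2) Proxy: forward each incoming request to the end server.
http.createServer(function(request, response) {
  var subrequest = http.request({
    host: '127.0.0.1',
    port: 8081,
    path: request.url
  }, function(subresponse) {
    subresponse.on('data', function(chunk) { response.write(chunk); });
    subresponse.on('end', function() { response.end(); });
  });
  subrequest.end();
}).listen(8080);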
Here's a graph comparing RSS between 0.4.9 and 0.6.7 under the same workload, which was an apachebench run:
https://skitch.com/dapsays/g2mj9/memory-usage
Here's the code I'm using:
https://gist.github.com/55d92fc6cf70e84bd154
As described in that gist, I've found that by adding a call to "subresponse.destroy()" to destroy the client HTTP response object in its "end" event handler, memory usage is much closer to normal (though it still leaks a bit).
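Concretely, the change lands in the proxy's "end" handler (a sketch using the illustrative names from the setup above, not the exact gist code):

subresponse.on('end', function() {
  response.end();
  subresponse.destroy(); // workaround: eagerly release the client response
});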
Without the subresponse.destroy() call, it looks like this:
https://skitch.com/dapsays/g37w5/node.js-0.6.7-memory-usage-without-fix-applied
With subresponse.destroy(), it looks like this:
https://skitch.com/dapsays/g37w6/node.js-memory-usage-with-subres
(Note that the scales are different, too. The second graph has a much smaller y-axis.)
The workload is this:
And the actual numbers:
I don't have the exact numbers from 0.4.9, but I wasn't seeing any leakage at all.
This looks like #1968, but these data show that it's not likely heap fragmentation, since destroying the responses causes the memory to be reclaimed.