Http server can't handle large response on high load #5621

zyv5544 opened this Issue Jun 2, 2013 · 2 comments



zyv5544 commented Jun 2, 2013

Tested on a high-load server with approximately 500-600 requests per second.
After hours of debugging, I ended up with just a simple HTTP server.

I noticed that when the response body was bigger than, let's say, 60k, I got this error:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
    at Socket.EventEmitter.addListener (events.js:160:15)
    at Socket.Readable.on (_stream_readable.js:679:33)
    at Socket.EventEmitter.once (events.js:179:8)
    at TCP.onread (net.js:527:26)

And after that the CPU usage went crazy.

But with the exact same code, when the response was set to 10k, everything worked smoothly.

Has anyone encountered this before?
Pleading for help.

This is the full script:

    var cluster = require('cluster'),
        http = require('http'),
        numCPUs = require('os').cpus().length;

    if (cluster.isMaster) {
        for (var i = 0; i < numCPUs; i++) cluster.fork();

        cluster.on("exit", function(worker, code, signal) {
            cluster.fork(); // restart a crashed worker
        });
    } else {
        var app = function(req, res) {
            res.writeHead(200, {'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*'});
            res.end( 60k_of_text___or___10k_of_text );
        };

        http.createServer(app).listen(8000); // port not shown in the original
    }
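For anyone trying to reproduce this, the placeholder body can be stood in by any string of the right length. A sketch (the `body10k`/`body60k` names are hypothetical, not from the original report):

```javascript
// Hypothetical filler standing in for 60k_of_text___or___10k_of_text;
// only the size matters for reproducing the reported difference.
var body10k = new Array(10 * 1024 + 1).join('a'); // 10240 chars, ~10 KB
var body60k = new Array(60 * 1024 + 1).join('a'); // 61440 chars, ~60 KB
console.log(body10k.length, body60k.length);
```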



zyv5544 commented Jun 2, 2013

BTW, tested on 16 cores, 16 GB RAM, Linux CentOS 6.4.


This should be fixed by #5504. Closing. If this is still an issue for you, please re-open or comment!
