Avoid 'EventEmitter memory leak' warnings. #11

Merged
merged 3 commits into from

5 participants

@sethml

Fixed problem: When a keep-alive connection has many requests, each request can result in a separate bounce, and all of those bounces getting piped back into the request stream can result in ugly warnings on stdout:

  (node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
  Trace:
      at Socket.<anonymous> (events.js:126:17)
      at Socket.pipe (stream.js:156:8)
      at /Users/seth/development/service/node_modules/bouncy/index.js:86:20
      at /Users/seth/development/service/scalar.js:52:5
      at IncomingMessage.onHeaders (/Users/seth/development/service/node_modules/bouncy/index.js:46:13)
      at IncomingMessage.emit (events.js:64:17)
      at Parser.<anonymous> (/Users/seth/development/service/node_modules/bouncy/node_modules/parsley/lib/modes.js:135:21)
      at Parser.execute (/Users/seth/development/service/node_modules/bouncy/node_modules/parsley/index.js:20:35)
      at Socket.<anonymous> (/Users/seth/development/service/node_modules/bouncy/node_modules/parsley/index.js:11:14)
      at Socket.emit (events.js:64:17)
sethml added some commits:

  @sethml Ignore node_modules. b919426
  @sethml Fix test require paths. 72a51e2
  @sethml Avoid 'EventEmitter memory leak' warnings. 759e365
@Marak

This won't fix the problem; it will only result in you leaking a shit ton of memory until your proxy slowly becomes unresponsive.

@sethml

Yeah, see #10 for a more "real" fix. Unfortunately, I think #10 isn't quite the right approach, but it's getting there. This is at least a quick fix.

I don't think the problem is as bad as you imply. There's no long-term leak. For each browser connection, backend connections will pile up with each request, but then all go away when the browser disconnects.

@Marak

Yeah, it's not like we are proxying over 1 million HTTP requests per day through node or anything.

@sethml

Snark much?

Seriously, you can avoid the "leaking" connections if you simply issue your bounces like so:

bounce(destPort, { headers: { Connection: 'close' } });

If you have suggestions for a better solution, please propose one.

@Marak

If you have suggestions for a better solution, please propose one.

I'd start by looking here: https://github.com/nodejitsu/node-http-proxy/blob/master/lib/node-http-proxy.js#L351, unless you think you can write better HTTP code than node's core team.

@shripadk

@Marak: to be quite honest, I couldn't get more than 200 concurrent websocket connections on node-http-proxy. They would start dropping as soon as it reached 212 connections... :( With bouncy I could go up to 48.5K active connections without any dropping.

Tested this today on node v0.4.12.

@Marak

That makes sense. We ship the proxy with 100 max sockets by default https://github.com/nodejitsu/node-http-proxy/blob/master/lib/node-http-proxy.js#L31 since in most cases exceeding this generally means you did something wrong.

What tools are you using for testing that type of load? Is it from real traffic or a distributed testing framework (like tsung)? Is it just local AB?

@shripadk

True. I even changed the max sockets to a really high number; still it was stuck at 200-odd. And I'm not using tsung or AB for this. I wrote a simple custom script that automates client connections and just keeps them open. You can see the gist here: https://gist.github.com/1406702

I'm using sockjs for the websocket server and websocket module for automating the client connections. This was an experiment to see how many connections I can keep open.

@shripadk

Oh and it uses bouncy as a proxy running on port 8888. I tried the same with node-http-proxy but couldn't go past 200 active connections.

@Marak

It's impossible to say with your testing scenario. You can't get accurate results using the same machine/host for client and server. You also might have something misconfigured, or your testing code might not test what you think it does. Too many unknown variables.

If you want to understand the cause of the issue, you can try opening up a support issue on the http-proxy repo.

@shripadk

Well I'm already looking into the source to write a failing test case. The idea was quite simple. Just maintain open connections. I'm using the example shipped with node-http-proxy with the slight difference being sockjs instead of socket.io: https://github.com/nodejitsu/node-http-proxy/blob/master/examples/websocket/standalone-websocket-proxy.js

And if I don't use any proxy I still manage to get 48.5K open connections to a node server. But I'll definitely check if I have misconfigured something (very unlikely). And it stays at 48.5K open connections because the maximum number of ephemeral ports allowed per client is 64K. Apart from that I don't see anything wrong with the testing scenario. You can throttle it, though, and you will eventually end up with 48.5K anyway.

And yes I'll open a support issue on the http-proxy repo. :)

@shripadk

Also, tsung/ab aren't ideally suited for testing websocket/comet connections. :(

@joemccann

I see this is not a new issue, but the same is happening for me.

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace: 
    at Socket.<anonymous> (events.js:133:17)
    at Socket.pipe (stream.js:52:8)
    at /usr/local/lib/node_modules/bouncy/index.js:85:20
    at /usr/local/lib/node_modules/bouncy/bin/bouncy.js:33:9
    at IncomingMessage.onHeaders (/usr/local/lib/node_modules/bouncy/index.js:46:13)
    at IncomingMessage.emit (events.js:67:17)
    at Parser.<anonymous> (/usr/local/lib/node_modules/bouncy/node_modules/parsley/lib/modes.js:135:21)
    at Parser.execute (/usr/local/lib/node_modules/bouncy/node_modules/parsley/index.js:20:35)
    at Socket.<anonymous> (/usr/local/lib/node_modules/bouncy/node_modules/parsley/index.js:11:14)
    at Socket.emit (events.js:67:17)
@substack
Owner

Is this patch still necessary after #12 landed in 1.0.3?

@shripadk

Yes, it is necessary. #12 ensures that the stream is closed when the socket is closed. This is more of a keep-alive problem (the socket 'close' event is never triggered).

@substack substack merged commit 759e365 into from
@shripadk

Oh wait... this doesn't fix the actual problem. You'll have to merge #10 to fix this issue.

Commits on Nov 11, 2011

  1. @sethml: Ignore node_modules.
  2. @sethml: Fix test require paths.
  3. @sethml: Avoid 'EventEmitter memory leak' warnings.
1  .gitignore
@@ -0,0 +1 @@
+node_modules
2  index.js
@@ -25,6 +25,8 @@ var bouncy = module.exports = function (opts, cb) {
 var handler = bouncy.handler = function (cb, c) {
     parsley(c, function (req) {
+        c.setMaxListeners(0);
+
         var stream = new BufferedStream;
         stream.pause();
2  test/parse_error.js
@@ -1,5 +1,5 @@
 var test = require('tap').test;
-var bouncy = require('bouncy');
+var bouncy = require('../');
 var net = require('net');
 test('parse error', function (t) {
2  test/post.js
@@ -1,5 +1,5 @@
 var test = require('tap').test;
-var bouncy = require('bouncy');
+var bouncy = require('../');
 var http = require('http');
 var Stream = require('net').Stream;
2  test/raw_destroy.js
@@ -1,5 +1,5 @@
 var test = require('tap').test;
-var bouncy = require('bouncy');
+var bouncy = require('../');
 var net = require('net');
 var EventEmitter = require('events').EventEmitter;
2  test/raw_drop.js
@@ -1,5 +1,5 @@
 var test = require('tap').test;
-var bouncy = require('bouncy');
+var bouncy = require('../');
 var net = require('net');
 var EventEmitter = require('events').EventEmitter;
2  test/raw_fail.js
@@ -1,5 +1,5 @@
 var test = require('tap').test;
-var bouncy = require('bouncy');
+var bouncy = require('../');
 var net = require('net');
 test('raw without a host', function (t) {
2  test/raw_ok.js
@@ -1,5 +1,5 @@
 var test = require('tap').test;
-var bouncy = require('bouncy');
+var bouncy = require('../');
 var net = require('net');
 test('raw with a host', function (t) {