How to avoid: (node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit. #311

Closed
framlin opened this Issue Aug 28, 2012 · 59 comments

framlin commented Aug 28, 2012

I get a trace that tells me I should increase the listener limit.
How can I call emitter.setMaxListeners() for request, or how can I avoid the problem?

mikeal (Member) commented Aug 29, 2012

are you sure this is an issue with request?

EnvolveTech1 commented Sep 19, 2012

It happens when maxRedirects gets exceeded. Another trace here:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.EventEmitter.once (events.js:189:8)
at Request.init (/app/node_modules/request/main.js:334:8)
at ClientRequest.<anonymous> (/app/node_modules/request/main.js:589:12)
at ClientRequest.g (events.js:185:14)
at ClientRequest.EventEmitter.emit (events.js:88:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:402:27)

mikeal (Member) commented Sep 19, 2012

somewhere you've got a bug. this is a guard in core.

EnvolveTech1 commented Sep 20, 2012

I can force this to happen by using jar:false when redirects exceed the limit.

var request = require('request');

// This URL causes max redirect to be exceeded - check it out on http://www.rexswain.com/httpview.html
var testUrl = 'http://www.nytimes.com/2012/09/19/us/politics/in-leaked-video-romney-says-middle-east-peace-process-likely-to-remain-unsolved-problem.html?hp';

request({uri: testUrl, jar: false}, function(error, response, body) {
    if (!error && response.statusCode == 200) {
        console.log(body);    
    }
})

If I run the code above I get this trace:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.EventEmitter.once (events.js:189:8)
at Request.init (C:\Users\Test\Documents\node_modules\request\main.js:334:8)
at ClientRequest.<anonymous> (C:\Users\Test\Documents\node_modules\request\main.js:589:12)
at ClientRequest.g (events.js:185:14)
at ClientRequest.EventEmitter.emit (events.js:88:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:402:27)
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.start (C:\Users\Test\Documents\node_modules\request\main.js:687:8)
at Request.end (C:\Users\Test\Documents\node_modules\request\main.js:927:28)
at Request.init (C:\Users\Test\Documents\node_modules\request\main.js:381:12)
at process.startup.processNextTick.process._tickCallback (node.js:244:9)

If I take out jar:false or use a custom jar it works fine!
I'm using node v0.8.6 and request v2.11.4
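
For reference, a minimal sketch of the custom-jar workaround described above, reusing testUrl from the earlier snippet (request.jar() is request's own cookie-jar constructor):

var request = require('request');

// Using an explicit cookie jar (rather than jar: false) sidesteps the
// redirect loop in this repro.
var jar = request.jar();
request({uri: testUrl, jar: jar}, function(error, response, body) {
    if (!error && response.statusCode == 200) {
        console.log(body);
    }
});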

j0ni commented Oct 10, 2012

I'm seeing this too, on Heroku but not locally. The trace is slightly different.

2012-10-10T04:54:42+00:00 app[runner.1]: (node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
2012-10-10T04:54:42+00:00 app[runner.1]: Trace
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.EventEmitter.addListener (events.js:176:15)
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.EventEmitter.once (events.js:197:8)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.<anonymous> (/app/node_modules/request/main.js:518:27)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.g (events.js:193:14)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.EventEmitter.emit (events.js:93:17)
2012-10-10T04:54:42+00:00 app[runner.1]:     at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1461:7)
2012-10-10T04:54:42+00:00 app[runner.1]:     at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.socketOnData [as ondata] (http.js:1366:20)
2012-10-10T04:54:42+00:00 app[runner.1]:     at TCP.onread (net.js:403:27)

This error is pretty unhelpful :(

This is with node v0.8.11 and request v2.11.4.

mikeal (Member) commented Oct 10, 2012

this is strange.

that line adds an error listener to the connection object if there isn't one already added, which means we aren't adding a listener multiple times.

on this line:

https://github.com/mikeal/request/blob/master/main.js#L688

we remove that listener, but we check for it on the req instead of the response; they should be the same object, but it's a strange inconsistency.

we don't add the listener until the response callback, which means when there are many request instances we shouldn't be stacking up error listeners on pending connection objects.

my guess is that the leak is actually somewhere else: many error listeners are getting attached to the connection object, which is pooled and only gets destroyed if there are no pending requests, and this happens to be the next line to add a listener. basically, we need to find out what the other 10 listeners are.

mikeal (Member) commented Oct 10, 2012

@donkeir your issue is different. yours has to do with many, many redirects and the fact that we don't cleanup the pipe listener.

EnvolveTech1 commented Oct 10, 2012

Yep, thanks Mikeal. I'd had a quick look but couldn't see any obvious cause. The multiple redirects / jar issue isn't a big problem for me, as I'm using a custom jar, which stops it from happening.

j0ni commented Oct 10, 2012

Thanks @mikeal - so I have managed to make this go away, though I don't understand why.

A URL which was causing this to happen on Heroku is:

http://www.weather.gov/forecasts/xml/sample_products/browser_interface/ndfdBrowserClientByDay.php?lat=33.752886&lon=-116.055617&format=24+hourly&numDays=4&startDate=2012-10-10T14:12:36.205Z

This actually returns a permanent redirect, and when I changed the URL to match the new form the errors went away. No redirects, no issue I guess.

j0ni commented Oct 10, 2012

BTW @mikeal, if you're still interested in getting to the bottom of this, can you tell me how to inspect the collection of listeners attached to the connection object from outside the module? I don't think I can instrument the module itself on Heroku (which is where this is happening).

mikeal (Member) commented Oct 10, 2012

i would definitely like to know. you can get an array of listeners by calling connection.listeners('error').
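
A minimal sketch of that inspection, assuming the connection is reachable from outside the module via response.socket (standard on Node's http.IncomingMessage; the URL is a placeholder):

var request = require('request');

request('http://example.com/', function(error, response, body) {
    if (error) return console.error(error);
    // The pooled connection underlying this response.
    var socket = response.socket;
    var listeners = socket.listeners('error');
    console.log('error listeners on this connection:', listeners.length);
    listeners.forEach(function(fn, i) {
        // Dump the start of each listener's source to identify who attached it.
        console.log(i, fn.toString().slice(0, 80));
    });
});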

jondot commented Nov 9, 2012

@mikeal and the rest, I think I've got this solved.
First, you can reproduce this easily against http://news.ycombinator.com

Using Wireshark I could see that the actual Host header is sent as lowercase host, which causes the web server on the other end to answer with a 301.

Once I set it to 'Host', everything works OK.

Will offer a pull request later on.

jondot added a commit to jondot/request that referenced this issue Nov 9, 2012

mikeal (Member) commented Nov 9, 2012

i'm not changing this default because one server is broken, and it is broken.

the node convention has been lowercase headers; when people check headers they assume they are lowercase (not the best practice, but it has to be acknowledged).

the real fix for this is to do the header check caseless, and not to add a new Host header if one is already set in uppercase. so if your server is broken you can set this header yourself and we'll use it.

also, is this redirect loop really never-ending? why isn't the redirect threshold getting hit?

jondot commented Nov 9, 2012

Looking at the RFC syntax, I assumed that in the specific case of Host it overrides the general notion of "header field names are case-insensitive", but if you're saying that's not so, then I understand.

At least this sheds some more light on the problem.

The loop is a never-ending one, yes (host: ... => redirect 301 => host: ... => redirect 301, and so on). The threshold indeed gets hit and stops it. Also, because this is a never-ending loop, it makes no sense to increase the threshold.

Thanks

chuyeow commented Nov 12, 2012

I came to the same conclusion as @jondot in #346.

I'd say it's worthwhile being "nice" and either:

  1. always send a capitalized Host header, or
  2. allow developers to set a capitalized Host header like so: request({ headers: { Host: 'www.example.com' } }) - right now this results in 2 Host header entries like so: headers: { host: 'www.example.com', Host: 'www.example.com' }

The problem with request is that it always sends a lowercase host header even if you explicitly set a capitalized Host header. That makes it unusable in cases of "bad" servers like news.ycombinator.com.
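
As a sketch, this is what option 2 would look like from the caller's side (at the time of this thread, request still added its own lowercase host header alongside it, which is exactly the bug being described):

var request = require('request');

request({
    url: 'http://news.ycombinator.com/',
    // Capitalized on purpose, for servers that match the header name case-sensitively.
    headers: { Host: 'news.ycombinator.com' }
}, function(error, response, body) {
    if (!error) console.log(response.statusCode);
});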

mikeal (Member) commented Nov 16, 2012

@chuyeow yes, i'd like a patch that checks whether the host header is already set as an option and maintains the casing. i don't want to change the default to "Host" though, because when other people check for that header in node they tend to do it in lowercase.

rodw commented Jan 25, 2013

@mikeal, I can put together a pull request for this, but before I do I wanted to confirm what you're looking for.

It looks like there is already logic in place to add a host header if and only if there isn't one already. That is, line 184 of main.js (version 2.12.0) reads:

if (!self.headers.host) { 

(and is followed by a block that computes and adds the appropriate host header.)

Is it sufficient to add a check for Host as well as host here, i.e.,

if (!self.headers.host && !self.headers.Host) { 

or are you looking for something more robust/flexible/complicated?

Looking at the source a little more closely, I note that you're already doing something similar (checking for both down-case and Title-Case headers) in a few other places (e.g., with Content-Length at lines 293 and 513, Content-Type at lines 340 and 350).

Frankly this is really working around a bug (or at least uncommonly inflexible interpretation of the HTTP spec, although section 4.2 of RFC-2616 seems to be saying header names are case insensitive) on the server side and while there may be others, news.ycombinator.com is the only example I know of in the wild. But this is a relatively unobtrusive work-around (and news.yc is a relatively popular site: Alexa puts it among the top 2000 in the US, top 3000 world-wide).

(A consistent but substantially more complicated approach might be for node-request to treat all headers as case-insensitive and preserve the case of any headers set by the caller. But given that the header names are used as keys in a map, that would be a substantial change. You'd want something semantically equivalent to headers[/host/i] (rather than headers['host']). It's certainly doable but would require a lot more change to the existing code.)
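
A standalone sketch of that caseless lookup, illustrative only (request eventually adopted a similar idea via the caseless module):

// Return the value of a header regardless of the casing of its name.
function getHeaderCaseless(headers, name) {
    var lower = name.toLowerCase();
    var keys = Object.keys(headers);
    for (var i = 0; i < keys.length; i++) {
        if (keys[i].toLowerCase() === lower) return headers[keys[i]];
    }
    return undefined;
}

// Usage: only compute and add a host header when none is present in any casing:
// if (getHeaderCaseless(self.headers, 'host') === undefined) { ... }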

rodw commented Jan 25, 2013

Also, independent of the Host vs. host issue, has this actually uncovered a minor listener leak? The node.js error reports that we've added more than 10 listeners to the same EventEmitter instance (I think), in this case the HTTP response (I think). I suspect one listener is being added each time we follow a redirect. Should that be happening? Should some existing listener be removed when the request is being re-initialized around lines 554-596?

shemigon commented Mar 26, 2013

Keep getting the error.

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (/Users/boris/projects/clutchretail/bta/pricetracker/node_modules/request/index.js:376:8)
at Request.onResponse (/Users/boris/projects/clutchretail/bta/pricetracker/node_modules/request/index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:404:27)

This is the latest npm version.

garbados commented Apr 3, 2013

Also getting this error. Stacktrace:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (/Users/deus/code/twitter-contentify/node_modules/request/index.js:376:8)
at Request.onResponse (/Users/deus/code/twitter-contentify/node_modules/request/index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1527:7)
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
at Socket.socketOnData [as ondata] (http.js:1430:20)
at TCP.onread (net.js:404:27)

jbrumwell commented Apr 5, 2013

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:376:8)
at Request.onResponse (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:404:27)

evandrix commented Apr 10, 2013

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:376:8)
at Request.onResponse (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:404:27)

jonstorer commented Apr 13, 2013

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at EventEmitter.addListener (events.js:160:15)
    at EventEmitter.Parser.setup (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/html5/lib/html5/parser.js:2459:17)
    at EventEmitter.Parser.parse_fragment (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/html5/lib/html5/parser.js:2399:8)
    at HtmlToDom.appendHtmlToElement (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/browser/htmltodom.js:95:13)
    at Object.innerHTML (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/browser/index.js:460:17)
    at /javascripts/jquery.min.js:4:6719
    at at (/javascripts/jquery.min.js:4:5136)
    at st.setDocument (/javascripts/jquery.min.js:4:6686)
    at at (/javascripts/jquery.min.js:4:20427)
    at /javascripts/jquery.min.js:4:20587
    at /javascripts/jquery.min.js:5:28399
    at Contextify.sandbox.run (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/node_modules/contextify/lib/contextify.js:12:24)
    at window._evaluate (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/windows.js:271:25)
    at Object.HTML.languageProcessors.javascript (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/scripts.js:20:21)
    at Object.define.proto._eval (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:1295:47)
    at Object.loaded (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/jsdom_patches.js:134:27)
    at /Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:51:20
    at Object.item.check (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:280:11)
    at /Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:298:12
    at /Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/resources.js:147:16
    at Request._callback (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/resources.js:335:16)
    at Request.self.callback (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/request/main.js:120:22)
    at Request.EventEmitter.emit (events.js:98:17)

lbertenasco commented Apr 30, 2013

BUMP

Does anyone know how to fix this problem?
I've been dealing with this for some days, and the only solution I found was to search and replace all the
request."whatever" or request(url)
calls with
request({ uri: url , headers: { "User-Agent": "Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410."}}, func...

I really hope it helps. Do a find-in-folder with Sublime or any other editor; it's quite easy, and you can modify the options as you want. It works like a charm.
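
A runnable sketch of that workaround (the URL is a placeholder, and the truncated User-Agent string is completed to a plausible Chrome value): spoofing a browser User-Agent so the server stops answering with redirects:

var request = require('request');

var url = 'http://example.com/some-page';
request({
    uri: url,
    headers: {
        // Pretend to be a desktop Chrome browser.
        'User-Agent': 'Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410.0 Safari/537.31'
    }
}, function(error, response, body) {
    if (!error && response.statusCode === 200) {
        console.log(body);
    }
});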

rodw commented Apr 30, 2013

@lbertenasco: The "too many listeners" error message is a bit of a red herring. The underlying issue is that you are stuck in a redirect loop.

That is, the server doesn't like something about your request, and is responding with a redirect message (generally an HTTP 301 or 302 status). Your client resubmits the request, but didn't change the thing that the server didn't like, and so receives another redirect response. After 10 cycles through that loop, you hit the "too many listeners" error.

One could bump up the number of listeners that are allowed (as suggested in the error message), but that will only delay the inevitable. After N cycles through the loop, you will hit the "too many listeners" error, for any value of N.
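
For concreteness, here is what bumping the limit looks like (the URL is hypothetical; Request instances are EventEmitters, so the standard call applies). Again, this only postpones the failure:

var request = require('request');

var req = request('http://example.com/redirect-loop');
// Silences the warning until 50 listeners accumulate; it does not fix the loop.
req.setMaxListeners(50);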

One could (and I'll assert, should) change the code of the request module so that it no longer adds a new listener every time a redirect is followed, but you'll just be replacing one error condition with another. You'll either hit some other natural limit (e.g., the depth of the calling stack) or your code will "hang" in an endless loop (just like 10 GOTO 10 would in Basic).

Ultimately what you need to do is address the redirect loop.

In your case, it looks like the server you are hitting is sending a 301 or 302 response because it doesn't recognize the web browser (user-agent) you're using. Explicitly setting the user-agent header to look like Google Chrome fixes this problem.

Without seeing the actual HTTP response, it is hard to tell if this is an error on the server-side or the client-side.

For instance, if the server is responding with a 302 message that redirects the client to the same URL, but expects to receive a user-agent string the second time, this is a mistake on the server's part.

Alternatively, if the server is responding with a 302 message that redirects the client to some sort of "your browser is not supported" page, but the client (the request module) is resubmitting the request to the original URL, then the bug is on the client's side.

Similarly, in the HackerNews example cited above, the server is looking for a Host header but request is only sending a host header. This causes a redirect loop unless steps are taken to manually add a Host, rather than host, header. (In this case, I believe the news.ycombinator.com server is in error--it should treat header names as case-insensitive--but I presume it is easier to add a little work-around on the client side than to convince Paul Graham to update the news.yc server.)

TL/DR: The request library could handle this a bit better, but at the end of the day you'll need to figure out why you've encountered a redirect loop, and find a way to work around it.

nickminutello commented Jul 3, 2013

+1

We are seeing this same error - but strangely only in 1 of our 4 environments, and not all the time.
We are also seeing some as-yet-unexplained performance problems in this environment - not sure yet which is the cause and which is the symptom...

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:175:15)
    at Socket.EventEmitter.once (events.js:196:8)
    at ClientRequest.<anonymous> (/pathtoourapp/2.2.58.2/server/node_modules/request/main.js:561:27)
    at ClientRequest.g (events.js:192:14)
    at ClientRequest.EventEmitter.emit (events.js:96:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1582:7)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
    at Socket.socketOnData [as ondata] (http.js:1485:20)
    at TCP.onread (net.js:404:27)

nickminutello commented Jul 3, 2013

Looking a bit closer, we don't appear to have any problem with retry loops.
What I do see is that the number of listeners for a given connection can get into the hundreds - all handling different (GET) requests. Our system does get a bit busy making HTTP requests - the queued requests can get into the 70s - but this behaviour is not what I was expecting...

dutchakdev commented May 1, 2015

+1
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace:
at Request.EventEmitter.addListener (events.js:160:15)
at Request.init (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/request.js:667:8)
at Redirect.onResponse (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/lib/redirect.js:149:11)
at Request.onRequestResponse (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/request.js:1108:22)
at ClientRequest.EventEmitter.emit (events.js:95:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:527:27)

afshinm commented May 11, 2015

Same issue:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.addListener (events.js:160:15)
at Request.init (/var/www/it/src/node_modules/request/request.js:667:8)
at Redirect.onResponse (/var/www/it/src/node_modules/request/lib/redirect.js:149:11)
at Request.onRequestResponse (/var/www/it/src/node_modules/request/request.js:1108:22)
at ClientRequest.emit (events.js:95:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:527:27)

jamlen commented May 21, 2015

👍 and I'm using v0.12.2

elliotbonneville commented May 28, 2015

Having the same issue.

@robbyoconnor referenced this issue May 30, 2015

Closed

ID-105 #5

robbyoconnor added a commit to openmrs/openmrs-contrib-id-sso that referenced this issue May 30, 2015

robbyoconnor added a commit to openmrs/openmrs-contrib-id-sso that referenced this issue May 30, 2015

shotlom commented Jun 11, 2015

I'm getting this issue also when using request with .pipe()

happy to increase the number of listeners, but can I do that if I'm using pipes?

shotlom commented Jun 16, 2015

@mikeal 2 years ago you mentioned, "we don't cleanup the pipe listener". Do you have an example of how I can clean up the pipe listener manually?

I'm piping the return to a number of data cleansers and tasks, but as soon as I hit 11 request.get() calls my script falls over.

// someArray is an array of URLs
var someArray = ['http://example.com/a', 'http://example.com/b'];

// (someCleansingFunction and someSavingFunction stand in for transform/writable streams)
for (var i = 0; i < someArray.length; i++) {
    request.get(someArray[i])
        .pipe(someCleansingFunction)
        .pipe(someSavingFunction);
    // how do I clean up the pipe listener now that I've finished cleansing & saving?
}

juliankrispel added a commit to juliankrispel/static-proxy that referenced this issue Aug 3, 2015

jefersonbentes commented Aug 25, 2015

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at PoolConnection.EventEmitter.addListener (events.js:160:15)
at /home/jeferson/workspacejs/syncserver/src/prod/dbconnection.js:32:20
at Ping.onOperationComplete [as _callback]
at Ping.Sequence.end (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/sequences/Sequence.js:96:24)
at Ping.Sequence.OkPacket (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/sequences/Sequence.js:105:8)
at Protocol._parsePacket (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/Protocol.js:274:23)
at Parser.write (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/Parser.js:77:12)
at Protocol.write (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/protocol/Protocol.js:39:16)
at Socket.<anonymous> (/home/jeferson/workspacejs/syncserver/node_modules/mysql/lib/Connection.js:96:28)
at Socket.EventEmitter.emit (events.js:95:17)

RathaR commented Aug 25, 2015

+1, I have the same problem with pipe...

travischoma commented Aug 31, 2015

+1, happens often with nytimes.com links for me. Is there a workaround for this bug?

Aug 31 11:14:41 x app/web.1: (node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Aug 31 11:14:41 x app/web.1: Trace
Aug 31 11:14:41 x app/web.1: at Request.addListener (events.js:160:15)
Aug 31 11:14:41 x app/web.1: at Request.start (/app/node_modules/request/request.js:947:8)
Aug 31 11:14:41 x app/web.1: at Request.end (/app/node_modules/request/request.js:1732:10)
Aug 31 11:14:41 x app/web.1: at end (/app/node_modules/request/request.js:704:14)
Aug 31 11:14:41 x app/web.1: at Object._onImmediate (/app/node_modules/request/request.js:718:7)
Aug 31 11:14:41 x app/web.1: at processImmediate [as _immediateCallback]
Aug 31 11:14:41 x app/web.1: [Error: Exceeded maxRedirects. Probably stuck in a redirect loop http://www.nytimes.com/2015/06/28/magazine/confessions-of-a-seduction-addict.html?_r=4]

ahildoer commented Sep 25, 2015

Hi @mikeal,

This still happens with just basic GET requests; no pipes, no magic fairy dust, etc.

Regardless of what server we talk to, and whatever issues that server has, it should not cause a problem in our client code. If we connect to a server and it streams back random binary, and nothing of the HTTP protocol at all, I would expect the err argument to be populated. We should not be kicking off error messages and warnings. Also, ignoring the leak with maxListeners(0) is not a solution; it's pretending the problem doesn't exist while our servers slowly crash.

What about this: what if we just set maxListeners(maxRedirects + 1)? It seems as though everyone reporting this issue thinks it's a redirect-related problem. And if I set maxRedirects to < 10, it never happens; if it's left at the default of 10, it always happens. Just a thought.
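
A sketch of that suggestion applied from the caller's side (the URL and values are assumptions): size the listener limit to the redirect budget, so legitimate redirect chains never trip the warning:

var request = require('request');

var maxRedirects = 10;
var req = request({uri: 'http://example.com/', maxRedirects: maxRedirects},
    function(error, response, body) {
        if (error) return console.error(error);
        console.log(response.statusCode);
    });
// One listener can be added per redirect hop, hence the +1 headroom.
req.setMaxListeners(maxRedirects + 1);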

aggied commented Oct 4, 2015

+1 @travischoma, same here. @ahildoer, agreed. We really should be able to find a way to talk to almost every server out there.

lawrips commented Oct 11, 2015

Not sure if it's the same issue that everyone else is facing here, but this state is reproducible (at least on Node 0.12.6) by:

  1. Setting http.globalAgent.maxSockets = 1 (or a similarly low number)
  2. Making lots of requests to the same server

What I see is 1 socket connection (or however many you set in step 1), and then, as requests get queued, this exception eventually gets triggered.

My case is somewhat artificial since I'm setting maxSockets to be low. But I believe that pre-0.12 the default was 5, so in theory this error could occur whenever there are significantly more requests than available sockets?
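
A repro sketch following that recipe (the URL is a placeholder, and whether request honors http.globalAgent depends on its pool settings, so treat this as an approximation): with a single socket available, queued requests can stack listeners on it.

var http = require('http');
var request = require('request');

// Step 1: throttle to a single socket.
http.globalAgent.maxSockets = 1;

// Step 2: fire many requests at the same server so they queue up.
for (var i = 0; i < 20; i++) {
    request('http://example.com/', function(error, response) {
        if (!error) console.log(response.statusCode);
    });
}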

Psopho commented Nov 3, 2015

I've overcome this by setting maxRedirects: 5, which raises an error correctly and doesn't crash and burn.
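
A sketch of that option (the URL is a placeholder): with a lower redirect ceiling, a loop surfaces as an "Exceeded maxRedirects" error in the callback instead of piling up listeners:

var request = require('request');

request({uri: 'http://example.com/redirect-loop', maxRedirects: 5},
    function(error, response, body) {
        // On a loop: "Exceeded maxRedirects. Probably stuck in a redirect loop ..."
        if (error) return console.error(error.message);
        console.log(response.statusCode);
    });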

anmolgupta commented Jan 19, 2016

I have this same issue when trying to implement SSE (server-sent events) with Node.js.

(node) warning: possible EventEmitter memory leak detected. 11 notify listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at EventEmitter.addListener (events.js:179:15)

nanvel commented Sep 2, 2016

removeAllListeners('data') looks like it fixed the issue for me.
Be careful: I am not sure it has no side effects.

function getPage(pageURL) {
  return new Promise(function(resolve, reject) {
    let chunks = []
    // `iconv` is a decoding transform stream defined elsewhere (not shown here).
    let reqPipe = request({
      uri: pageURL,
      method: 'GET'
    }).pipe(iconv)

    reqPipe.on('data', function(chunk) {
      chunks.push(chunk)
    }).once('end', function() {
      var str = Buffer.concat(chunks).toString()
      reqPipe.removeAllListeners('data')
      resolve(str)
    })
  })
}

justechn commented Sep 12, 2016

I am doing an npm install with a local repository and getting a lot of these warnings. What is the solution?

npm install -g git+http://stash/scm/ss/generator-lambda.git

(node) warning: possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at TLSSocket.addListener (events.js:239:17)
    at TLSSocket.Readable.on (_stream_readable.js:680:33)
    at Request.<anonymous> (C:\Users\rmclaughlin\AppData\Roaming\npm\node_modules\npm\node_modules\npm-registry-client\lib\request.js:153:7)
    at emitOne (events.js:77:13)
    at Request.emit (events.js:169:7)
    at ClientRequest.<anonymous> (C:\Users\rmclaughlin\AppData\Roaming\npm\node_modules\npm\node_modules\request\request.js:791:10)
    at emitOne (events.js:82:20)
    at ClientRequest.emit (events.js:169:7)
    at tickOnSocket (_http_client.js:523:7)
    at onSocketNT (_http_client.js:535:5)

I am using npm ver 3.10.7

fishcharlie commented Mar 11, 2017

I'm having this same problem with nytimes.com. Was anyone able to fix this?

smelted-code commented Jun 1, 2017

@fishcharlie Same issue here with NYT. Lowering the number of redirects (like @Psopho and @chmac suggest) seems like a viable workaround.

shiftie commented Sep 5, 2017

Adding the {maxRedirects: 5} option to the function call prevents the memory-leak issue and instead populates the err argument in the callback function.

ttback commented Feb 2, 2018

Didn't know this was still relevant issue in 2017. I made a library 4 years ago to try to get this working with request. You can try it and see if it helps.

Zearin (Contributor) commented Feb 2, 2018

Didn't know this was still relevant issue in 2017

It’s 2018 😋

xPaw added a commit to thelounge/thelounge that referenced this issue Mar 7, 2018

Remove setMaxListeners
Reverts 2cee0ea as this no longer causes the EventEmitter warning due to `maxRedirects` being set to 5 on our end.

Ref: request/request#311 (comment)