How to avoid "(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit." #311

Closed
framlin opened this Issue · 40 comments
@framlin

I get a trace telling me that I should increase the listener limit.
How can I call emitter.setMaxListeners() for request, or how can I avoid the problem?

@mikeal
Owner

are you sure this is an issue with request?

@donkeir

It happens when max-redirects gets exceeded. Another trace here:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.EventEmitter.once (events.js:189:8)
at Request.init (/app/node_modules/request/main.js:334:8)
at ClientRequest.<anonymous> (/app/node_modules/request/main.js:589:12)
at ClientRequest.g (events.js:185:14)
at ClientRequest.EventEmitter.emit (events.js:88:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:402:27)

@mikeal
Owner

somewhere you've got a bug. this is a guard in core.

@donkeir

I can force this to happen by using jar:false when redirects exceed the limit.

var request = require('request');

// This URL causes max redirects to be exceeded - check it out on http://www.rexswain.com/httpview.html
var testUrl = 'http://www.nytimes.com/2012/09/19/us/politics/in-leaked-video-romney-says-middle-east-peace-process-likely-to-remain-unsolved-problem.html?hp';

request({uri: testUrl, jar: false}, function(error, response, body) {
    if (!error && response.statusCode == 200) {
        console.log(body);
    }
});

If I run the code above I get this trace:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.EventEmitter.once (events.js:189:8)
at Request.init (C:\Users\Test\Documents\node_modules\request\main.js:334:8)
at ClientRequest.<anonymous> (C:\Users\Test\Documents\node_modules\request\main.js:589:12)
at ClientRequest.g (events.js:185:14)
at ClientRequest.EventEmitter.emit (events.js:88:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:402:27)
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:168:15)
at Request.start (C:\Users\Test\Documents\node_modules\request\main.js:687:8)
at Request.end (C:\Users\Test\Documents\node_modules\request\main.js:927:28)
at Request.init (C:\Users\Test\Documents\node_modules\request\main.js:381:12)
at process.startup.processNextTick.process._tickCallback (node.js:244:9)

If I take out jar:false or use a custom jar it works fine!
I'm using node v0.8.6 and request v2.11.4

@j0ni

I'm seeing this too, on Heroku but not locally. The trace is slightly different.

2012-10-10T04:54:42+00:00 app[runner.1]: (node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
2012-10-10T04:54:42+00:00 app[runner.1]: Trace
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.EventEmitter.addListener (events.js:176:15)
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.EventEmitter.once (events.js:197:8)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.<anonymous> (/app/node_modules/request/main.js:518:27)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.g (events.js:193:14)
2012-10-10T04:54:42+00:00 app[runner.1]:     at ClientRequest.EventEmitter.emit (events.js:93:17)
2012-10-10T04:54:42+00:00 app[runner.1]:     at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1461:7)
2012-10-10T04:54:42+00:00 app[runner.1]:     at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
2012-10-10T04:54:42+00:00 app[runner.1]:     at Socket.socketOnData [as ondata] (http.js:1366:20)
2012-10-10T04:54:42+00:00 app[runner.1]:     at TCP.onread (net.js:403:27)

This error is pretty unhelpful :(

This is using node v0.8.11 and request v2.11.4.

@mikeal
Owner

this is strange.

that line adds an error listener to the connection object if there isn't one already added, which means we aren't adding a listener multiple times.

on this line:

https://github.com/mikeal/request/blob/master/main.js#L688

we remove that listener, but we check for it on the req instead of the response, which should be the same object but it's a strange inconsistency.

we don't add the listener until the response callback, which means when there are many request instances we shouldn't be stacking up error listeners on pending connection objects.

my guess is that the leak is actually somewhere else: that many error listeners are getting attached to the connection object, which is pooled and only gets destroyed if there are no pending requests, and that this happens to be the next line to add a listener. basically, we need to find out what the other 10 listeners are.

@mikeal
Owner

@donkeir your issue is different. yours has to do with many, many redirects and the fact that we don't clean up the pipe listener.

@donkeir

Yep, thanks Mikeal. I'd had a quick look but couldn't see any obvious cause. The multiple redirects / jar issue isn't a big problem for me as I'm using a custom jar which stops it happening.

@j0ni

Thanks @mikeal - so I have managed to make this go away, though I don't understand why.

A URL which was causing this to happen on Heroku is:

http://www.weather.gov/forecasts/xml/sample_products/browser_interface/ndfdBrowserClientByDay.php?lat=33.752886&lon=-116.055617&format=24+hourly&numDays=4&startDate=2012-10-10T14:12:36.205Z

This actually returns a permanent redirect, and when I changed the URL to match the new form the errors went away. No redirects, no issue I guess.

@j0ni

BTW @mikeal, if you're still interested in getting to the bottom of this, can you tell me how to inspect the collection of listeners attached to the connection object from outside the module? I don't think I can instrument the module itself on Heroku (which is where this is happening).

@mikeal
Owner

i would definitely like to know. you can get an array of listeners by doing connection.listeners('error')

@jondot

@mikeal and the rest, I think I've got this solved.
First, you can reproduce this easily against http://news.ycombinator.com

Using Wireshark I could see that the actual host header is sent lowercase (host), which causes the web server on the other end to answer with a 301.

Once I set it to 'Host' everything works OK.

Will offer a pull request later on.
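
A hypothetical sketch of that work-around as it would look from calling code (whether request actually preserves this casing depends on the version, which the later comments debate; nothing here is guaranteed by request's API):

```javascript
// Sketch only: pass the Host header with the capitalization the
// (arguably broken) server expects.
var options = {
  url: 'http://news.ycombinator.com/',
  // capitalized on purpose; the server 301s when it sees lowercase `host`
  headers: { Host: 'news.ycombinator.com' }
};

// request(options, function (error, response, body) { /* ... */ });
console.log(options.headers.Host); // prints news.ycombinator.com
```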

@jondot jondot referenced this issue from a commit in jondot/request
@jondot jondot fixing #311 5204de3
@mikeal
Owner

i'm not changing this default because one server is broken, and it is broken.

the node convention has been lowercase headers; when people check headers they assume they are lowercase (not the best practice but it has to be acknowledged)

the real fix for this is to do the header check caseless, and not to add a new Host header if one is already in uppercase, so if your server is broken you can set this header yourself and we'll use it.

also, is this redirect loop really never-ending? why isn't the redirect threshold getting hit?

@jondot

Looking at the RFC syntax I assumed that in the specific case of Host it overrides the general notion of "header field names are case-insensitive", but if you're saying it's not so, then I understand.

At least this sheds some more light on the problem.

The loop is a never-ending one, yes (host: .. => redirect 301 => host: .. => redirect 301 and so on). The threshold indeed gets hit and stops it. Also, because this is a never-ending loop, it makes no sense to increase the threshold.

Thanks

@chuyeow

I came to the same conclusion as @jondot in #346.

I'd say it's worthwhile being "nice" and either:

  1. always send a capitalized Host header, or
  2. allow developers to set a capitalized Host header like so: request({ headers: { Host: 'www.example.com' } }) - right now this results in 2 Host header entries like so: headers: { host: 'www.example.com', Host: 'www.example.com' }

The problem with request is that it always sends a lowercase host header even if you explicitly set a capitalized Host header. That makes it unusable in cases of "bad" servers like news.ycombinator.com.

@mikeal
Owner

@chuyeow yes, i'd like a patch that checks if the host header is already sent as an option and maintains the casing. i don't want to change the default to "Host" though because when other people check for that header in node they tend to do it in lowercase.

@rodw

@mikeal, I can put together a pull request for this, but before I do I wanted to confirm what you're looking for.

It looks like there is already logic in place to add a host header if and only if there isn't one already. That is, line 184 of main.js (version 2.12.0) reads:

if (!self.headers.host) { 

(and is followed by a block that computes and adds the appropriate host header.)

Is it sufficient to add a check for Host as well as host here, i.e.,

if (!self.headers.host && !self.headers.Host) { 

or are you looking for something more robust/flexible/complicated?

Looking at the source a little more closely, I note that you're already doing something similar (checking for both down-case and Title-Case headers) in a few other places (e.g., with Content-Length at lines 293 and 513, Content-Type at lines 340 and 350).

Frankly this is really working around a bug (or at least an uncommonly inflexible interpretation of the HTTP spec, although section 4.2 of RFC 2616 seems to say header names are case-insensitive) on the server side, and while there may be others, news.ycombinator.com is the only example I know of in the wild. But this is a relatively unobtrusive work-around (and news.yc is a relatively popular site: Alexa puts it among the top 2000 in the US, top 3000 world-wide).

(A consistent but substantially more complicated approach might be for node-request to treat all headers as case-insensitive and preserve the case of any headers set by the caller. But given that the header names are used as keys in a map, that would be a substantial change. You'd want something semantically equivalent to headers[/host/i] (rather than headers['host']). It's certainly doable but would require a lot more change to the existing code.)
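
A sketch of the caseless lookup described above (`caselessGet` is a hypothetical helper for illustration, not part of request's code):

```javascript
// Sketch: scan the header map for a name that matches case-insensitively,
// rather than indexing by exact key as headers['host'] does.
function caselessGet(headers, name) {
  var lower = name.toLowerCase();
  for (var key in headers) {
    if (key.toLowerCase() === lower) return headers[key];
  }
  return undefined;
}

var headers = { Host: 'news.ycombinator.com', 'content-type': 'text/html' };
console.log(caselessGet(headers, 'host'));         // prints news.ycombinator.com
console.log(caselessGet(headers, 'Content-Type')); // prints text/html
```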

@rodw

Also, independent of the Host vs. host issue, has this actually uncovered a minor listener leak? The node.js error reports that we've added more than 10 listeners to the same EventEmitter instance (I think), in this case the HTTP response (I think). I suspect one listener is being added each time we follow a redirect. Should that be happening? Should some existing listener be removed when the request is being re-initialized around lines 554-596?

@shemigon

Keep getting the error.

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (/Users/boris/projects/clutchretail/bta/pricetracker/node_modules/request/index.js:376:8)
at Request.onResponse (/Users/boris/projects/clutchretail/bta/pricetracker/node_modules/request/index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:404:27)

This is the latest npm version.

@garbados

Also getting this error. Stacktrace:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (/Users/deus/code/twitter-contentify/node_modules/request/index.js:376:8)
at Request.onResponse (/Users/deus/code/twitter-contentify/node_modules/request/index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1527:7)
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
at Socket.socketOnData [as ondata] (http.js:1430:20)
at TCP.onread (net.js:404:27)

@jbrumwell

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:376:8)
at Request.onResponse (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:404:27)

@evandrix

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Request.EventEmitter.addListener (events.js:175:15)
at Request.EventEmitter.once (events.js:196:8)
at Request.init (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:376:8)
at Request.onResponse (D:\Server\htdocs\fmcsaCrawler\node_modules\request\index.js:780:10)
at ClientRequest.g (events.js:192:14)
at ClientRequest.EventEmitter.emit (events.js:96:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:404:27)
@jonstorer

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at EventEmitter.addListener (events.js:160:15)
    at EventEmitter.Parser.setup (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/html5/lib/html5/parser.js:2459:17)
    at EventEmitter.Parser.parse_fragment (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/html5/lib/html5/parser.js:2399:8)
    at HtmlToDom.appendHtmlToElement (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/browser/htmltodom.js:95:13)
    at Object.innerHTML (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/browser/index.js:460:17)
    at /javascripts/jquery.min.js:4:6719
    at at (/javascripts/jquery.min.js:4:5136)
    at st.setDocument (/javascripts/jquery.min.js:4:6686)
    at at (/javascripts/jquery.min.js:4:20427)
    at /javascripts/jquery.min.js:4:20587
    at /javascripts/jquery.min.js:5:28399
    at Contextify.sandbox.run (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/node_modules/contextify/lib/contextify.js:12:24)
    at window._evaluate (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/windows.js:271:25)
    at Object.HTML.languageProcessors.javascript (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/scripts.js:20:21)
    at Object.define.proto._eval (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:1295:47)
    at Object.loaded (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/jsdom_patches.js:134:27)
    at /Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:51:20
    at Object.item.check (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:280:11)
    at /Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/jsdom/lib/jsdom/level2/html.js:298:12
    at /Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/resources.js:147:16
    at Request._callback (/Users/jonathons/code/crowdtro/node_modules/zombie/lib/zombie/resources.js:335:16)
    at Request.self.callback (/Users/jonathons/code/crowdtro/node_modules/zombie/node_modules/request/main.js:120:22)
    at Request.EventEmitter.emit (events.js:98:17)
@lbertenasco

BUMP

Does anyone know how to fix this problem?
I've been dealing with this for some days and the only solution I found was to search and replace all the
request."whatever" or request(url)
with
request({ uri: url, headers: { "User-Agent": "Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410."}}, func...

I really hope this helps. Do a find-in-folder with Sublime or any other editor and it's quite easy; you can modify the options as you want. It works like a charm.

@rodw

@lbertenasco: The "too many listeners" error message is a bit of a red herring. The underlying issue is that you are stuck in a redirect loop.

That is, the server doesn't like something about your request, and is responding with a redirect message (generally an HTTP 301 or 302 status). Your client resubmits the request, but didn't change the thing that the server didn't like, and so receives another redirect response. After 10 cycles through that loop, you hit the "too many listeners" error.

One could bump up the number of listeners that are allowed (as suggested in the error message), but that will only delay the inevitable. After N cycles through the loop, you will hit the "too many listeners" error, for any value of N.

One could (and I'll assert, should) change the code of the request module so that it no longer adds a new listener every time a redirect is followed, but you'll just be replacing one error condition with another. You'll either hit some other natural limit (e.g., the depth of the calling stack) or your code will "hang" in an endless loop (just like 10 GOTO 10 would in Basic).

Ultimately what you need to do is address the redirect loop.

In your case, it looks like the server you are hitting is sending a 301 or 302 response because it doesn't recognize the web browser (user-agent) you're using. Explicitly setting the user-agent header to look like Google Chrome fixes this problem.

Without seeing the actual HTTP response, it is hard to tell if this is an error on the server-side or the client-side.

For instance, if the server is responding with a 302 message that redirects the client to the same URL, but expects to receive a user-agent string the second time, this is a mistake on the server's part.

Alternatively, if the server is responding with a 302 message that redirects the client to some sort of "your browser is not supported" page, but the client (the request module) is resubmitting the request to the original URL, then the bug is on the client's side.

Similarly, in the HackerNews example cited above, the server is looking for a Host header but request is only sending a host header. This causes a redirect loop unless steps are taken to manually add a Host, rather than host, header. (In this case, I believe the news.ycombinator.com server is in error--it should treat header names as case-insensitive--but I presume it is easier to add a little work-around on the client side than to convince Paul Graham to update the news.yc server.)

TL;DR: The request library could handle this a bit better, but at the end of the day you'll need to figure out why you've encountered a redirect loop, and find a way to work around it.

@nickminutello

+1

We are seeing this same error - but strangely only in 1 of our 4 environments, and not all the time.
We are also seeing some as-yet-unexplained performance problems in this environment - not sure yet which is the cause and which is the symptom...

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:175:15)
    at Socket.EventEmitter.once (events.js:196:8)
    at ClientRequest.<anonymous> (/pathtoourapp/2.2.58.2/server/node_modules/request/main.js:561:27)
    at ClientRequest.g (events.js:192:14)
    at ClientRequest.EventEmitter.emit (events.js:96:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1582:7)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:111:23)
    at Socket.socketOnData [as ondata] (http.js:1485:20)
    at TCP.onread (net.js:404:27)

@nickminutello

Looking a bit closer, we don't appear to have any problem with retry loops.
What I do see is that the number of listeners for a given connection can get into the hundreds - all handling different (GET) requests. Our system does get a bit busy making HTTP requests - the queued requests can get into the 70s - but this behaviour is not what I was expecting...

@Rowno

I don't know how request works internally, but I'm queuing up ~180 requests for a screen scraper and I'm getting this error. It looks like a lot of error event listeners are building up in request.js on line 625. I added console.log(response.connection.listeners('error').length); before this line and the number built up quickly, fluctuated a bit and peaked at 40.

Here's how I'm calling request (inside a loop):

request({
    url: url,
    timeout: 60000,
    jar: jar,
    headers: {
        'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_3) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410.65 Safari/537.31'
    }
}, function (error, response, body) {
});
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:160:15)
    at Socket.Readable.on (_stream_readable.js:689:33)
    at Socket.EventEmitter.once (events.js:179:8)
    at Request.onResponse (/Users/Roland.Warmerdam/projects/scraper/node_modules/request/request.js:626:25)
    at ClientRequest.g (events.js:175:14)
    at ClientRequest.EventEmitter.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1688:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:121:23)
    at Socket.socketOnData [as ondata] (http.js:1583:20)
    at TCP.onread (net.js:525:27)
@Zearin

:+1: to @mikeal 's earlier comment.

  • assume lowercase use of host by default
  • if user code uses capitalization Host:
    • maintain capitalization when making requests
    • when capitalized, agnostic code should still be able to ask for host and get the value for Host

Do I have that right?


Edit: Rephrased for clarity.

@mikeal
Owner

@Zearin that should be how things work now. a few months back a lot of caseless functions went in to handle that.

This memory leak crap should be fixed in node 0.12 with the http agent changes.

@seti123

Hi, I simply loop through 20-30 URLs to fetch data and get this error. I tried to set process.setMaxListeners(0). I also tried to create the request object in my loop. How do I get the right event emitter to increase the maximum (in my case maybe 30 requests - as node is async, listeners are added before the requests are actually made; if it were sync it would request each URL after the other)? I double-checked: as long as my loop has only 10 URLs the "request" module is happy and no errors are shown. But in my use case I can't use it...

As far as I read the documentation, the only thing you need to do is call setMaxListeners on your event emitter - or allow users to specify it, like
request(url, callback, 100) // if we loop through 100 urls ... OR provide such a function:
request.getEventEmitter().setMaxListeners(100)

@ottopoika

This actually seems to cause a memory leak. My script ran out of memory after 5 hours of runtime.

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:160:15)
    at Socket.Readable.on (_stream_readable.js:689:33)
    at Socket.EventEmitter.once (events.js:185:8)
    at Request.onResponse (D:\Dropbox\projects\domains2\node_modules\request\request.js:713:25)
    at ClientRequest.g (events.js:180:16)
    at ClientRequest.EventEmitter.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1688:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:121:23)
    at Socket.socketOnData [as ondata] (http.js:1583:20)
    at TCP.onread (net.js:527:27)
@ottopoika

Also found this thread in the net module: joyent/node#5108

@CentaurWarchief

+1

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Socket.EventEmitter.addListener (events.js:160:15)
    at Socket.Readable.on (_stream_readable.js:689:33)
    at Socket.EventEmitter.once (events.js:185:8)
    at Request.onResponse (../request/request.js:713:25)
    at ClientRequest.g (events.js:180:16)
    at ClientRequest.EventEmitter.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1688:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:121:23)
    at Socket.socketOnData [as ondata] (http.js:1583:20)
    at TCP.onread (net.js:527:27)
@watson watson referenced this issue from a commit in watson/request
@watson watson Silence EventEmitter memory leak warning #311
When performing a lot of concurrent requests on a keep-alive connection, a lot of error handlers can be added, which in turn will result in the following warning: "possible EventEmitter memory leak detected. 11 listeners added". This isn't really a memory leak and should just be silenced.
849c681
@chmac

I solved this issue by changing maxRedirects to 3 instead of 10.

Maybe if the default were 8 this error would not be thrown?

@cr1s cr1s referenced this issue in assaf/zombie
Closed

Maximum call stack size exceeded #512

@beamatronic

I am also getting this error on nytimes.com, and it appears to be only 3 or 4 redirects.

@scottksmith95

@beamatronic I too was seeing issues with nytimes.com. The problem has to do with cookies. You need to set cookies that are returned in the response and send them in subsequent requests. If you do not, you will get stuck in a redirect loop. Here is an example of how I did it:

var request = require('request');
var tough = require('tough-cookie');

module.exports = function (shortUrl, callback) {
  var jar = request.jar();

  var options = {
    method: 'GET',
    url: shortUrl,
    followAllRedirects: true,
    jar: jar
  };

  request(options, function (error, response) {
    if (error) {
      callback(error, null);
    } else {
      callback(null, response.request.href);
    }
  });
};
@mikeal
Owner

This bug only shows up today, as I understand it, when too many http clients are open in the pool, which is managed by Core and is a Core bug (and fixed in 0.11).

@mikeal mikeal closed this
@nylen nylen referenced this issue from a commit in nylen/request
@watson watson Silence EventEmitter memory leak warning #311
When performing a lot of concurrent requests on a keep-alive connection, a lot of error handlers can be added, which in turn will result in the following warning: "possible EventEmitter memory leak detected. 11 listeners added". This isn't really a memory leak and should just be silenced.
663df43
@Seojiwoong Seojiwoong referenced this issue in Seojiwoong/study
Closed

test #1

@krazylearner

same here

(node) warning: possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at WriteStream.addListener (events.js:179:15)
    at WriteStream.Readable.on (_stream_readable.js:671:33)
    at Request.Stream.pipe (stream.js:65:8)
    at Request.pipe (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\node_modules\request\request.js:1329:34)
    at Fio.post (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\api.js:9:12661)
    at onList (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\folders\folders-stub.js:9:759)
    at onList (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\test\listen.js:51:10)
    at Object.<anonymous> (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\test\listen.js:41:5)
    at Object.Conduit._targetStep.fn (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\node_modules\postal\node_modules\conduitjs\lib\conduit.js:40:43)
    at Object.next (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\node_modules\postal\node_modules\conduitjs\lib\conduit.js:70:33)
(node) warning: possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at WriteStream.addListener (events.js:179:15)
    at WriteStream.Readable.on (_stream_readable.js:671:33)
    at Request.Stream.pipe (stream.js:99:8)
    at Request.pipe (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\node_modules\request\request.js:1329:34)
    at Fio.post (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\api.js:9:12661)
    at onList (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\folders\folders-stub.js:9:759)
    at onList (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\test\listen.js:51:10)
    at Object.<anonymous> (C:\Users\abhilash\Desktop\CHARLES\gerrit_checkout\March_17_2015\nodejs.folders.io\test\listen.js:41:5)
    at Object.Conduit._targetStep.fn (C:\Users\abhilash\Desktop\CHARLES\gerrit_c
@dutchakdev

+1
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace:
at Request.EventEmitter.addListener (events.js:160:15)
at Request.init (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/request.js:667:8)
at Redirect.onResponse (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/lib/redirect.js:149:11)
at Request.onRequestResponse (/home/ubuntu/workspace/megatalog-crawler/node_modules/request/request.js:1108:22)
at ClientRequest.EventEmitter.emit (events.js:95:17)
at HTTPParser.parserOnIncomingClient [as onIncoming]
at HTTPParser.parserOnHeadersComplete [as onHeadersComplete]
at Socket.socketOnData [as ondata]
at TCP.onread (net.js:527:27)
