EventEmitter memory leak and Parse Error #622

steffenmandrup opened this Issue Aug 11, 2013 · 3 comments

3 participants


I'm making a lot of requests (around 29,000) because I want to parse a lot of product sites.
I send in URLs like this:


But over time I get a single Parse Error, followed by a lot of these:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added.
Use emitter.setMaxListeners() to increase limit.
    at Socket.EventEmitter.addListener (events.js:160:15)
    at Socket.Readable.on (_stream_readable.js:679:33)
    at Socket.EventEmitter.once (events.js:179:8)
    at Request.onResponse (C:\Vuuh\trunk\feedserver\node_modules\request\index.j
    at ClientRequest.g (events.js:175:14)
    at ClientRequest.EventEmitter.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1669:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete](http.js:120:23
    at Socket.socketOnData [as ondata] (http.js:1564:20)
    at TCP.onread (net.js:525:27)

This continues until it breaks with an "out of memory" error.

The code:

            var request = require('request');

            var string = product.Links.Direct_Link;
            request({
                'uri': string,
                'jar': false,
                'followRedirects': false,
                'maxSockets': 15,
                'headers': { 'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:18.0) Gecko/20100101 Firefox/18.0' }
            }, getSizes);
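One common mitigation for symptoms like this (separate from whatever #623 changes) is to cap how many requests are in flight at once instead of issuing all ~29,000 in a tight loop. A minimal sketch of such a limiter; the `runLimited` helper and the `urls`/`getSizes` names in the usage comment are illustrative assumptions, not part of the original code:

```javascript
// Run an array of async tasks with at most `limit` running concurrently.
// Each task is a function (cb) that calls cb() exactly once when done.
function runLimited(tasks, limit, done) {
  var next = 0;      // index of the next task to start
  var active = 0;    // tasks currently running
  var finished = 0;  // tasks completed

  function launch() {
    while (active < limit && next < tasks.length) {
      active++;
      tasks[next++](function () {
        active--;
        finished++;
        if (finished === tasks.length) return done();
        launch(); // refill the pool as tasks complete
      });
    }
  }

  if (tasks.length === 0) return done();
  launch();
}

// Illustrative usage with the request call from above:
// var tasks = urls.map(function (uri) {
//   return function (cb) {
//     request({ uri: uri, jar: false }, function (err, res, body) {
//       getSizes(err, res, body);
//       cb();
//     });
//   };
// });
// runLimited(tasks, 15, function () { console.log('all done'); });
```

Keeping concurrency bounded also keeps the number of live sockets (and the listeners attached to them) bounded, which tends to avoid both the EventEmitter warning and the eventual out-of-memory crash.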

All the URIs are 100% correct and all the pages load fine.

Is it something to do with the number of requests?

I've got the same script running against another website with around 12,100 pages with no problems, so I am blank...


You might want to test this #623


Fixes my issue :) - Thanks for finding a quick solution


Seems it can run now, but after doing around 80% of the pages it starts to "hang". I'll be investigating a bit (tomorrow), but while you are working on stuff, you might find the issue :)

@mikeal mikeal closed this Aug 28, 2014