Memory leak from Node with request.setTimeout #389

Closed
sylvinus opened this Issue Dec 9, 2012 · 2 comments

sylvinus commented Dec 9, 2012

Hi,

There is a memory leak happening here ...
https://github.com/mikeal/request/blob/master/main.js#L682

... with older versions of Node. I can confirm it happens with Node 0.8.4 but not 0.8.14, so it was probably fixed by one of these:
joyent/node#3076
joyent/node#3068
joyent/node#3788

It's of course not a bug in request itself, but maybe you should disable this second timeout feature for older versions of Node (it's already unsupported before 0.6, isn't it?), because request users stuck on older versions of Node would otherwise have to either run a patched request or disable timeouts altogether.
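
A minimal sketch of such a version guard, assuming request keeps the outgoing http.ClientRequest as self.req (as in the snippet further down this thread) and the user's timeout as options.timeout; the names, placement, and error handling here are assumptions, not request's actual code:

// Sketch only: skip the extra socket-level timeout on Node versions
// where the leak is unfixed (anything below 0.8.14).
var v = process.version.replace(/^v/, '').split('.').map(Number)
var leakFixed = v[0] > 0 || v[1] > 8 || (v[1] === 8 && v[2] >= 14)

if (options.timeout && leakFixed) {
  self.req.setTimeout(options.timeout, function () {
    self.abort()
    self.emit('error', new Error('ETIMEDOUT'))
  })
}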

You can reproduce it by running npm test on node-crawler:
https://github.com/sylvinus/node-crawler/blob/master/test/units/leaks.js
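
Or, as a rough standalone check without the crawler test suite (the URL and counts below are placeholders, not part of that test): run something like this against any local HTTP server and watch the resident set. On an affected Node (e.g. 0.8.4) it keeps climbing; on 0.8.14+ it levels off.

var request = require('request')

// Issue requests one after another with a timeout set, logging RSS every 100.
var i = 0
;(function next () {
  request({ uri: 'http://localhost:8080/', timeout: 5000 }, function () {
    if (++i % 100 === 0) console.log(i + ' requests, RSS ' + process.memoryUsage().rss)
    if (i < 10000) next()
  })
})()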

Thanks!

@sylvinus referenced this issue in bda-research/node-crawler on Dec 9, 2012 (closed): Crawler getting really slow after ~1000 crawled pages #28

Contributor

yyfrankyy commented Jul 3, 2013

I just hit this error on 0.8 and 0.10 (not the Node bugs you mentioned, which should be fixed in 0.8.14+). I was using request with a default timeout while the network turned bad and lots of requests timed out; in my case I had about 1000 concurrent requests, at least 200 of which timed out.

var request = require('request')

// Every request gets a timeout: $TIMEOUT milliseconds if set, else 30 seconds.
exports.request = request.defaults({
  timeout: +(process.env.TIMEOUT) || 1000 * 30
})

So I added this code inside request, and that fixed it.

// Raise the socket's listener cap so the many timeout listeners added
// under high concurrency don't trip Node's default limit of 10.
self.req.once('socket', function (socket) {
  socket.setMaxListeners(1000 /* or more */)
})

Or just make fewer requests in parallel.

How about exposing an option for this specific situation?
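
For instance, something like this (purely a sketch: socketMaxListeners is a made-up option name, not part of request's API, just an illustration of what the knob could look like from the caller's side):

var request = require('request')

// Hypothetical option, not in request's actual API.
exports.request = request.defaults({
  timeout: +(process.env.TIMEOUT) || 1000 * 30,
  socketMaxListeners: 1000
})

// Inside request it would simply feed the workaround above, e.g.:
// self.req.once('socket', function (socket) {
//   socket.setMaxListeners(self.socketMaxListeners || 10)  // 10 is Node's default cap
// })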

Owner

mikeal commented Aug 27, 2014

Is this still an issue?

This is so old that I'm closing it; if it is actually still an issue, just let me know and I'll re-open.

@mikeal closed this Aug 27, 2014
