Memory leak from Node with request.setTimeout #389

Closed
sylvinus opened this Issue · 2 comments

3 participants

@sylvinus

Hi,

There is a memory leak happening here ...
https://github.com/mikeal/request/blob/master/main.js#L682

... with older versions of node. I can confirm it's happening with node 0.8.4 but not 0.8.14, and it was probably fixed by one of these:
nodejs/node-v0.x-archive#3076
nodejs/node-v0.x-archive#3068
nodejs/node-v0.x-archive#3788

It's of course not a bug in request itself, but maybe you should disable this second (socket-level) timeout feature on older versions of node (it's already unsupported before 0.6?), because otherwise request users stuck on older versions of node will have to either run a patched request or disable timeouts altogether.
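For reference, a minimal sketch of the kind of version guard I mean. The 0.8.14 boundary comes from my report above; the code shape is hypothetical, not request's actual internals, and it assumes the semver package is available:

// Hypothetical check: only arm the second (socket-level) timeout on
// node versions where the underlying leak is known to be fixed.
var semver = require('semver')

function socketTimeoutIsSafe() {
  // process.version looks like 'v0.8.4'; semver tolerates the 'v'.
  return semver.gte(process.version, '0.8.14')
}

// request would then only set the socket timeout when
// socketTimeoutIsSafe() returns true.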

You can test this by running npm test on crawler:
https://github.com/sylvinus/node-crawler/blob/master/test/units/leaks.js
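If you don't want to pull in the crawler test suite, here's a rough standalone repro sketch (the URL and intervals are made up; point it at any slow endpoint and watch heap usage grow on affected node versions, e.g. 0.8.4):

// Fire many requests with a timeout and log heap usage periodically.
var request = require('request')

var n = 0
setInterval(function() {
  request({ uri: 'http://localhost:8080/', timeout: 1000 }, function() {})
  if (++n % 100 === 0) console.log(process.memoryUsage().heapUsed)
}, 10)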

Thanks!

@sylvinus sylvinus referenced this issue in sylvinus/node-crawler
Closed

Crawler getting really slow after ~ 1000 crawled pages #28

@yyfrankyy

I just hit this error on 0.8 and 0.10 (not the node bugs you mentioned, which may be fixed in 0.8.14+). I was using the defaults below for all my requests, and there were lots of timeouts (the network had turned bad); in my case I had about 1000 concurrent requests, at least 200 of which timed out.

// Default every request to a 30-second timeout (TIMEOUT env var overrides).
var request = require('request')

exports.request = request.defaults({
  timeout: +(process.env.TIMEOUT) || 1000 * 30
})

So I added this code in request, raising the socket's listener cap above node's default of 10, and that worked.

// Raise the cap on the shared socket's listeners so that many
// concurrent requests don't trip node's EventEmitter leak warning.
self.req.once('socket', function(socket) {
  socket.setMaxListeners(1000 /* or more */)
})

Or just make fewer requests in parallel.

How about exposing an option for this specific situation?
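For what it's worth, one hypothetical shape for such an option (the name maxSocketListeners is made up here, not part of request's API):

// Caller side: pass the cap through request.defaults like timeout.
var request = require('request')

exports.request = request.defaults({
  timeout: +(process.env.TIMEOUT) || 1000 * 30,
  maxSocketListeners: 1000  // hypothetical option name
})

// request could then honor it where the socket is obtained,
// mirroring the patch above:
//   self.req.once('socket', function(socket) {
//     if (self.maxSocketListeners) socket.setMaxListeners(self.maxSocketListeners)
//   })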

@mikeal
Owner

Is this still an issue?

This is so old that I'm closing it; if it is actually still an issue, just let me know and I'll re-open.

@mikeal mikeal closed this