Memory leak from Node with request.setTimeout #389

sylvinus opened this Issue · 2 comments

3 participants



There is a memory leak happening here ...

... with older versions of Node. I can confirm it's happening with Node 0.8.4 but not 0.8.14, and it was probably fixed in one of these:

It's of course not an issue with request itself, but maybe you should disable this second-timeout feature for older versions of Node (it is already unsupported before 0.6?), because request users stuck with older versions of Node would otherwise have to either use a patched request or disable timeouts altogether.
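One way to act on this suggestion would be to gate the feature on the running Node version. The sketch below is only an illustration of that idea: the function names and the 0.8.14 cutoff are assumptions drawn from this thread, not part of request's actual code.

```javascript
// Hypothetical sketch: decide whether the second-timeout feature is safe
// to enable, based on the Node version string (e.g. process.version).
function parseVersion(v) {
  // "v0.8.4" -> [0, 8, 4]
  return v.replace(/^v/, '').split('.').map(Number);
}

function hasSetTimeoutLeakFix(version) {
  var parts = parseVersion(version);
  // The leak was reportedly gone by 0.8.14; treat anything older as affected.
  var fixed = [0, 8, 14];
  for (var i = 0; i < 3; i++) {
    if (parts[i] > fixed[i]) return true;
    if (parts[i] < fixed[i]) return false;
  }
  return true; // exactly 0.8.14
}

console.log(hasSetTimeoutLeakFix('v0.8.4'));  // false
console.log(hasSetTimeoutLeakFix('v0.10.0')); // true
```

request could then skip calling `setTimeout` on the socket when `hasSetTimeoutLeakFix(process.version)` is false.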

You can test this by running npm test on crawler:


@sylvinus sylvinus referenced this issue in sylvinus/node-crawler

Crawler getting really slow after ~ 1000 crawled pages #28


I just hit this error on 0.8 and 0.10 (not the Node bugs mentioned above, which may be fixed in 0.8.14+) when I used this for default requests and there were lots of timeouts (the network had turned bad). In my case, concurrency was about 1000 requests at the same time, with at least 200 timeouts.

exports.request = request.defaults({
  timeout: +(process.env.TIMEOUT) || 1000 * 30
});

So I added this code in request, and that worked.

self.req.once('socket', function(socket) {
  socket.setMaxListeners(1000 /* or more */);
});

Or just make fewer requests in parallel.

How about exposing an option for this specific situation?


Is this still an issue?

This is so old that I'm closing it; if it is actually still an issue, just let me know and I'll re-open.

@mikeal mikeal closed this