Possible memory leak in v4 on node v6 #2997

Closed
lykkin opened this issue May 13, 2016 · 20 comments

Comments

@lykkin commented May 13, 2016

Running the example code below on Express v4.x and Node v6.x causes the process to OOM under load. The server was put under ~30 requests/sec.

var express = require('express');
var app = express();
app.get('/', function (req, res) { 
  throw new Error('hello')
});
app.listen(3000, function () {console.log('done')});
<--- Last few GCs --->

  120821 ms: Mark-sweep 1206.8 (1435.0) -> 1197.9 (1435.0) MB, 142.4 / 0 ms [allocation failure] [GC in old space requested].
  120990 ms: Mark-sweep 1197.9 (1435.0) -> 1197.9 (1435.0) MB, 169.1 / 0 ms [allocation failure] [GC in old space requested].
  121157 ms: Mark-sweep 1197.9 (1435.0) -> 1197.9 (1435.0) MB, 166.3 / 0 ms [last resort gc].
  121326 ms: Mark-sweep 1197.9 (1435.0) -> 1197.8 (1435.0) MB, 168.9 / 0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x18640e7c9fa9 <JS Object>
    1: isToplevel(aka CallSiteIsToplevel) [native messages.js:~369] [pc=0x3553d6016245] (this=0x27b340215d89 <a CallSite with map 0x1f2a45e2b2f9>)
    2: toString(aka CallSiteToString) [native messages.js:~403] [pc=0x3553d601cf09] (this=0x27b340215d89 <a CallSite with map 0x1f2a45e2b2f9>)
    3: FormatStackTrace [native messages.js:561] [pc=0x3553d5f6ca41] (this=0x18640e7e81c1 <JS Global Object...

FATAL ERROR [message truncated]
Abort trap: 6

This doesn't occur in node v4.x.

I wrote something similar using the http module, which doesn't exhibit this behavior:

var http = require('http')

process.on('uncaughtException', function (e) {console.log(e)})

var server = http.createServer(function (req, res) {
  throw new Error('test')
})

server.listen(3000, function () {
  console.log('listening on port 3000')
})
@dougwilson (Member) commented May 13, 2016

Thanks for the report, we will take a look! Express 4.x does not yet support Node.js 6.0, so it's not a high priority, but we will certainly look into it when possible. If you have any suggested fixes, or know what the problem is, that would help tremendously!

Looking at the stack, I don't think that is an exception from an OOM condition, but rather a failed assertion from within C++.

@dougwilson (Member) commented May 13, 2016

Because of the way Express catches your exception, I would not say the two examples are equivalent. Can you also try the following Node.js example to see if it still works under your load?

var http = require('http')

var server = http.createServer(function (req, res) {
  try {
    throw new Error('test')
  } catch (e) {
    res.statusCode = 500
    res.write(e.stack)
    res.end()
  }
})

server.listen(3000, function () {
  console.log('listening on port 3000')
})
@dougwilson (Member) commented May 13, 2016

Looking at the stack, I don't think that is an exception from a OOM condition, but rather a failed assertion from within C++.

haha, I didn't scroll all the way to the right on the GC messages, my bad :)

@dougwilson (Member) commented May 13, 2016

@lykkin can you say which specific point version you were testing under (just pick one if it was multiple) so we can focus on testing on the same Node.js build you were running on?

@lykkin (Author) commented May 13, 2016

@dougwilson I tested this under 4.13.4, as well as the latest patch versions down to 4.9.8.

Running with your provided code example, there is no memory growth.

@dougwilson (Member) commented May 13, 2016

I'm looking for the version of Node.js

@lykkin (Author) commented May 13, 2016

Ah, sorry. Tested on 6.1.0 and 6.0.0.

@dougwilson (Member) commented May 13, 2016

Thanks! I'll try to take a look sometime in the next few weeks, unless someone can get to it before me :) If you can provide some insights into the cause as well, that would be much appreciated!

If it helps anyone who wants to investigate, these are the things I would look into:

  1. What are the objects being held in memory? What are they rooted to?
  2. What version of Node.js did this first break in (basically, a bisect against Node.js versions)? I would use the public releases first, then narrow it down using the nightlies between those versions.
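A minimal sketch of the first move for question 1 (not from the thread; plain Node stdlib only): before reaching for heap snapshots to see what is rooted, confirm and quantify the growth by sampling process.memoryUsage() while load is applied.

```javascript
// Sketch: poll the process's own memory counters so a leak shows up as a
// monotonic climb in RSS/heapUsed while a load generator hits the server.
const samples = [];

function sample() {
  const { rss, heapUsed } = process.memoryUsage();
  samples.push({ rss, heapUsed });
  console.log(
    `rss=${(rss / 1048576).toFixed(1)}MB ` +
    `heapUsed=${(heapUsed / 1048576).toFixed(1)}MB`
  );
}

sample();
// In a real investigation this would run on an interval while gobench
// applies load, e.g.: setInterval(sample, 1000);
sample();
```

If the numbers climb without bound, the next step is a heap snapshot (for example via Chrome DevTools or a snapshotting module) to see which objects are retained and by what.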
@rzk commented May 14, 2016

Express 4.x does not yet support Node.js 6.0, so it's not a high priority

Wait, what? Where is this stated (apart from the Travis config)?

I've been writing all my Express apps on newer versions of Node, using ES6 features you can check at http://node.green/ (without --harmony), and now you're saying I need to revert to something like 4.4.2, and that this is not high priority?

I feel like I'm not alone in this boat.

@dougwilson (Member) commented May 16, 2016

Hi @rzk, historically we have tried to actually vet Express and its code on a new major version of Node.js/io.js before declaring support for that version. This is not to say that Express won't work on the version, but until we vet it, we honestly cannot say whether it fully functions, whether there are any "gotchas", etc. I just wanted to make that clear, because you never know what will happen when you jump to a new runtime, especially in production. There may be a slow memory leak, or a change in an API that introduces a security vulnerability, etc.

For the past few weeks, I have been combing through Node.js 6.x and the Express code base (this is many dozens of modules, not just the code in this repository) to check that everything works on Node.js 6.x as expected. We have various modules that have to shim Node.js to add missing API pieces; those are the most likely things to break on a major upgrade, since they have certain expectations of what the Node.js source code looks like.

and now you are saying that I need to revert back to something like 4.4.2 and that that is not high priority?

I'm not saying that, only that you are definitely leading the pack. As you find issues, please report them back, like this issue has done, so we can investigate them alongside the current Node.js 6.x support work.

@rzk, since you are using Node.js 6.x, you may very well be encountering this memory leak that is reported, especially since the reported reproduction is so simple. Would you be able to help us diagnose the issue, please? Anyone else in the Express group can also help out, but I have not heard from anyone, so I don't think anyone is actively looking into this issue, and I am working on a security issue that needs to be released, which is why this is not a high priority (the security release is).

@hacksparrow (Member) commented May 17, 2016

@lykkin can you share your OS and machine specs, and the exact procedure to reproduce and observe the behavior? And does the app crash?

I tested Express 4.13.1 with Node.js 4.4.4, and 6.1.0 on my 3.2 GHz Mac with 8 GB of RAM. The behavior is largely similar for 4.4.4 and 6.1.0; both start at 16MB and climb to ~21MB and stay there during the period of observation.

I used siege for making the requests to the app:

$ siege http://localhost:5000 -c30 -d1 -t5s 

This runs 30 concurrent users against the app for 5 seconds, with a 1-second delay between each user's requests. Memory is observed using top.

@lykkin (Author) commented May 17, 2016

@hacksparrow Sure, I'm running on a 2.5 GHz Mac with 16 GB of RAM running OS X 10.10.

I'm running the code snippet above on node 6.1.0 and express 4.13.4.

I noticed that as the load increases, the speed of memory growth also increases. Using gobench to apply load ($ gobench -c=30 -u="http://localhost:3000" -t=600), the process quickly balloons to ~200MB and continues to grow until it crashes upon hitting the memory limit.

I am monitoring memory usage of the process using htop.

@hacksparrow (Member) commented May 18, 2016

Thanks for the details @lykkin, I will review again.

@hacksparrow (Member) commented May 18, 2016

gobench made all the difference.

I can now observe the RAM consumption growing drastically and CPU usage shooting up to 100% too.

We will look further into this issue. Thanks for reporting, @lykkin.

@dougwilson (Member) commented May 19, 2016

Hi @hacksparrow, have you figured anything out? I cannot find any actual memory leak so far. I took the example app, ran it on Node.js 6.1.0, and pounded it with requests for 3 days; it's still hovering around 100 MB RSS. If there were a memory leak, you'd think it would trigger eventually, no matter how slowly?

I have not tried to look at what it would take to run gobench on Windows yet, so haven't gotten any kind of reproduction going on my end. @lykkin is it possible you can assist with the diagnosis in some way? Perhaps see if you can help answer the questions in #2997 (comment) for us?

@dougwilson (Member) commented May 19, 2016

Also, @lykkin @hacksparrow, if one of you has time, can you add the command line argument --stack_trace_limit=0 to your node command to see if that makes any difference? Reading through the Node.js repository, I see a few issues like nodejs/node#4910 that were opened because of a behavior change in how v8 creates Error objects (which the v8 developers deem not a bug), and since you have a throw in your example, I assume that is an important aspect of reproducing the issue. Setting the trace limit to 0 prevents Error objects from holding any stack frames, so if that makes a difference, it could mean something here: I haven't seen any indication of a memory leak, just that Node.js, sometime after v4, stopped being able to handle as many concurrent requests, and the larger memory requirements of Error objects may be the reason.
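The effect described above can be seen directly in-process: Error.stackTraceLimit is the programmatic counterpart of the --stack_trace_limit CLI flag. A small sketch (not from the thread):

```javascript
// Sketch: Error.stackTraceLimit controls how many frames V8 captures
// when an Error is constructed, and therefore how much memory each
// Error object keeps alive while it is retained.
Error.stackTraceLimit = 0;
const bare = new Error('boom');  // no frames captured

Error.stackTraceLimit = 10;      // Node's default
const full = new Error('boom');  // frames captured at construction time

console.log(bare.stack);                        // "Error: boom" only
console.log(full.stack.split('\n').length > 1); // true: frame lines present
```

With the limit at 0, each thrown Error carries only its message, which is why the flag is a quick way to test whether retained stack frames are behind the growth.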

@hacksparrow (Member) commented May 20, 2016

@dougwilson nothing yet, let me add --stack_trace_limit=0 and see if it makes any difference.

@hacksparrow (Member) commented May 20, 2016

So --stack_trace_limit=0 stops the RAM consumption from ballooning.

@hacksparrow (Member) commented May 20, 2016

It indeed seems to be because of the Error objects holding the stack frames.

@dougwilson @lykkin maybe we can close this now?

@lykkin (Author) commented May 20, 2016

I'm ok with closing this, since it isn't something in Express causing it. Thanks for looking into it, though!
