v8 heapTotal keeps rising when heapUsed does not #1484

Closed
eladnava opened this issue Sep 7, 2018 · 15 comments

eladnava commented Sep 7, 2018

  • Node.js Version: v8.11.4 LTS
  • OS: Ubuntu 16.04 LTS

When running my app with --max-old-space-size=16384, the heapTotal keeps rising although the heapUsed remains pretty much the same:

[screenshot: memory usage graph showing heapTotal rising while heapUsed stays flat, 2018-09-07]

global.gc() is run every 3 minutes precisely. The machine itself has exactly 16 GB of RAM. Eventually this behavior leads to an OOM and the Node.js process is killed.
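For context, the metrics above are sampled with something along these lines (a minimal sketch, not the actual app code):

```js
// Run with: node --expose-gc --max-old-space-size=16384 app.js
// Minimal sketch of the monitoring loop, not the real application code.

// Log heap metrics periodically.
setInterval(() => {
  const { rss, heapTotal, heapUsed, external } = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1) + ' MB';
  console.log(`rss=${mb(rss)} heapTotal=${mb(heapTotal)} heapUsed=${mb(heapUsed)} external=${mb(external)}`);
}, 30 * 1000);

// Force a full GC every 3 minutes (requires --expose-gc).
setInterval(() => {
  if (global.gc) global.gc();
}, 3 * 60 * 1000);
```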

Now, originally I thought this was a memory leak, but since heapUsed remains constant it can't be, can it? Is it a native memory leak possibly? Or is it just the heapTotal growing uncontrollably due to some bug in Node.js core?

Interestingly as well, the gap between rss and heapTotal keeps growing over time. Is that due to some native memory leak outside the v8 heap?

Any pointers as to what's going on would be super helpful. Unfortunately I cannot currently provide code to reproduce, but I can say that the load on the server and the number of concurrent connections remain the same throughout the test.

I'm also considering testing with a lower --max-old-space-size to see if it forces Node.js to keep heapTotal at a certain size.

Thanks!

eladnava commented Sep 7, 2018

According to this, the difference between rss and heapUsed is most likely due to Buffers, which are allocated in native memory instead of on the v8 heap. So does the fact that the gap keeps increasing mean that my Node.js app is leaking Buffers?
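To sanity-check that, here is an illustrative sketch (not from my app) showing that Buffer allocations show up in external (and rss) but barely in heapUsed:

```js
// Illustrative sketch: Buffer data lives outside the v8 heap, so it
// is reported under `external` (and rss), not under `heapUsed`.
const before = process.memoryUsage();

const buffers = [];
for (let i = 0; i < 100; i++) {
  buffers.push(Buffer.alloc(1024 * 1024)); // ~100 MB of Buffer data
}

const after = process.memoryUsage();
for (const key of ['rss', 'heapTotal', 'heapUsed', 'external']) {
  const deltaMb = ((after[key] - before[key]) / 1024 / 1024).toFixed(1);
  console.log(`${key}: +${deltaMb} MB`);
}
// Expected shape: external and rss grow by roughly 100 MB, heapUsed barely moves.
```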

@Hakerh400

> global.gc() is run every 3 minutes

Possibly related to nodejs/node#22229

addaleax commented Sep 7, 2018

> Is it a native memory leak possibly?

That would typically lead to RSS rising, but not heapTotal?

eladnava commented Sep 7, 2018

@Hakerh400 Thanks for the insight; it seems the fix for that memory leak landed in v10.9.0, but strangely not yet in v8.11.4 LTS.

However, I did actually test the app before with Node v10.10.0 and witnessed some kind of leak. It might have been a different one, though, so I will retry.

@addaleax Thanks for the insight as well, good point. Any idea what would cause heapTotal to rise while heapUsed does not? Possibly lots of Buffer objects that allocate in native memory and for some reason fool v8 into increasing the heapTotal?

addaleax commented Sep 7, 2018

@eladnava I think that would be visible in a heap snapshot, if you can generate + inspect one?
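For example, on Node 8/10 the heapdump module can write one (a minimal sketch; newer Node versions also ship require('v8').writeHeapSnapshot()):

```js
// Minimal sketch using the heapdump module (npm install heapdump).
const heapdump = require('heapdump');

// Write a snapshot to disk, then load the .heapsnapshot file in
// Chrome DevTools (Memory tab -> "Load profile") to inspect it.
heapdump.writeSnapshot(`/tmp/heap-${Date.now()}.heapsnapshot`, (err, filename) => {
  if (err) console.error('heap snapshot failed:', err);
  else console.log('heap snapshot written to', filename);
});
```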

eladnava commented Sep 8, 2018

@Hakerh400 I removed all invocations of global.gc() and this is the result:

[screenshot: memory usage graph after removing the global.gc() calls, 2018-09-08]

It seems rss is still rising, but heapTotal now stays around the same average, in line with heapUsed. Would a Buffer leak explain the rising rss?

@addaleax I'll generate a heap snapshot soon.

eladnava commented Sep 8, 2018

@addaleax Here is a heap snapshot for the following process.memoryUsage() output (taken after collecting garbage from within DevTools):

{ rss: 338,849,792,  heapTotal: 37,154,816,  heapUsed: 33,220,160,  external: 22,399,736 }

[screenshot: Chrome DevTools heap snapshot summary, 2018-09-08]

This was taken after removing all traffic from the server (right after generating the graph in my last post), so no clients were connected, yet the leak persists: something in the rss is still taking up memory. The leak does not appear to be within the v8 heap, as heapUsed is only 33 MB while the process rss is 338 MB. That's roughly 305 MB of unaccounted-for native memory allocation, which I have no idea how to track down.

Some allocation is lingering in the rss and failing to be released. Any pointers on what it could be? Are there any tools I can use to find out what is taking up this memory? Chrome DevTools won't be of much help here, I believe, as the snapshot clearly doesn't account for this memory and only contains the heap.
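For reference, a small sketch of that arithmetic using process.memoryUsage() (here subtracting both the heap and external memory from rss):

```js
// Sketch: how much of rss is not covered by the v8 heap or by
// external (Buffer) memory. The remainder is typically native
// allocations, thread stacks, loaded code, and allocator overhead.
function logMemoryBreakdown() {
  const { rss, heapTotal, heapUsed, external } = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1);
  console.log(
    `rss=${mb(rss)} MB heapTotal=${mb(heapTotal)} MB ` +
    `heapUsed=${mb(heapUsed)} MB external=${mb(external)} MB ` +
    `unaccounted=${mb(rss - heapTotal - external)} MB`
  );
}

logMemoryBreakdown();
```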

eladnava commented Sep 8, 2018

It seems the symptom is now that rss keeps rising while heapTotal does not.

eladnava commented Sep 8, 2018

It seems that the growing rss in my case is not caused by Buffers, because, according to this, the external metric reported by process.memoryUsage() includes memory used by Buffers.

process.memoryUsage() reports 22 MB for external (i.e. Buffers), whereas rss is at 318 MB.

{ rss: 318873600,  heapTotal: 44494848,  heapUsed: 32599072,  external: 22350912 }

I am not using any native C libraries or dependencies.

Therefore I am completely at a loss as to what is occupying memory in rss, and I have no idea how to actually find out what it is, if it isn't Buffers or native C libraries.

Any help will be greatly appreciated.

eladnava commented Sep 8, 2018

I'm also considering the possibility that there is no longer a leak, but that the OS simply prefers not to reclaim rss memory since lots of free memory is available.

@shellberg

@eladnava You should consider that RSS is not actually an intrinsic measure of your code; it is also competitive with whatever else is in your workload at the time, that is, all other processing, including the OS. If you have no other significant workload, then your Node.js process will be the most active foreground job running; it will win this competition (and RSS will rise) and dominate access to physical memory to support its processing. If you have some other activity, then that competition will cause (usually least-used) pages of memory to be paged out in favour of the other workload. This is perfectly normal.
(Indeed, inducing GC activity will require the job to page in significant parts of the whole JS heap.)

Indeed, with a mixed workload of activity, all things being equal, RSS can still vary! That is what RSS (Resident Set Size) is all about: it is the portion (set) of your process's allocated memory that is currently resident in physical memory, based on current activity relative to the current workload.

eladnava commented Sep 11, 2018

@shellberg Thanks for this insight! That definitely appears to be the case.

It seems that I have managed to fix the issue! The thing is, I don't know exactly what caused it, as I changed 10 things at once, but it looks like memory consumption is stable now:

[screenshot: memory usage graph showing stable heapTotal and heapUsed, 2018-09-11]

At last, heapTotal and heapUsed are stable for the same workload. Indeed, the only process on the server consuming resources is the Node.js process, which would explain the rising rss: there is no competition and plenty of free memory.

Thank you all @shellberg @addaleax @Hakerh400 for your help! It is greatly appreciated. 😄

@vaibhavi3t

@eladnava
I'm also facing the same issue with Node v8.16.0. I've been trying to fix it for the last 2 weeks with no luck.
Can you please tell me what you changed to fix this issue?

@eladnava

@vaibhavi3t Unfortunately I don't remember, as it was too long ago, but all I can recommend is tinkering with the settings mentioned in this issue to see if they have any effect.

@vaibhavi3t

@eladnava
I'm new to the Node.js ecosystem. I'm attaching screenshots of production Node.js memory usage; please have a look.
I'm not able to fix the memory leak issue here.

[screenshots: production memory resident size (bytes) graphs, 2019-07-22]
