High RSS with steady and low heap usage #12805

Closed
terrywh opened this issue May 3, 2017 · 24 comments
Labels: inspector (Issues and PRs related to the V8 inspector protocol), memory (Issues and PRs related to memory management or memory footprint)

Comments

@terrywh

terrywh commented May 3, 2017

Version: Node v7.8.0
Platform: Linux (CentOS 7.0 x64)


I am using --inspect to get heap snapshots. The heap snapshots show a memory usage of only 27 MB, but in the system the process shows a large RSS usage.

Message from console: [screenshot]
Output of the top command: [screenshot]
Heap snapshots: [screenshot]

Am I missing something, or is this how it is supposed to be? This project uses some third-party (C++) modules.

I tried adding the --max_old_space_size=128 option but had no luck; it seems this option is not working at all (I also tried the variant using '-' instead of '_', --max-old-space-size=128, which also did not appear to work).
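
A quick way to check whether the flag is being picked up at all (a minimal sketch, assuming Node's built-in v8 module and an illustrative file name) is to read the configured heap limit back from V8; if heap_size_limit does not change when the flag is passed, the option is not reaching the node process. Note that the flag only caps the V8 heap, not RSS.

```js
// Sketch: verify that --max-old-space-size is actually applied.
// Run as: node --max-old-space-size=128 check-heap-limit.js  (file name is illustrative)
const v8 = require('v8');

const { heap_size_limit } = v8.getHeapStatistics();
console.log('heap_size_limit:', (heap_size_limit / 1024 / 1024).toFixed(1), 'MB');
// With --max-old-space-size=128 this should print a value close to 128 MB
// (plus some fixed overhead); without the flag it prints the default limit.
```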

@vsemozhetbyt added the inspector and memory labels May 3, 2017
@mscdex
Contributor

mscdex commented May 3, 2017

@terrywh Can you reproduce it without the compiled addons? Could it be that the addons you're using do not let V8 know about the memory they have allocated/deallocated (via AdjustAmountOfExternalAllocatedMemory())?
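
If it helps, here is a rough way to check that from the JavaScript side (a sketch, assuming a Node version whose process.memoryUsage() reports the external field): external only reflects native memory that addons have reported to V8, so if RSS keeps climbing while external stays flat, the addon is probably allocating memory V8 knows nothing about.

```js
// Sketch: compare RSS with the external memory that V8 has been told about.
// Drop this into the app; unref() keeps the timer from holding the process open.
// If an addon allocates native memory without calling
// AdjustAmountOfExternalAllocatedMemory(), rss grows but external does not.
const toMB = (n) => (n / 1024 / 1024).toFixed(1) + ' MB';

setInterval(() => {
  const { rss, heapUsed, external } = process.memoryUsage();
  console.log(`rss=${toMB(rss)} heapUsed=${toMB(heapUsed)} external=${toMB(external)}`);
}, 5000).unref();
```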

@bnoordhuis
Member

Do you have transparent huge pages enabled? Try turning it off, see #11077.
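
For reference, a minimal sketch (Linux only) to check the THP setting from inside the process; the first sysfs path is the common one, while older RHEL/CentOS kernels expose it under redhat_transparent_hugepage instead. The bracketed value in the file is the active setting, e.g. `always [madvise] never`.

```js
// Sketch: report whether transparent huge pages are enabled on this host.
const fs = require('fs');

const candidates = [
  '/sys/kernel/mm/transparent_hugepage/enabled',
  '/sys/kernel/mm/redhat_transparent_hugepage/enabled', // older RHEL/CentOS
];

for (const path of candidates) {
  try {
    console.log(path, '->', fs.readFileSync(path, 'utf8').trim());
  } catch (err) {
    // Path not present on this system; try the next candidate.
  }
}
```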

@terrywh
Author

terrywh commented May 3, 2017

@mscdex I'm using https://github.com/Blizzard/node-rdkafka; I will try to see if I can reproduce without it.
@bnoordhuis already turned off: [screenshot]

@dlueth

dlueth commented May 17, 2017

Same problem here on EBS t2.micro as well as t2.medium instances. "max_old_space_size" and the like had no influence, and disabling transparent huge pages did not help either.

I stripped the app down to do exactly nothing, but RSS still grows constantly while the heap stays a flat line.

@dlueth

dlueth commented May 18, 2017

In addition to my previous post: I checked this behaviour on 7.10.0 and 7.6.0, and the only working fix was to go back to 6.10.3.

@terrywh
Author

terrywh commented May 18, 2017

@dlueth "do exactly nothing" so what does it do specifically?

@dlueth

dlueth commented May 18, 2017

@terrywh Sorry for having been a bit unspecific: Nothing left but an idle loop, just to keep the process running

@terrywh
Author

terrywh commented May 18, 2017

@dlueth Maybe you can give us some code to reproduce this problem? My problem seems to be caused by the C++ module.

@dlueth

dlueth commented May 18, 2017

@terrywh If I find the time tonight I will try to prepare something

@dlueth

dlueth commented May 19, 2017

Will take some more time to prepare an example of our problem - will report back!

@bnoordhuis
Member

Closing due to inactivity. Can reopen if needed.

@terrywh
Author

terrywh commented Jun 26, 2017

I filed another issue on the C++ module, Blizzard/node-rdkafka, and will report back if I get anything concrete.

@owenallenaz

@dlueth Could you possibly test on Node 7.4.0? We are facing a similar large RSS, no heap growth issue, and I swear it used to work on 7.4.0. If you're unable to reproduce on that version, then it might narrow the window of what changed that caused it. Or maybe it's a 3rd party package... not sure yet.

@dlueth

dlueth commented Jul 17, 2017

@owenallenaz Will see what I can do to test this and report back!

@dlueth

dlueth commented Jul 17, 2017

@owenallenaz Did not help in our case; switching to 7.4.0 shows the same behaviour as later versions: large RSS but constant heap.

@bam-tbf

bam-tbf commented Feb 3, 2018

@dlueth I am seeing similar issues even on Node 8.9.3 and 9.5.0, streaming image files from an image server through a zip-archiving routine and then via Express down to the user. I have tried destroying every stream created manually and calling garbage collection manually, but RSS stays high and never decreases, while the heap stays relatively constant and low. Thank you for any update. The data graphed below covers a period of manual stream destruction and garbage collection.

[screenshot: RSS and heap usage over time]
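
For what it's worth, below is a minimal sketch of the cleanup pattern that usually matters here, using only core modules: gzip stands in for the zip-archiving routine and a plain http server stands in for Express, both of which are assumptions. stream.pipeline (Node 10+; the pump package plays the same role on older versions) destroys every stream in the chain when an error occurs or the client disconnects, which manual .destroy() calls can easily miss.

```js
// Sketch: stream a file to the client through a compression step,
// ensuring every stream is destroyed on error or early disconnect.
const fs = require('fs');
const http = require('http');
const zlib = require('zlib');
const { pipeline } = require('stream');

http.createServer((req, res) => {
  res.setHeader('Content-Encoding', 'gzip');
  pipeline(
    fs.createReadStream('/path/to/image.jpg'), // illustrative path
    zlib.createGzip(),
    res,
    (err) => {
      if (err) {
        // pipeline has already destroyed every stream in the chain.
        console.error('stream failed:', err.message);
        res.destroy();
      }
    }
  );
}).listen(3000);
```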

@dlueth

dlueth commented Feb 6, 2018

@bam-tbf In our case the issue is not solved, but the overall situation improved, at least with 8.9.4. There is still a memory leak buried somewhere, but it is now within acceptable limits; restarting our instances once a week has been sufficient so far.

One of the causes was an external script we depended on, which has since fixed its leaks (and which we finally got rid of anyway), but the people behind it mentioned that there are still some potential leaks left within Node itself.

@Restuta

Restuta commented Aug 17, 2018

@dlueth If it's a memory leak, shouldn't it consume heap space?

@dlueth

dlueth commented Aug 18, 2018

@Restuta It normally should, but somehow Node seemed to eventually lose track of it and account for it as non-heap memory.

@akaJes

akaJes commented Aug 20, 2018

Try using the jemalloc library; it prevents the RSS from growing.

#21973
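
jemalloc is usually injected with LD_PRELOAD (for example LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2 node app.js; the library path and app name here are illustrative and vary by distribution). A small sketch to confirm from inside the process that the preload actually took effect:

```js
// Sketch (Linux only): check whether a jemalloc shared library is mapped
// into this process, i.e. whether the LD_PRELOAD actually took effect.
const fs = require('fs');

const maps = fs.readFileSync('/proc/self/maps', 'utf8');
console.log(maps.includes('jemalloc') ? 'jemalloc is loaded' : 'jemalloc is NOT loaded');
```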

@hakimelek

hakimelek commented Oct 31, 2020

We are still experiencing the same issue in v12.16.3. Any leads on how to track down what is causing RSS to grow? No luck with the devtools, since they report only the heap memory.

@gireeshpunathil
Member

@hakimelek - what is your system's memory configuration? I suspect something similar to #31641, wherein a large memory demand in the process occurred and was satisfied in the past, but the pages were never released back because the system did not want them back, as it has ample free memory.

@hakimelek

hakimelek commented Nov 1, 2020

@gireeshpunathil Yes, there are 4 processes running on 8 GB of RAM. When inspecting the processes' memory I see both the virtual size and the resident size trending up under load; these numbers are retrieved from New Relic. These boxes in production are running v10.21.0.

I have tried to reproduce the same locally, even after upgrading to v12.16.3 and tapping process.memoryUsage() directly, and I noticed that only RSS gets bigger while heapUsed remains constant over time. It is running on a machine with 16 GB of RAM.

After a couple of heap snapshots in production, I am wondering whether the memory growth perceived by the system is not caused by the heap at all, and I am not sure now what tools I can use to get insight into the RSS memory.
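
One way to get a rough picture of where the RSS actually lives (a Linux-only sketch, not a polished tool) is to sum the Rss fields in /proc/self/smaps per mapping; a large anonymous total well beyond the V8 heap usually points at the allocator or at native code.

```js
// Sketch (Linux only): break this process's RSS down by mapping using /proc/self/smaps.
const fs = require('fs');

const lines = fs.readFileSync('/proc/self/smaps', 'utf8').split('\n');
const totals = new Map();
let current = '';

for (const line of lines) {
  if (/^[0-9a-f]+-[0-9a-f]+ /.test(line)) {
    // Header line of a mapping; the sixth field (if present) is the backing file.
    current = line.trim().split(/\s+/)[5] || '[anonymous]';
  } else if (line.startsWith('Rss:')) {
    const kb = parseInt(line.split(/\s+/)[1], 10);
    totals.set(current, (totals.get(current) || 0) + kb);
  }
}

// Print the ten largest contributors to RSS.
[...totals.entries()]
  .sort((a, b) => b[1] - a[1])
  .slice(0, 10)
  .forEach(([name, kb]) => console.log(`${(kb / 1024).toFixed(1).padStart(8)} MB  ${name}`));
```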

11/2 Update:

This is the behavior I see in the application locally:
[screenshot: local memory usage over time]

I would assume that every time GC kicks in and heap memory drops, RSS should drop as well. Am I missing anything?

@gireeshpunathil
Member

Thanks for the charts. As mentioned in the referenced issue, RSS does not automatically come down in response to memory being freed by the process; it comes down only in response to memory demands from the rest of the system (other processes wanting to use memory). So even if RSS shows high, you can consider that much of it is not really in use by your application.

I would assume that every time GC kicks in and heap memory drops, RSS should drop as well. Am I missing anything?

GC does not lead to a reduction in RSS; it leads to a reduction of live objects in the heap.

My summary: unless the RSS growth is unbounded and leading up to memory exhaustion, you don't have to worry about the current RSS value.

Hope this helps.
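
To make that concrete, a small experiment (a sketch, run with --expose-gc; the file name is illustrative): heapUsed falls back down after the forced GC, while rss tends to stay near its high-water mark because the freed pages are retained until the operating system actually needs them.

```js
// Sketch: run as `node --expose-gc rss-vs-heap.js`.
// Shows heapUsed dropping after GC while rss usually stays high.
const toMB = (n) => (n / 1024 / 1024).toFixed(1) + ' MB';
const log = (label) => {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(`${label}: rss=${toMB(rss)} heapUsed=${toMB(heapUsed)}`);
};

log('start');

let junk = [];
for (let i = 0; i < 1e6; i++) {
  junk.push({ index: i, payload: 'x'.repeat(64) }); // roughly a few hundred MB of heap
}
log('after allocating');

junk = null;   // drop all references to the allocated objects
global.gc();   // force a full GC (only available with --expose-gc)
log('after gc');
```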
