
God Daemon taking up huge amounts of memory #1126

Closed
mattstrayer opened this issue Mar 25, 2015 · 22 comments

Comments

@mattstrayer

The app itself runs at about 60-70 MB of memory. Running on a small Digital Ocean Droplet (Ubuntu 14.04), but the God Daemon takes up the rest of the 512 MB of memory and clogs the system. Node is running v0.12.

@danieljuhl

I'm starting to see the same behavior.

The PM2 Daemon & PM2 Keymetrics agent are slowly increasing their memory footprint over time. Running PM2 v0.12.10 with node v0.12.2 on a Digital Ocean Droplet (Ubuntu 14.04).

@jorge-d
Contributor

jorge-d commented Apr 13, 2015

Did you integrate pmx in your application as well? Just trying to figure out whether the issue comes from pm2 or pmx :)

@danieljuhl

@jorge-d No, I haven't used pmx - only pm2 with keymetrics.

@Unitech
Owner

Unitech commented Apr 15, 2015

@danieljuhl @mattstrayer how many applications do you run? In which mode (cluster/fork)? Do your processes constantly restart?

If you still have the pm2 logs (~/.pm2/pm2.log), I would appreciate knowing whether anything strange happened.

We are using it at Keymetrics to manage processes on 6 different servers (Ubuntu 14.04, Node 0.12.2) running 10 applications (4 in cluster mode and 6 in fork mode) and we don't see this behavior.
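
For reference, the daemon log mentioned above can be inspected directly (assuming the default ~/.pm2 home directory):

$ tail -n 200 ~/.pm2/pm2.log    # recent daemon events: restarts, exits, errors
$ ls -lh ~/.pm2/logs/           # per-application stdout/stderr log files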

@danieljuhl

I've seen the issue on an instance which runs two apps: the built-in http service and one other application in simple fork mode ("pm2 start app.js"). That application is a heavy backend app, but it does not restart during the test period in which pm2's memory increases. I use the max memory restart option, but during the test period that threshold has not been reached.

I'll have a look at the pm2 logs.
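
For reference, the max memory restart option mentioned above is a standard pm2 start flag; a minimal example with a hypothetical 300M threshold:

$ pm2 start app.js --max-memory-restart 300M    # restart the app if it exceeds roughly 300 MB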

@hefangshi

I ran into some issues with pmx too: my application had a memory leak when using pm2 0.12, so I checked with heapdump and found that an object using domain was the leak source, which seems related to pmx. I disabled the pmx init in processContainer.js and everything works fine now.

I didn't have time to work out a minimal demo for this case, but hopefully this gives a hint.
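
A minimal sketch of the heapdump approach described above, assuming the heapdump npm module is added to the app; snapshots taken at different times can be compared in Chrome DevTools to spot the leaked objects:

// npm install heapdump
var heapdump = require('heapdump');

// Write a snapshot whenever the process receives SIGUSR2,
// so memory can be inspected while the app keeps running.
process.on('SIGUSR2', function () {
  heapdump.writeSnapshot('/tmp/' + Date.now() + '.heapsnapshot', function (err, filename) {
    if (err) return console.error('heapdump failed:', err);
    console.log('heap snapshot written to', filename);
  });
});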

@angrycoding

Hi, I also have the same issue: CentOS 6 32-bit, pm2@0.12.11, node@0.12.2. I'm running one application in fork mode, and according to pm2 monit it uses just 64 MB, but htop shows that there are 5 child "God Daemon" processes, as well as 5 copies of my app running, which overall takes about 40% of all memory.

@Unitech
Owner

Unitech commented Apr 21, 2015

It's been fixed in 0.12.11: we now force the GC, and PM2 now always stays at around 50 MB (4 benchmarks on 4 machines with badly behaving applications).

@angrycoding #1088
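
For context, forcing GC from JavaScript only works when V8 is told to expose it; a rough sketch of what periodic forced collection looks like (an illustration, not PM2's actual code):

// Start the process with: node --expose-gc daemon.js
if (typeof global.gc === 'function') {
  setInterval(function () {
    global.gc(); // request a full collection; normally V8 schedules this itself
  }, 30 * 1000);
} else {
  console.warn('global.gc is not available; start node with --expose-gc');
}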

@Unitech Unitech closed this as completed Apr 21, 2015
@angrycoding

Yeah, I'm sorry, I spent all day investigating this; it turned out it was Unix caches that were growing... after clearing the caches everything runs smoothly. Thanks for the great product!

@imgovind

Thank you Alexandre.


@hefangshi

👎 Forcing GC is not a good idea, and it requires the user to expose GC, which might be abused.

I strongly recommend that we look into pmx (#1126 (comment)) rather than triggering GC manually.

@andersonmadeira

I'm using version 12.12 and I still have this issue. The amount of memory used by my application gets bigger and bigger. It seems that garbage collection is not working at all.

@mpechner

Using 0.15.10 and it also blows up.
Node.js version 4.2.2.

One app is all it runs: pm2 start /opt/node/latest/app.js

@neekfenwick

FYI, I'm troubleshooting this here. I think in my case the issue is that pm2 is being run as the root and ubuntu users alternately. I'm using Amazon CodeDeploy, where the lifecycle scripts ApplicationStart and ValidateService are 'accidentally' being run as the ubuntu and root users respectively. ApplicationStart starts the app with pm2 start --name foo blah blah blah, then ValidateService runs pm2 info foo in an attempt to list its info with an exit status of zero, and thus indicate to CodeDeploy that it's running OK. After this, I see this in top:

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND                                                                                        
16324 root      20   0 3542056 2.575g   8560 R 100.2 33.0   7:43.29 PM2 v1.1.3: God                                                                                
16306 ubuntu    20   0 1043140  43784   9188 S   0.0  0.5   0:00.76 node /home/ubun                                                                                
 8403 root      20   0  323236  35876   3408 S   0.0  0.4   0:02.53 codedeploy-agen                                                                                
13923 ubuntu    20   0  945328  25356   3976 S   0.0  0.3   0:02.42 PM2 v1.1.3: God                                                                                
 8399 root      20   0   93624  16604      0 S   0.0  0.2   0:00.38 codedeploy-agen      

So there are two God processes; the one owned by root has gone to 100% CPU and its memory footprint just keeps on growing (while typing this it went from 2.5 GB to 2.9 GB).
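
If the mixed-user situation described above is the cause, one possible fix is to pin both CodeDeploy lifecycle hooks to the same user via runas in appspec.yml (script names here are placeholders):

hooks:
  ApplicationStart:
    - location: scripts/start_app.sh
      runas: ubuntu
  ValidateService:
    - location: scripts/validate_service.sh
      runas: ubuntu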

@Dublerq

Dublerq commented Jul 5, 2016

Same issue here on Ubuntu 14.04, PM2 1.1.3 and node 4.4.7. The problem exists even without running any app

@Unitech
Owner

Unitech commented Jul 5, 2016

Please post an htop screenshot.

Also, install the new PM2 v2:

$ npm install Unitech/pm2#development -g
$ pm2 update

@Dublerq

Dublerq commented Jul 5, 2016

I was using the latest pm2, downloaded two days ago. Somehow I've managed to fix it just by clearing all the caches, dumps, logs, and startup scripts. I can't reproduce the problem anymore.

@Unitech
Owner

Unitech commented Jul 5, 2016

Permission problems may happen sometimes.

You need to check that ~/.pm2 has the right permissions:

$ pm2 save
$ chown -R $USER:$USER ~/.pm2
$ pkill -f PM2
$ pm2 resurrect

@Pranav-Saxena

Pranav-Saxena commented Jun 16, 2021

Hey, this issue is occurring again on PM2 v5.1.0, what should I do? It takes around 400 MB of RAM.

@victor-ono

I started experiencing this issue as well after updating. It takes 50% of RAM on a 512 MB machine!
[Screenshot: Screen Shot 2021-08-06 at 1 10 24 PM]

@yi

yi commented Dec 17, 2021

I just experienced this issue and found the root cause to be Node.js backpressure. Basically, pm2 failed to flush logs to disk in time (too much log output). The issue was solved by reducing log output.
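
If the log volume itself isn't needed, a couple of standard pm2 options reduce the amount the daemon has to flush (a sketch, not a general recommendation):

$ pm2 start app.js -o /dev/null -e /dev/null   # discard stdout/stderr instead of writing to ~/.pm2/logs
$ pm2 flush                                    # empty the existing log files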

@shizheng163

shizheng163 commented Sep 12, 2023

vizion can cause large memory usage; see issue #5051.

You can use --no-vizion when starting your app to avoid the problem.

https://pm2.io/docs/runtime/reference/pm2-cli/

https://pm2.keymetrics.io/docs/usage/application-declaration/
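
For reference, the flag is passed at start time:

$ pm2 start app.js --no-vizion   # disable the vizion versioning/metadata feature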
