PM2 v5.1.0: God Daemon taking up huge amounts of memory #5145
Comments
same issue, any updates? |
I have experienced the same problem with 5.2. The pm2 process takes up about 2.6G of memory while the process it controls uses much less. The following is the output of …
|
The CPU usage is almost always below 1%, which is why I suspect there is a memory leak somewhere
|
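A cheap way to test that suspicion is to poll the daemon's resident memory over time and see whether it only ever grows. A minimal sketch (Linux-only, reading `/proc`; the `rss_mb` helper name is made up for this example):

```shell
# Print a process's resident set size (RSS) in MB by reading
# /proc/<pid>/status. Linux-only sketch; helper name is hypothetical.
rss_mb() {
  awk '/^VmRSS:/ { print int($2 / 1024) }' "/proc/$1/status"
}

# Example: report this shell's own RSS once.
rss_mb "$$"
```

Point it at the God Daemon's PID in a loop (e.g. under `watch`); steadily climbing RSS at near-zero CPU matches the behaviour described in this thread.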
No, I don't think so, and nodejs/node#21973 was closed |
I'm still having the same issue. |
I am facing the same issue - when I run my node application with pm2 it takes a huge amount of memory and crashes, but when I simply run "node app.js" it runs without any memory issues or crashes. |
@sarojmhrzn which Node version did you use? I'm now using Node 16.4 to see if that fixes the problem. |
I'm still having the same issue. |
Same here, pm2 v5.2.0, and it seems it's not my code, because when running with … |
Also, I had 2 similar pieces of code and one didn't have this memory leak problem… |
The daemon memory leak is insane: on a 256GB RAM VPS, pm2 is managing about 50 puppeteer + chrome instances, and yet the pm2 Daemon is taking up 150GB+ of RAM! HOW can I run the daemon with …? pm2 report: …
Investigation

Update 1: This is how you can turn on inspect on the daemon: …

Update 2: From initial inspection, it looks like the fact that … Reference: soyuka/pidusage@ff04d9e

Update 3: After hooking into the process on the machine with the issue, it looks like the heap is fine but …

Update 4: The solution to this issue is to let …

# as root
> apt-get install libjemalloc-dev

Then, to start your pm2 daemon with jemalloc, run:

> LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so pm2 update

You can then confirm that the pm2 daemon is running with jemalloc by using:

> ps aux | grep pm2   # use this to find the PID of the daemon
> cat /proc/PM2_DAEMON_PID/smaps | grep jemalloc

Before, the daemon would balloon and take all available memory in the server (256GB RAM) and eventually crash. Now the daemon is hovering around 150-300MB of RAM. |
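To script the smaps verification step above, the grep can be wrapped in a tiny helper (a sketch assuming Linux `/proc`; `has_mapped` is a made-up name, not pm2 functionality):

```shell
# Check whether a given string (e.g. a library name) appears in a
# process's memory maps -- i.e. whether an LD_PRELOAD actually took
# effect. Linux /proc assumed; helper name is hypothetical.
has_mapped() {
  grep -q "$2" "/proc/$1/smaps" 2>/dev/null
}

# Example: after `LD_PRELOAD=... pm2 update`, check the daemon's PID:
#   has_mapped PM2_DAEMON_PID jemalloc && echo "jemalloc is active"
has_mapped "$$" jemalloc || echo "this shell was not started with jemalloc"
```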
tldr: Before, the daemon would balloon and take all available memory in the server (256GB RAM) and eventually crash. Now the daemon is hovering around 150-300MB of RAM after forcing it to use … @rodrigograca31 can you check the above solution and report back if it works for you, please? Maybe then I will make a PR to automatically detect and use jemalloc for the Daemon. |
I can't anymore... it fixed it self over time if you know what I mean 😂 |
@smashah how do I install …? |
Try this: … Then use …
|
I stopped using PM2. Instead, I'm using Linux's native systemd, without any memory or other issues. |
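For anyone weighing the same switch, a bare-bones systemd unit for a Node app looks roughly like this (a sketch; the unit name, paths, and memory cap are placeholders, not taken from this thread):

```ini
# /etc/systemd/system/myapp.service  (hypothetical name and paths)
[Unit]
Description=Example Node app run directly under systemd
After=network.target

[Service]
ExecStart=/usr/bin/node /opt/myapp/app.js
Restart=always
# Optional: cap memory so a leak cannot exhaust the host (systemd v231+)
MemoryMax=512M

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl daemon-reload && systemctl enable --now myapp`.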
Indeed, we determined that … To fix this, simply disable pmx in your ecosystem config:

{
  "apps": [
    {
      "name": "your-app-name",
      "script": "app.js",
      "exec_mode": "cluster_mode",
      "wait_ready": true,
      "instances": "max",
+     "pmx": false,
      "env_production": {
        "NODE_ENV": "production"
      }
    }
  ]
}

Then delete and restart all via …

References:
|
Possibly related #5216 |
I'm running PM2 5.3.0 with Node 18.15.0 on a Raspberry Pi... and top shows …
After a little while, the memory utilization takes over all available resources and the system crashes. How exactly do I fix this? It's not clear to me from reviewing this thread. I don't know what ecosystem.json is; does it apply to my configuration? |
@vicatcu set … |
Thanks! I've been using PM2 for a long time and ecosystem.json is new to me... I usually just do …
|
Any update on this issue?
Just set … |
The problem was an incompatibility between the Node.js and PM2 versions. It was …; it became …. And everything worked as it should. |
it works |
pmx didn't help; config: …
#1126
I started experiencing this issue again after updating to 5.1.
It takes 230MB (~50%) of RAM on a 512MB machine