
Passenger Core increasing memory usage #1728

Closed · OnixGH opened this issue Jan 21, 2016 · 17 comments
OnixGH (Contributor) commented Jan 21, 2016

Reported on the forum.

The Passenger Core process shows increasing memory usage of 5-8 GB daily:
after 20 hours: graph | passenger-memory-stats | passenger-status --show=server
after 4 days: graph
another run: passenger-status

Passenger version 5.0.21
Rails 2.2, running in single threaded mode
400 Rails instances
Average throughput of 80,000 RPM, with peaks reaching 110,000 RPM.
Ubuntu 14.04 on a bare-metal server (20 cores w/ hyperthreading)

The Passenger max. queue size is set to 80,000 and max requests to 100,000. This is the same server on which #1726 (Error 11 while connecting to upstream) is occurring.
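
For reference, a minimal sketch of how these settings might look in the Passenger-for-nginx configuration. The directive names are the standard ones; the paths, server block, and any value not listed above are assumptions for illustration, not taken from the actual config:

    http {
        # Passenger installation path; this location is an assumption.
        passenger_root /usr/lib/ruby/vendor_ruby/phusion_passenger/locations.ini;

        passenger_max_pool_size 400;             # "400 Rails instances"
        passenger_max_request_queue_size 80000;  # max. queue size of 80,000
        passenger_max_requests 100000;           # recycle a process after 100,000 requests

        server {
            listen 80;
            server_name example.com;             # placeholder
            root /var/www/app/public;            # placeholder
            passenger_enabled on;
        }
    }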

gdeglin commented Jan 27, 2016

Disabling max-requests didn't seem to fix this.

We are also noticing that average response times go up dramatically along with memory growth: graph

jwilm commented Feb 25, 2016

We haven't seen #1726 since updating our Passenger to the version with the configurable backlog. Has there been any progress on this issue? Is there any information we can provide to facilitate its resolution? We are now reloading nginx/Passenger daily to deal with the Passenger core memory issue.
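
For anyone hitting the same problem, a rough sketch of that daily-reload workaround as a cron job. The schedule, file name, and paths are assumptions; adjust them to your own setup:

    # /etc/cron.d/passenger-daily-reload (hypothetical file)
    # Reload nginx once a day during a low-traffic window; per the workaround
    # described above, this brings the Passenger core's memory usage back down.
    30 4 * * * root /usr/sbin/service nginx reload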

FooBarWidget (Member) commented:

So far we haven't found any leads on where this growing memory usage comes from. There are a few suspect places (e.g. the queue between the AcceptLoadBalancer and the core controllers), but they don't seem significant enough to explain the kind of memory growth you're seeing. We're continuing to run tests.

FooBarWidget (Member) commented:

Having a reproducible case would help us a lot.

OnixGH added this to the 5.0.27 milestone Mar 21, 2016
OnixGH (Contributor, Author) commented Mar 21, 2016

@jwilm we've just found and fixed a memory leak in commit fe87178. We're not sure if it's related at all, but it occurs when dealing with (relatively) large request or response bodies on a loaded server (when the internal Passenger memory buffer overflows to disk).

The fix will be part of 5.0.27, but we just wanted to give you a heads up in case you would like to try it already (it's a pretty tiny patch).
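
In case it helps others reproduce the conditions described above (large request bodies on a loaded server), here is a rough load sketch using ApacheBench. The URL, body size, request count, and concurrency are assumptions, not the exact scenario from any test:

    # Generate a ~5 MB request body (size chosen arbitrarily).
    dd if=/dev/urandom of=/tmp/body.bin bs=1M count=5

    # POST it repeatedly with some concurrency so Passenger's in-memory
    # buffering is likely to overflow to disk.
    ab -n 10000 -c 100 -p /tmp/body.bin -T application/octet-stream http://localhost/upload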

jwilm commented Mar 21, 2016

Thanks for the heads up; we'll update once it lands.

OnixGH (Contributor, Author) commented Apr 7, 2016

@jwilm 5.0.27 is out, can you re-check?

FooBarWidget (Member) commented:

We have found (and fixed) another memory leak: #1797

This one is bigger but only triggers when there are more than 1024 concurrent requests. Maybe this is the leak that was affecting you?
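
If it helps to confirm whether that threshold is being crossed, here is a minimal monitoring sketch built on the tools already mentioned in this thread (passenger-status and passenger-memory-stats). The log path, interval, and grep pattern are assumptions about the output format:

    #!/bin/sh
    # Periodically record Passenger's reported concurrency and per-process
    # memory so that growth can be correlated with >1024 concurrent requests.
    LOG=/var/log/passenger-mem.log   # hypothetical location
    while true; do
        date >> "$LOG"
        passenger-status --show=server >> "$LOG" 2>&1
        passenger-memory-stats | grep -i core >> "$LOG" 2>&1
        sleep 300
    done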

jwilm commented Apr 14, 2016

That sounds likely! Whenever we add a new server, it doesn't exhibit the problem until it's under the same load as the rest of our servers. Each of them handles ~60k RPM (we're now using fewer, less powerful servers than when this issue was originally filed), and the traffic comes in bursts.

jwilm commented Apr 17, 2016

When do you expect the latest memory leak patch to land in the Ubuntu Passenger PPA? We're very excited to try it out!

OnixGH (Contributor, Author) commented Apr 20, 2016

@jwilm We're trying to get 5.0.28 out next week.

OnixGH modified the milestones: 5.0.28, 5.0.27 Apr 20, 2016
OnixGH (Contributor, Author) commented Apr 28, 2016

@jwilm 5.0.28 has been released, can you check it out? :)

jwilm commented Apr 28, 2016

Huzzah! I'll upgrade our front-end servers this afternoon. We're past our high-traffic point for today, so providing a report will need to wait until tomorrow or the next day at the earliest.

Thanks for getting that out!

jwilm commented Apr 28, 2016

Our servers have been upgraded. I'll let you know how it goes in a day or two!

jwilm commented Apr 29, 2016

Here's memory usage on one of our frontend servers for the last 24 hours:

[graph: memory usage over the last 24 hours]

You can see the leak causing increased usage leading up to when we upgraded. After the upgrade, memory usage has been relatively flat.

Here's the PassengerAgent memory usage for the same period:

[screenshot: PassengerAgent memory usage, 2016-04-29 11:44 AM]

and for the period after the upgrade:

[screenshot: PassengerAgent memory usage after the upgrade, 2016-04-29 11:45 AM]

The major memory leak seems to be resolved, and we are quite happy with how Passenger has performed since the fix. Given the last graph, there may still be a minor leak (~30 MB over 20 hours), but it's small enough not to be an issue.

Thanks for tracking this down and getting a patch out!

OnixGH (Contributor, Author) commented Apr 30, 2016

@jwilm great to see the big leak plugged! Would it be possible for you to check once more a week from now to see how the (minor) increased memory usage develops?

If there is still a leak somewhere, we'd like to open a new issue for it and hunt it down as well.

OnixGH (Contributor, Author) commented Jun 17, 2016

@jwilm can you provide us with some final feedback on this? How is the behavior across multiple days/weeks?

I'll go ahead and close this issue now, since the major leak was indeed fixed in 5.0.28.

OnixGH closed this as completed Jun 17, 2016