GQLExecutor memory leak #9262
Labelling it P3 as it's not critical now. Could be related to #8414
@rafaelromcar-parabol This could be an issue if we don't release for some time, like over Christmas
We should keep an eye on it, for sure. But keep in mind that if the containers reach 4 GB of memory, they will be automatically restarted. That said, we could restart the application next week before Christmas if needed, during the week or on the weekend. It can be done easily using the last pipeline in the GitLab repository.
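For context, that automatic restart at 4 GB is what a Kubernetes container memory limit does: once the container's resident memory crosses the limit it is OOM-killed and restarted. As a rough sketch only (the 4 GiB figure comes from this thread; the watchdog itself is hypothetical and not part of the codebase), an executor could log how close it is getting to that limit:

```ts
// Hypothetical sketch: periodically compare the process RSS against an assumed
// 4 GiB container memory limit and warn when it approaches the point where
// Kubernetes would OOM-kill and restart the container.
const CONTAINER_MEMORY_LIMIT_BYTES = 4 * 1024 * 1024 * 1024 // assumed limit, taken from this thread
const WARN_RATIO = 0.8 // warn at 80% of the limit (illustrative threshold)

setInterval(() => {
  const {rss, heapUsed} = process.memoryUsage()
  const ratio = rss / CONTAINER_MEMORY_LIMIT_BYTES
  if (ratio >= WARN_RATIO) {
    console.warn(
      `GQLExecutor memory high: rss=${Math.round(rss / 1e6)}MB, ` +
        `heapUsed=${Math.round(heapUsed / 1e6)}MB (${(ratio * 100).toFixed(1)}% of limit)`
    )
  }
}, 60_000).unref() // sample once a minute; unref() so the timer never keeps the process alive
```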
Also, I think this should be a P2. Memory leaks should always be fixed ASAP, as they can affect stability, even more so in our application, which doesn't handle restarting components very well. At least it didn't before this; let's see after the release this week.
Blocked behind #9282
I was checking the memory usage on the GQL Executors and they are fine, though above 60%. That's still okay because, even if we don't release, the worst that can happen is Kubernetes restarting them if they reach 100%. It isn't a big problem. Just sharing my thoughts here.
True for the SaaS! For certain clients who do not manage their infra as diligently as we do, restarts may pose a problem 😅
Of course! I was talking about the Christmas period for the SaaS 😸 PPMIs haven't been updated in a long while; they are still on 6.83.0 and have been for a while.
Prioritized to backlog
Stale issue
The GQLExecutor memory consumption goes steadily up. As seen here 🔒 (top three lines), the memory usage stayed high even through the weekend, when I would expect it to drop.
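A minimal sketch of how that steady growth could be confirmed from inside the executor process, assuming only Node's built-in APIs (none of this exists in the repo): sample process.memoryUsage() regularly and write an occasional V8 heap snapshot, then diff consecutive snapshots in Chrome DevTools to see which objects keep accumulating.

```ts
// Hypothetical diagnostic sketch: log memory usage every 10 minutes and write an
// hourly heap snapshot so consecutive snapshots can be diffed in Chrome DevTools.
// Intervals are illustrative; writeHeapSnapshot() is Node's built-in v8 API.
import {writeHeapSnapshot} from 'node:v8'

setInterval(() => {
  const {rss, heapUsed, external} = process.memoryUsage()
  console.log(
    `mem rss=${Math.round(rss / 1e6)}MB heapUsed=${Math.round(heapUsed / 1e6)}MB ` +
      `external=${Math.round(external / 1e6)}MB`
  )
}, 10 * 60_000).unref()

setInterval(() => {
  // Writes a .heapsnapshot file to the working directory and returns the file name
  const file = writeHeapSnapshot()
  console.log(`wrote heap snapshot: ${file}`)
}, 60 * 60_000).unref()
```

Note that generating a heap snapshot briefly pauses the process and can require memory roughly twice the size of the heap, so on a multi-gigabyte executor it is safer to do this on a single replica or in a staging environment.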