
Potential memory leak on TiDB server #32289

Closed

coderplay opened this issue Feb 11, 2022 · 12 comments

Comments

@coderplay
Contributor

coderplay commented Feb 11, 2022

Bug Report

Please answer these questions before submitting your issue. Thanks!

I don't know how to reproduce it. We noticed one TiDB server in our cluster using 7.9 GB of memory while the rest of the TiDB servers are idle, and we haven't run queries on that cluster for quite a while. show processlist returns nothing except the session running the show command itself.

Here is the heap dump: [heap dump image]

Any clues on that?

@coderplay coderplay added the type/bug This issue is a bug. label Feb 11, 2022
@aytrack aytrack added type/question and removed type/bug This issue is a bug. labels Feb 14, 2022
@XuHuaiyu
Contributor

The memory usage is consumed by the internal background worker.
PTAL @chrysan

@chrysan
Contributor

chrysan commented Feb 14, 2022

@zeminzhou what's your tidb_version and analyze_version?

@coderplay
Contributor Author

coderplay commented Feb 14, 2022

what's your tidb_version and analyze_version?

tidb_version is 5.2.2. How can I get the analyze_version?

Is that a dangling analyze?

@chrysan
Contributor

chrysan commented Feb 14, 2022

what's your tidb_version and analyze_version?

tidb_version is 5.2.2. How can I get the analyze_version?

Is that a dangling analyze?

show global variables like "%tidb_analyze_version%";
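A minimal alternative, assuming the usual @@global variable syntax:

-- read the value directly instead of pattern-matching variable names
SELECT @@global.tidb_analyze_version;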

@chrysan
Contributor

chrysan commented Feb 14, 2022

We improved the memory usage of analyze in v5.3 (with tidb_analyze_version=2, which is the default). Please try that version. If you have to stick to v5.2.2, set @@tidb_analyze_version=1 as a workaround, following this KB article: https://kb.pingcap.com/post/solutions/405.
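A minimal sketch of that workaround on v5.2.2, assuming the variable and statement names from the TiDB docs; the table name below is a placeholder, and dropping/re-collecting statistics is only needed for tables already analyzed with version 2:

-- fall back to the version-1 sampling algorithm for future analyze runs
SET GLOBAL tidb_analyze_version = 1;
-- placeholder table: discard the version-2 statistics and re-collect with version 1
DROP STATS my_table;
ANALYZE TABLE my_table;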

@coderplay
Contributor Author

Could you kindly explain how the analyze version caused the memory leak?

@coderplay
Contributor Author

@XuHuaiyu

The memory usage is consumed by the internal background worker.

Do you mean this is a fixed memory cost for TiDB servers because it's a long-lived background worker? That doesn't explain why the other TiDB servers in the cluster don't have this memory footprint.

@chrysan
Contributor

chrysan commented Feb 15, 2022

@coderplay the major memory cost is from the analyze job, which is a scheduled background process rather than a long-lived one. A background analyze job is triggered mainly based on 1. the ratio of modifications on the table and 2. whether the current time falls within the configured auto-analyze window.

For TiDB versions < v5.1.0, the default analyze version is 1.
From v5.1.0, to improve the accuracy of statistics, we provide analyze version 2, but its new sampling algorithm consumes more memory on the TiDB server side.
From v5.3.0, we enhanced the sampling algorithm to reduce the TiDB-side memory usage.
That's the reason for my suggestion above.
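For reference, a short sketch of how those two triggers can be inspected from a client; the variable names (the auto-analyze ratio and time window) are taken from the TiDB system-variable docs rather than this thread:

-- 1. fraction of modified rows that triggers an auto-analyze (default 0.5)
SHOW GLOBAL VARIABLES LIKE 'tidb_auto_analyze_ratio';
-- 2. time window in which auto-analyze jobs are allowed to run
SHOW GLOBAL VARIABLES LIKE 'tidb_auto_analyze_%_time';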

@coderplay
Contributor Author

@chrysan Should be related to this: #29306

@chrysan
Contributor

chrysan commented Feb 16, 2022

@chrysan Should be related to this: #29306

Yes, you are right, I missed it. If memory usage stays consistently high, it could be the memory leak issue; if it is sometimes high but then drops, it could be the limitation of the old sampling algorithm.

Background auto-analyze jobs can be running without appearing in the results of show processlist. We are working to improve this.

You can upgrade to v5.3 to get the stable analyze v2, or fall back to analyze v1 as a workaround.
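One way to see background analyze activity without relying on show processlist is the SHOW ANALYZE STATUS statement, assuming the running TiDB version supports it:

SHOW ANALYZE STATUS;
-- each row reports the table, job info, processed row count, start time, and state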

@coderplay
Contributor Author

Thanks for the confirmation. Since it's fixed, I will go ahead and close this issue.

@tiancaiamao
Contributor

Duplicated by #32499
