
Monitors RU consumption to maximize short-term usage of MAX RU. #7908

Closed
nolouch opened this issue Mar 12, 2024 · 1 comment · Fixed by #7936
Labels
affects-7.1 affects-7.5 report/customer Customers have encountered this bug. severity/minor The issue's severity is minor. type/bug The issue is confirmed as a bug. type/development The issue belongs to a development task.

Comments

@nolouch
Contributor

nolouch commented Mar 12, 2024

Problem

I have a group:
[image: resource group configuration]

When I only run workload A:

sysbench --mysql-user=user1 --mysql-host=10.2.12.53 --mysql-port=32335 --table-size=10000000  oltp_read_only run --time=1000 --report-interval=1 --threads=4  --mysql-db=test --tables=32 --rate=2

RU Avg is OK because the workload is stable.
[image: RU monitoring panel showing a stable RU Avg]

But when I manually run a big query like:

select count(*) from test.sbtest1;

The monitoring (based on the average) was not accurate for this burst, which led me to mistakenly believe that I was far from triggering RC (resource control) throttling.
[image: RU monitoring panel underreporting the burst]

But from the slow query log:
[image: slow query detail]
it cost 8000+ RU, which made some queries wait in the RC queue.
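To illustrate why an averaged RU panel can hide a burst like this, consider a hypothetical 60-second sampling window (the numbers below are illustrative, not taken from this cluster):

```python
# Hypothetical RU samples over a 60s window: a steady ~150 RU/s background
# workload plus one 8000-RU burst landing in a single second.
samples = [150.0] * 59 + [8000.0]

avg_ru = sum(samples) / len(samples)   # what an "RU Avg" panel reports
max_ru = max(samples)                  # the per-second peak that hit the quota

# avg ~= 281 RU/s vs. max 8000 RU/s: the average looks far below the limit
# even though one second fully exhausted it.
print(f"avg: {avg_ru:.1f} RU/s, max: {max_ru:.1f} RU/s")
```

The average stays low because the burst is diluted across the window, which matches the misleading dashboard reading described above.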

Development Task

Enhance the observability so that short-term spikes in RU consumption are visible (e.g. record the max RU per second, not only the average).
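The fix linked to this issue records the max RU per second (the merged commits are titled "resource_manager: record the max RU per second"). A minimal sketch of such a per-second max tracker, with hypothetical names that do not reflect the actual PD implementation, might look like:

```python
import time


class MaxRUTracker:
    """Track the highest RU consumption seen in any one-second bucket.

    Hypothetical sketch only; the real resource_manager code differs.
    """

    def __init__(self):
        self._current_second = None   # bucket currently being filled
        self._current_sum = 0.0       # RU accumulated in that bucket
        self._max_per_second = 0.0    # highest completed-bucket total so far

    def record(self, ru_cost, now=None):
        now = time.monotonic() if now is None else now
        second = int(now)
        if second != self._current_second:
            # Close out the previous bucket before starting a new one.
            self._max_per_second = max(self._max_per_second, self._current_sum)
            self._current_second = second
            self._current_sum = 0.0
        self._current_sum += ru_cost

    def max_ru_per_second(self):
        # Include the in-progress bucket so a live burst is visible.
        return max(self._max_per_second, self._current_sum)
```

Exposing this max alongside the average would have made the 8000+ RU query show up immediately on the dashboard instead of being averaged away.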

@nolouch nolouch added the type/development The issue belongs to a development tasks label Mar 12, 2024
@nolouch nolouch added type/bug The issue is confirmed as a bug. affects-7.1 affects-7.5 severity/minor The issue's severity is minor. labels Mar 18, 2024
@github-actions github-actions bot added this to Need Triage in Questions and Bug Reports Mar 18, 2024
ti-chi-bot bot added a commit that referenced this issue Mar 21, 2024
close #7908

resource_manager: record the max RU per second

Signed-off-by: nolouch <nolouch@gmail.com>

Co-authored-by: ti-chi-bot[bot] <108142056+ti-chi-bot[bot]@users.noreply.github.com>
Questions and Bug Reports automation moved this from Need Triage to Closed Mar 21, 2024
ti-chi-bot pushed a commit to ti-chi-bot/pd that referenced this issue Apr 1, 2024
close tikv#7908

Signed-off-by: ti-chi-bot <ti-community-prow-bot@tidb.io>
ti-chi-bot bot added a commit that referenced this issue Apr 3, 2024
close #7908

resource_manager: record the max RU per second

Signed-off-by: nolouch <nolouch@gmail.com>

Co-authored-by: nolouch <nolouch@gmail.com>
Co-authored-by: ti-chi-bot[bot] <108142056+ti-chi-bot[bot]@users.noreply.github.com>
ti-chi-bot bot pushed a commit that referenced this issue Apr 11, 2024
close #7908

resource_manager: record the max RU per second

Signed-off-by: ti-chi-bot <ti-community-prow-bot@tidb.io>
Signed-off-by: nolouch <nolouch@gmail.com>

Co-authored-by: ShuNing <nolouch@gmail.com>
Co-authored-by: nolouch <nolouch@gmail.com>
@seiya-annie

/found customer

@ti-chi-bot ti-chi-bot bot added the report/customer Customers have encountered this bug. label Jun 4, 2024