
Feature idea: perf analyzer behavior control #808

Open
Talavig opened this issue Jan 7, 2024 · 13 comments · May be fixed by #814
Labels
enhancement New feature or request

Comments

@Talavig

Talavig commented Jan 7, 2024

We would like to suggest an idea that we do not think is currently available in Model Analyzer or Perf Analyzer:
When using Model Analyzer, we want to replicate the behavior of our production environment during testing. For example, we want to send many requests at once (a local usage spike), a gradual, more controlled increase in requests at specific timestamps, or any other usage pattern.
We would like to propose the following solution: add a field to the config file that can take one of two kinds of values: either a keyword representing a predefined usage behavior such as a spike or a gradual increase, or a path to a file. That file would contain JSON written by the customer with specific "directions" for the desired behavior, for example: send x requests for 3 seconds and then send a spike of requests (a hypothetical sketch of such a file is shown below).
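Something like this, purely as an illustration (the format and field names are hypothetical, not from any existing tool):

    {
      "pattern": [
        {"duration_seconds": 3, "requests_per_second": 100},
        {"duration_seconds": 1, "requests_per_second": 2000},
        {"duration_seconds": 3, "requests_per_second": 100}
      ]
    }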
Is such a feature possible to implement? It would be very valuable to us.

@tgerdesnv
Collaborator

tgerdesnv commented Jan 11, 2024

I do believe that Perf Analyzer supports something similar to what you are asking for via custom interval mode:
https://github.com/triton-inference-server/client/blob/main/src/c%2B%2B/perf_analyzer/docs/inference_load_modes.md#custom-interval-mode
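For reference, a minimal standalone run might look roughly like this, assuming (per that doc) the interval file holds one wait time per line in microseconds; my_model is just a placeholder name:

    $ cat time_intervals.txt
    100000
    100000
    5000
    5000
    100000
    $ perf_analyzer -m my_model --request-intervals ./time_intervals.txt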

@Talavig
Author

Talavig commented Jan 15, 2024

We tried using this option through model analyzer, and got the following error:

Cannot use --concurrency-range or --request-rate-range with --request-intervals.

We added it to the config.yaml in the following way:

perf_analyzer_flags:
  {request-intervals: ./time_intervals.txt}

Do you have any idea why this may happen?

@tgerdesnv
Collaborator

That is likely a bug :(. MA generally specifies the concurrency with every request and probably needs to be updated to not specify it when request-intervals are specified. I don't think it will be hard to fix. I'll try to reproduce and fix today.
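Roughly speaking (this is an illustration, not the exact command MA builds), MA ends up invoking something like

    perf_analyzer -m my_model --concurrency-range 1:1 --request-intervals ./time_intervals.txt

and Perf Analyzer rejects that flag combination with the error you saw.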

@tgerdesnv
Collaborator

@Talavig Are you running into this from a brute search? Or quick search?

@Talavig
Author

Talavig commented Jan 17, 2024

We originally encountered it using brute search, but we have tried quick search as well and encountered the same error.

@tgerdesnv tgerdesnv linked a pull request Jan 17, 2024 that will close this issue
@dyastremsky dyastremsky added the enhancement New feature or request label Mar 19, 2024
@eladamittai

Hello, I'm currently experiencing the same issue with Model Analyzer 24.02. Is there an update on the progress, or is there something specific I need to do in the YAML to enable it?

@YaliEkstein

> That is likely a bug :(. MA generally specifies the concurrency with every request and probably needs to be updated to not specify it when request-intervals are specified. I don't think it will be hard to fix. I'll try to reproduce and fix today.

Hey, I'm trying to use this as well and the bug still exists. Is there any news on an update or a fix for this issue?

@tgerdesnv
Collaborator

I am going to try to find time to get this fixed this week

@eladamittai

That will be awesome! Thanks for the update! Can you tell whether it'll come out as part of the 24.03 image?

@tgerdesnv
Collaborator

It will definitely not be 24.03. That being said, once I get a fix in, you can always work with the main branch if you want it ASAP:
https://github.com/triton-inference-server/model_analyzer/blob/main/docs/install.md#alternative-installation-methods
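If you go that route, installing from source is roughly along these lines (see the install.md linked above for the exact, supported steps):

    pip install git+https://github.com/triton-inference-server/model_analyzer.git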

@eladamittai

Great! Thank you so much!

@tgerdesnv
Collaborator

This work is still in progress.

@eladamittai

Hey, is there an update?
