Auto Optimizer #1801

Open · wants to merge 1 commit into main from shaahji/autoopt

Conversation

@shaahji (Contributor) commented Apr 28, 2025

Auto Optimizer

Redo the auto-optimizer logic to simplify it and to use the search engine.
Also, update the auto-opt CLI command to use AutoOptimizer.
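To make the "search engine" framing concrete, here is a minimal, hypothetical sketch of an auto-optimizer that enumerates candidate pass/parameter combinations and keeps the best sequence by an evaluation metric. All names here (`Pass`, `AutoOptimizer`, the param-grid shape) are illustrative assumptions, not Olive's actual API:

```python
# Hypothetical sketch: an auto-optimizer driving a search over candidate
# pass configurations. Names are illustrative, not Olive's actual API.
from dataclasses import dataclass
from itertools import product
from typing import Callable, Dict, List, Tuple


@dataclass(frozen=True)
class Pass:
    name: str
    params: Tuple[Tuple[str, object], ...]  # hashable parameter-grid point


class AutoOptimizer:
    """Enumerate pass/parameter combinations and keep the best by a metric."""

    def __init__(self, search_space: Dict[str, List[dict]],
                 evaluate: Callable[[List[Pass]], float]):
        self.search_space = search_space  # pass name -> list of param dicts
        self.evaluate = evaluate          # lower score = better (e.g. latency)

    def run(self) -> Tuple[List[Pass], float]:
        best_seq, best_score = [], float("inf")
        names = list(self.search_space)
        # Exhaustive product over each pass's parameter choices; a real
        # search engine would prune or sample instead of enumerating.
        for combo in product(*(self.search_space[n] for n in names)):
            seq = [Pass(n, tuple(sorted(p.items()))) for n, p in zip(names, combo)]
            score = self.evaluate(seq)
            if score < best_score:
                best_seq, best_score = seq, score
        return best_seq, best_score
```

A toy search space with two passes and two parameter choices each yields four candidate sequences; the evaluator decides which one wins.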

Checklist before requesting a review

  • Add unit tests for this change.
  • Make sure all tests can pass.
  • Update documentation if necessary.
  • Lint and apply fixes to your code by running lintrunner -a
  • Is this a user-facing change? If yes, give a description of this change to be included in the release notes.
  • Is this PR including examples changes? If yes, please remember to update example documentation in a follow-up PR.

(Optional) Issue link

@devang-ml (Contributor) left a comment:

Let's quickly make these changes.

@shaahji shaahji force-pushed the shaahji/autoopt branch 2 times, most recently from c280226 to 6dc0579 Compare May 5, 2025 12:07
@shaahji shaahji force-pushed the shaahji/autoopt branch 2 times, most recently from 4b48cf0 to a2ab5b8 Compare May 5, 2025 20:25
@shaahji shaahji changed the title Auto Optimizer - Work In Progress Auto Optimizer May 5, 2025
@shaahji shaahji marked this pull request as ready for review May 5, 2025 20:26
@shaahji shaahji force-pushed the shaahji/autoopt branch 2 times, most recently from 7e093d2 to 393a904 Compare May 6, 2025 19:46
"use_model_builder": true,
"train_data_config": "wikitext2_train",
"calibration_data_config": "transformer_token_dummy_data",
"accelerator": { "accelerator_type": "cpu", "execution_provider": "CPUExecutionProvider" },
Contributor:

Can we just use the accelerator from the system?

shaahji (Contributor, Author):

An Olive config can have multiple systems defined, especially for cases where passes use a separate host. That would make choosing the accelerator ambiguous.
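The ambiguity described above can be illustrated with a toy config. The field names below are assumptions loosely modeled on Olive-style configs, not the actual schema; the point is only that with two systems defined there is no single accelerator to infer:

```python
# Illustrative only: a config with more than one system defined.
# Field names are assumptions, not Olive's actual schema.
config = {
    "systems": {
        "local_cpu": {"accelerators": [{"accelerator_type": "cpu",
                                        "execution_provider": "CPUExecutionProvider"}]},
        "remote_gpu": {"accelerators": [{"accelerator_type": "gpu",
                                         "execution_provider": "CUDAExecutionProvider"}]},
    },
    # Passes may run on a separate host, so "the" system is not unique.
    "passes": {"quantize": {"host": "remote_gpu"}},
}


def infer_accelerator(cfg):
    """Pick the accelerator automatically only when exactly one system exists."""
    systems = cfg["systems"]
    if len(systems) != 1:
        raise ValueError(
            f"ambiguous: {len(systems)} systems defined, accelerator must be explicit"
        )
    (only,) = systems.values()
    return only["accelerators"][0]
```

With a single system the inference is trivial; with two or more, the caller has to state the accelerator explicitly, which is why the PR keeps it in the config.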

return False

if accelerator_spec.execution_provider == "QNNExecutionProvider":
logger.info("QNNExecutionProvider doesn't support optimized model.")
Contributor:

Since peephole optimizations are EP-independent, why do we need this check?

shaahji (Contributor, Author):

This is carried forward from the current auto-opt CLI implementation:

Olive/olive/cli/auto_opt.py

Lines 320 to 322 in 59bfe00

# qnn ep might not support optimized model
# will re-enable it if needed in the future
passes_to_remove.update(["transformer_optimizer", "peephole_optimizer"])
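The quoted CLI behavior amounts to an EP-keyed pass filter. A minimal sketch of that idea, with the function name and table shape as assumptions (only the QNN entry and the two pass names come from the quoted code):

```python
# Hypothetical sketch of EP-based pass filtering, mirroring the quoted
# auto-opt CLI behavior. Function name and table shape are assumptions.
def filter_passes(passes, execution_provider):
    """Drop passes known not to work on a given execution provider."""
    # QNN EP is reported not to support the optimized model, so the
    # optimizer passes are removed for it (re-enable if support lands).
    unsupported = {
        "QNNExecutionProvider": {"transformer_optimizer", "peephole_optimizer"},
    }
    to_remove = unsupported.get(execution_provider, set())
    return [p for p in passes if p not in to_remove]
```

Keeping the exclusions in one table makes the reviewer's question easy to act on later: if the QNN limitation is lifted, deleting that one entry restores the passes everywhere.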

@shaahji shaahji force-pushed the shaahji/autoopt branch 7 times, most recently from 0a7a890 to 3d7aaee Compare May 9, 2025 19:56
Redo auto-optimizer logic to simplify and use search engine.
@shaahji shaahji force-pushed the shaahji/autoopt branch from 3d7aaee to 00e9e5f Compare June 2, 2025 19:39
3 participants