Refactor dynamo converter and ir.Model support #1765


Open · wants to merge 31 commits into main

Conversation

justinchuby
Contributor

@justinchuby justinchuby commented Apr 18, 2025

Describe your changes

  • Avoid redundant disk IO when saving a dynamo exported model by saving the ir.Model directly
  • Refactor torch.onnx conversion passes

Checklist before requesting a review

  • Add unit tests for this change.
  • Make sure all tests can pass.
  • Update documents if necessary.
  • Lint and apply fixes to your code by running lintrunner -a
  • Is this a user-facing change? If yes, give a description of this change to be included in the release notes.
  • Is this PR including examples changes? If yes, please remember to update example documentation in a follow-up PR.

@justinchuby justinchuby requested review from titaiwangms, Copilot and jambayk and removed request for titaiwangms April 18, 2025 00:52
Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

This PR refactors the model conversion flow to avoid redundant disk IO when saving a Dynamo exported model and enables direct saving of ONNX IR models by introducing the ir_model_to_olive_model function.

  • Updated onnxscript_fusion.py to replace model_proto_to_olive_model with ir_model_to_olive_model.
  • Refactored conversion.py to add separate export methods for torchscript and dynamo exporters.
  • Added ir_model_to_olive_model in onnx/common.py to support saving IR models.
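The core idea of `ir_model_to_olive_model` is to write the in-memory IR model to disk once, instead of round-tripping through a serialized proto and re-reading the file. A minimal pure-Python sketch of that pattern follows; `IRModel` and the function body here are illustrative stand-ins, not the real onnxscript `ir.Model` or Olive's implementation in `olive/passes/onnx/common.py`.

```python
from dataclasses import dataclass
from pathlib import Path


@dataclass
class IRModel:
    """Stand-in for an in-memory IR model that knows how to serialize itself."""

    graph_name: str

    def save(self, path: Path) -> None:
        # In the real API this would serialize the model graph to ONNX;
        # here we just write a marker so the sketch is runnable.
        path.write_text(f"model:{self.graph_name}")


def ir_model_to_olive_model(ir_model: IRModel, output_path: Path) -> Path:
    """Save an in-memory IR model directly to output_path.

    The old flow went ir.Model -> ModelProto -> disk (and sometimes re-read
    the file); saving the ir.Model directly performs a single write.
    """
    output_path.parent.mkdir(parents=True, exist_ok=True)
    ir_model.save(output_path)
    return output_path
```

The single-write shape is what removes the redundant disk IO the PR description mentions.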

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.

File | Description
--- | ---
olive/passes/onnx/onnxscript_fusion.py | Updated the model conversion call to use ir_model_to_olive_model.
olive/passes/onnx/conversion.py | Refactored export methods and introduced dynamo exporter changes.
olive/passes/onnx/common.py | Added the new function ir_model_to_olive_model for saving IR models.

@justinchuby
Contributor Author

justinchuby commented Apr 22, 2025

@titaiwangms
Contributor

https://aiinfra.visualstudio.com/PublicPackages/_build/results?buildId=758867&view=logs&j=168020a6-ab1b-592e-b9e6-a6cd0eb1939c&t=f9534d71-ad75-594d-34df-0afc963a2a4d&l=10801

> @titaiwangms do you have an idea about this error in _rename_dynamic_shapes_with_model_inputs?

Yes, it's a bug, and the function will be deleted in 2.7. I didn't account for mismatched input names between the ONNX graph and the nn.Module, which causes a KeyError during matching. (That's why I set it to 2.7.)

@justinchuby justinchuby marked this pull request as ready for review April 22, 2025 17:26
@justinchuby justinchuby requested a review from Copilot April 22, 2025 17:26
Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

This PR refactors the dynamo exporter flow to avoid redundant disk IO and updates the torch.onnx conversion passes while improving type annotations and generic typing.

  • Update test conditions to account for torch version 2.7.0 and dynamic shape handling.
  • Refactor type hints in IoConfig by marking several fields as Optional.
  • Enhance generic typing in config_utils with a type variable.

Reviewed Changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated no comments.

File | Description
--- | ---
test/unit_test/passes/onnx/test_conversion.py | Adjusted test skip conditions for Windows and dynamic shapes based on torch version.
olive/model/config/io_config.py | Updated type annotations to use Optional for various input/output fields.
olive/common/config_utils.py | Modified the config union type to use a generic type variable T.
Comments suppressed due to low confidence (1)

olive/common/config_utils.py:317

  • Ensure that the type variable T is properly defined (e.g., via 'from typing import TypeVar' and 'T = TypeVar("T")') to support the generic type annotation.
config: Union[dict[str, Any], T, None],
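Copilot's point is that a bare `T` in an annotation only works if the TypeVar is declared first. A self-contained sketch of what such a generic config-validation signature can look like is below; the function name and body are modeled loosely on the idea in `olive/common/config_utils.py`, not copied from it.

```python
from typing import Any, TypeVar, Union

# The TypeVar must be defined before it can appear in annotations.
T = TypeVar("T")


def validate_config(
    config: Union[dict[str, Any], T, None],
    config_class: type[T],
) -> T:
    """Normalize a dict, an existing instance, or None into a config_class.

    Hypothetical sketch: accepts a raw dict (constructed via **kwargs),
    an already-built instance (returned as-is), or None (defaults).
    """
    if config is None:
        return config_class()
    if isinstance(config, dict):
        return config_class(**config)
    return config
```

Because the return type is `T`, callers keep the concrete config type through static type checking instead of getting an untyped union back.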

@justinchuby justinchuby requested review from jambayk and Copilot April 22, 2025 20:13
Copilot

This comment was marked as outdated.

@justinchuby justinchuby enabled auto-merge (squash) April 23, 2025 00:23
@justinchuby justinchuby disabled auto-merge April 29, 2025 23:02
@titaiwangms
Contributor

@justinchuby @jambayk Are we merging this soon? I am working on a PR that refactors the patch and dynamic_shapes handling to the latest version for better LLM support.

@justinchuby
Contributor Author

I think the PR is ready. I would need some reviews.

@justinchuby justinchuby requested a review from Copilot June 12, 2025 20:37
Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

This PR refactors the dynamo exporter and onnx conversion passes by reducing redundant disk IO and updating type annotations.

  • Updated test conditions for dynamo export based on torch versions
  • Changed type annotations in IoConfig to use Optional
  • Updated generic type hints in config_utils

Reviewed Changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 1 comment.

File | Description
--- | ---
test/unit_test/passes/onnx/test_conversion.py | Adjusted test conditions with additional torch version checks
olive/model/config/io_config.py | Updated type hints to use Optional for better type clarity and safety
olive/common/config_utils.py | Modified function signature to use generic type T for configuration input
Comments suppressed due to low confidence (2)

test/unit_test/passes/onnx/test_conversion.py:36

  • [nitpick] Consider adding an inline comment to explain why skipping is conditioned on the torch version for Windows platforms, which can help future maintainers understand the rationale.
if platform.system() == "Windows" and use_dynamo_exporter and _torch_is_older_than("2.7.0"):

olive/model/config/io_config.py:37

  • [nitpick] Updating type annotations to Optional is a good improvement. It would be helpful to update the corresponding docstrings to reflect these changes.
input_shapes: Optional[list[list[int]]] = None
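Pairing the `Optional[...]` annotations with docstrings, as the nitpick suggests, can look like the following simplified stand-in. This is an illustrative dataclass, not the actual `IoConfig` from `olive/model/config/io_config.py` (which may use a different base class and more fields).

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class IoConfigSketch:
    """Simplified stand-in for an IO config with optional fields.

    input_names: names of the model's graph inputs, or None if unknown.
    input_shapes: per-input tensor shapes, or None to leave them unspecified.
    """

    input_names: Optional[list[str]] = None
    input_shapes: Optional[list[list[int]]] = None
```

Marking the fields `Optional[...] = None` makes the "not provided" state explicit to type checkers instead of relying on implicit defaults.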

@justinchuby
Contributor Author

=================================== FAILURES ===================================
____________________________ test_inc_quantization _____________________________
test/unit_test/passes/inc/test_inc_quantization.py:43: in test_inc_quantization
quantized_model = p.run(ov_model, output_folder)
/passes/_pass.py:242: in run
output_model = self._run_for_config(model, self.config, output_model_path)
***/passes/onnx/inc_quantization.py:536: in _run_for_config
if q_model.is_large_model:
E AttributeError: 'NoneType' object has no attribute 'is_large_model'

@@ -531,13 +531,17 @@ def _run_for_config(
"find any quantized model which meet accuracy goal. "
"Try to increase 'max_trials' in 'tuning_criterion'."
)
if q_model is None:
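The failure and the diff above point at the same fix: the quantizer can return `None` when no candidate model meets the accuracy goal, so the result must be checked before accessing `q_model.is_large_model`. A hedged, self-contained sketch of that guard pattern (the exception class and function are hypothetical, not Olive's actual code):

```python
class QuantizationError(RuntimeError):
    """Raised when quantization produces no usable model."""


def check_quantized_model(q_model):
    """Guard against a None result before touching model attributes.

    Without this check, accessing q_model.is_large_model raises
    AttributeError: 'NoneType' object has no attribute 'is_large_model'.
    """
    if q_model is None:
        raise QuantizationError(
            "Quantization did not find any model that meets the accuracy "
            "goal. Try increasing 'max_trials' in 'tuning_criterion'."
        )
    return q_model
```

Raising a descriptive error at the point of failure surfaces the tuning problem directly instead of the misleading `AttributeError` seen in the test log.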
Contributor


Thanks! The changes look good to me. I was waiting for the //build release to stabilize before we merged this. There's a failing test for this, though.

Contributor Author


Yes, I need to figure that out. Somehow the ir_model_to_olive_model method seems to be missing data. Do you have an idea? Otherwise I will do some more debugging.

Contributor


I am not familiar with this failure.
