Avoid overriding model_type in TasksManager #1647
Conversation
May I start testing with a subpackage like neuron?
@JingyaHuang Sure, feel free to test so that this PR does not break too many things. What I expect to break is the usage of e.g. … you would need …
AFAIK, so far the register is only used under neuron. @fxmarty
Thanks for the PR, left some questions and nits. Testing for neuron as well, will update you in a bit.
@@ -398,7 +402,13 @@ def main_export(
                "Could not infer the pad token id, which is needed in this case, please provide it with the --pad_token_id argument"
            )

-        model_type = "stable-diffusion" if "stable-diffusion" in task else model.config.model_type.replace("_", "-")
+        if "stable-diffusion" in task:
What about `stable-diffusion-xl`? Don't we need an extra case for it?
cc @echarlaix
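To illustrate the reviewer's point, here is a minimal sketch (a hypothetical helper, not the PR's actual code) of deriving the `model_type` string from the task without mutating `config.model_type`, with the more specific `stable-diffusion-xl` case checked first so it is not collapsed into `stable-diffusion`:

```python
def infer_model_type(task: str, config_model_type: str) -> str:
    """Derive a model_type string from the task, leaving the config untouched."""
    # Check the more specific task name first, as the reviewer suggests.
    if "stable-diffusion-xl" in task:
        return "stable-diffusion-xl"
    if "stable-diffusion" in task:
        return "stable-diffusion"
    # Fall back to the config's model_type, normalized with dashes.
    return config_model_type.replace("_", "-")

print(infer_model_type("stable-diffusion-xl", "unet"))          # stable-diffusion-xl
print(infer_model_type("text-classification", "distil_bert"))   # distil-bert
```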
@@ -366,6 +370,7 @@ def get_stable_diffusion_models_for_export(
    onnx_config_constructor = TasksManager.get_exporter_config_constructor(
        model=pipeline.text_encoder_2,
        exporter="onnx",
+       library_name="diffusers",
same for text encoder 2: https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0/blob/main/model_index.json
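The `library_name` argument ties into the PR's other change: dissociating the supported-model-type registry by library. Below is a rough sketch of that idea (the dict contents and function signature are assumptions for illustration, not optimum's real internals): keying the registry by library first means a model type registered for diffusers cannot clash with a transformers entry of the same name.

```python
# Hypothetical registry, keyed by library -> model_type -> task.
_SUPPORTED_MODEL_TYPE = {
    "transformers": {
        "bert": {"text-classification": "BertOnnxConfig"},
    },
    "diffusers": {
        "clip-text-model": {"feature-extraction": "CLIPTextOnnxConfig"},
    },
}

def get_exporter_config_constructor(model_type: str, task: str, library_name: str):
    """Look up a config constructor, scoped to the given library."""
    try:
        return _SUPPORTED_MODEL_TYPE[library_name][model_type][task]
    except KeyError:
        raise KeyError(
            f"{model_type}/{task} is not supported for library {library_name}"
        )
```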
same
LGTM, thanks for the PR!
Inference is bugged due to the changes in `infer_library_from_model`.
LGTM, let's get it merged once the CIs are good! Thanks for iterating on the issues that occur for subpackages @fxmarty!
Any update on this? 👀
Merging as failing tests are unrelated & fixed in #1683 & microsoft/onnxruntime#19421
* avoid modifying model_type
* cleanup
* fix test
* fix test
* fix library detection local model
* fix merge
* make library_name non-optional
* fix warning
* trigger ci
* fix library detection
This is bad practice as `PretrainedConfig.model_type` is a class attribute. We also dissociate `_SUPPORTED_MODEL_TYPE` by library, as suggested by @JingyaHuang. Both … and … work as expected. Still need to specify `transformers` in the `ORTModel` with `export=True` (we do not support sentence-transformers).
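To show why overriding a class attribute like `model_type` is bad practice, here is a minimal sketch with a stand-in class (not the real `PretrainedConfig`): assigning to the attribute on the class leaks the change into every instance that shares it.

```python
class PretrainedConfigSketch:
    # Class attribute, shared by all instances (like PretrainedConfig.model_type).
    model_type = "bert"

a = PretrainedConfigSketch()
b = PretrainedConfigSketch()

# Overriding on the class (what this PR avoids) affects every instance:
PretrainedConfigSketch.model_type = "stable-diffusion"
print(a.model_type, b.model_type)  # both report "stable-diffusion"

# Assigning on an instance merely shadows the class attribute for that instance:
a.model_type = "distilbert"
print(b.model_type)  # still "stable-diffusion"
```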