This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Conversation

@natuan (Contributor) commented Feb 7, 2022

Putting back the ordering of input keys passed into the PyTorch ONNX export
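The fix this PR describes, reordering a tokenizer-style input dict to match the `forward()` signature before export, could be sketched roughly as follows. This is a hypothetical illustration (the function name `order_inputs_for_export` and its shape are assumptions, not the repository's actual code):

```python
import inspect
from collections import OrderedDict


def order_inputs_for_export(forward_fn, sample_batch):
    """Reorder a tokenizer-style input dict to match the positional order
    of forward_fn's signature, keeping only the keys forward_fn accepts.

    Hypothetical sketch of the kind of fix this PR describes, not the
    actual repository code.
    """
    params = inspect.signature(forward_fn).parameters
    return OrderedDict(
        (name, sample_batch[name]) for name in params if name in sample_batch
    )
```

The ordered dict's values can then be passed positionally to `torch.onnx.export`, so the exported graph's inputs line up with the model's `forward()` arguments.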

@natuan natuan requested review from a team, bfineran, markurtz and mgoin February 7, 2022 19:16
@bfineran previously approved these changes Feb 7, 2022
@markurtz (Member) left a comment

@natuan need to regenerate the input_shapes arg based on the new inputs that will be passed through so we log properly. Additionally, can we add a warning in the event that the keys in the inputs do not match the keys for the forwards.

@natuan (Contributor, Author) commented Feb 7, 2022

> @natuan need to regenerate the input_shapes arg based on the new inputs that will be passed through so we log properly. Additionally, can we add a warning in the event that the keys in the inputs do not match the keys for the forwards.

I fixed the first one you mentioned. As for the warning: the input produced by the tokenizer is a Dict, so by issuing that warning we would imply an ordering over it, right? (which is what this fix provides). In pytorch/utils/exporter we did have this type of warning when the input batch is a Dict but not an OrderedDict.
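The Dict-but-not-OrderedDict warning mentioned here could look roughly like this. The function name is illustrative and assumed, not the actual `pytorch/utils/exporter` code:

```python
import warnings
from collections import OrderedDict


def check_batch_ordering(batch):
    """Warn when an export batch is a plain Dict rather than an OrderedDict,
    since the export relies on a well-defined key order.

    Hypothetical reconstruction of the check described above; the name and
    message are assumptions, not the repository's actual API.
    """
    if isinstance(batch, dict) and not isinstance(batch, OrderedDict):
        warnings.warn(
            "Exporting with a plain Dict batch; the key order passed to the "
            "ONNX export is not guaranteed, consider using an OrderedDict"
        )
```

Note that `OrderedDict` subclasses `dict`, so the second `isinstance` check is what distinguishes the two cases.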

@markurtz (Member) commented Feb 7, 2022

> > @natuan need to regenerate the input_shapes arg based on the new inputs that will be passed through so we log properly. Additionally, can we add a warning in the event that the keys in the inputs do not match the keys for the forwards.
>
> I fixed the first one you mentioned. For the warning, the input output by the tokenizer is a Dict so by issuing that warning we imply an ordering over it right? (which is what this fix is) In the pytorch/utils/exporter we did have this type of warning if the input batch is a Dict but not an OrderedDict

I was mainly concerned that we are now technically filtering out inputs based on whether or not they are defined in the forward function. That might be a good thing to do, but we should have a warning in place saying that the inputs were filtered and that not everything was passed through. On the other side, if not all of the inputs defined in the forward are found, it would be good to call that out as well, in case some have defaults and therefore would not fail the forward.
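The two warnings asked for here, one for batch keys filtered out of the export and one for forward arguments missing from the batch, could be sketched together like this. Names and messages are illustrative assumptions, not the code the PR actually added:

```python
import inspect
import warnings


def warn_on_input_mismatch(forward_fn, sample_batch):
    """Emit the two warnings discussed in this review: one when batch keys
    are filtered out because forward_fn does not accept them, and one when
    forward_fn arguments are missing from the batch (and so would fall back
    to their defaults, if any). Hypothetical sketch, not the repository code.
    """
    params = inspect.signature(forward_fn).parameters
    dropped = sorted(set(sample_batch) - set(params))
    missing = sorted(set(params) - set(sample_batch))
    if dropped:
        warnings.warn(
            f"Inputs {dropped} are not arguments of forward() and will be "
            "filtered out of the ONNX export"
        )
    if missing:
        warnings.warn(
            f"forward() arguments {missing} were not found in the inputs; "
            "default values will be used where defined"
        )
```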

@natuan (Contributor, Author) commented Feb 8, 2022

> > > @natuan need to regenerate the input_shapes arg based on the new inputs that will be passed through so we log properly. Additionally, can we add a warning in the event that the keys in the inputs do not match the keys for the forwards.
> >
> > I fixed the first one you mentioned. For the warning, the input output by the tokenizer is a Dict so by issuing that warning we imply an ordering over it right? (which is what this fix is) In the pytorch/utils/exporter we did have this type of warning if the input batch is a Dict but not an OrderedDict
>
> I was mainly concerned that we are now technically filtering out inputs based on whether or not they are defined in the forward function. This is something that might be a good thing to do, but we should have a warning in place to say that the inputs were filtered and not everything was passed through. On the other side, if not all inputs are found that are defined in the forward, it would be good to have this called out as well in the event that some have defaults and therefore would not fail the forward

Put in a warning

@markurtz markurtz merged commit 12ffdae into main Feb 8, 2022
@natuan natuan deleted the export_fix branch February 8, 2022 19:57
bfineran pushed a commit that referenced this pull request Feb 9, 2022
* Enforce order on input keys to export

* Warn if input dropped from onnx export
bfineran added a commit that referenced this pull request Feb 9, 2022
* Update README.md for transformers to note the quantization conversion issue (#539)

* Update README.md

* Update integrations/huggingface-transformers/README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Enforce order on input keys to export (#545)

* Enforce order on input keys to export

* Warn if input dropped from onnx export

* Restrict mistune version to fix docs build (#547)

* quantization fixes for transformers flows (#548)

* quantization fixes for transformers flows

* match on class name instead

* quality

* set release branch version to 0.10.1

* Revert "Update README.md for transformers to note the quantization conversion issue (#539)"

This reverts commit 9304997.

Co-authored-by: Mark Kurtz <mark@neuralmagic.com>
Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>
Co-authored-by: Tuan Nguyen <tuan@neuralmagic.com>
Co-authored-by: Michael Goin <michael@neuralmagic.com>
KSGulin pushed a commit that referenced this pull request Feb 15, 2022
* Enforce order on input keys to export

* Warn if input dropped from onnx export
