
_fix_outputs for BatchNormalization #7719

Closed
liaopeiyuan wants to merge 5 commits

Conversation

liaopeiyuan
Contributor

In newer versions of ONNX (e.g. torch.onnx outputs), BatchNormalization has multiple outputs when training mode is enabled. This is a quick fix similar to the one done for Dropout.
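The sketch below is a guess at the shape of the fix, modeled on the Dropout handling the description mentions; the helper name comes from the PR title, and the surrounding logic is an assumption for illustration, not the merged code.

```python
# Minimal sketch of a _fix_outputs-style helper in TVM's ONNX frontend,
# extended to cover BatchNormalization the same way Dropout is handled.
# The Dropout branch and the truncation details are assumptions.
def _fix_outputs(op_name, outputs):
    """Drop optional training-mode outputs that TVM does not produce."""
    if op_name == "Dropout":
        if len(outputs) == 1:
            return outputs
        # ONNX Dropout may also declare a mask output; keep only the result.
        outputs = outputs[:-1]
    elif op_name == "BatchNormalization":
        # In training mode, ONNX BatchNormalization declares extra outputs
        # (running mean/variance, plus saved mean/variance in older opsets);
        # keep only the normalized tensor.
        outputs = outputs[:1]
    return outputs
```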

liaopeiyuan added a commit: handling BatchNormalization in training mode.

@FrozenGene
Member

Thanks for contributing, @liaopeiyuan. However, I think it would be better to add a unit test case. Do you agree?

@FrozenGene added the "status: need test case" label on Mar 22, 2021
@liaopeiyuan
Contributor Author

Yes, I will add it shortly.

sunggg added a commit to cmu-catalyst/collage that referenced this pull request Mar 25, 2021
@FrozenGene
Member

Any update? @liaopeiyuan

@liaopeiyuan
Contributor Author

Sorry, this is part of a larger patch for an ongoing research project. We will add the test cases when we have time or when the internal patch is done.

@tqchen
Member

tqchen commented Jun 4, 2021

gentle ping @liaopeiyuan

@liaopeiyuan
Contributor Author

Where should I place the newly added unit test?

@tqchen
Member

tqchen commented Jun 4, 2021

@liaopeiyuan
Contributor Author

liaopeiyuan commented Jun 6, 2021

Does the onnxruntime build on CI come with training mode enabled? The problem seems to be that onnxruntime rejects a BatchNormalization node that declares multiple outputs while training_mode=False (the default). And, taking a step back from this PR: what should the ideal behavior be for batchnorm when a user brings in an ONNX model in training mode? Do we also include the extra mean and var in the outputs, or do we throw them away on the user's behalf? If the latter is what we want, then it doesn't make sense to write a unit test case, since ONNX and TVM behavior would differ.
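For context, here is a hypothetical illustration of the node shape under discussion, built with the onnx helper API (all tensor names are made up): an opset-14-style BatchNormalization may declare up to three outputs when training_mode=1, while only the first output is valid when training_mode is left at its default of 0, which matches the onnxruntime complaint described above.

```python
from onnx import helper

# Hypothetical training-mode BatchNormalization node (opset 14 style).
# With training_mode=1 it may declare Y, running_mean, and running_var;
# with the default training_mode=0 only Y is allowed, so checkers and
# runtimes reject the extra outputs. Tensor names here are made up.
node = helper.make_node(
    "BatchNormalization",
    inputs=["x", "scale", "bias", "input_mean", "input_var"],
    outputs=["y", "running_mean", "running_var"],
    epsilon=1e-5,
    momentum=0.9,
    training_mode=1,
)
print(node)
```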

@jroesch
Member

jroesch commented Jan 19, 2022

This PR appears to be out of date; please feel free to reopen it if this is not the case.

As part of the new year we are attempting to triage the project's open pull requests to ensure that code which is ready for review and/or merging receives adequate attention.

Thanks again for your contribution, and feel free to reach out to discuss these changes.

@jroesch jroesch closed this Jan 19, 2022