_fix_outputs for BatchNormalization #7719
Conversation
Handle BatchNormalization in training mode.
Thanks for contributing, @liaopeiyuan! However, I think it would be better to add a unit test case. Do you agree?
Yes, I will add it shortly.
Any update? @liaopeiyuan
Sorry, this is part of a larger patch for an ongoing research project. We will add the test cases when we have time or are done with the internal patch.
Gentle ping @liaopeiyuan
Where should I place the newly added unit test?
Does the onnxruntime build on CI come with training mode enabled? It seems the problem is onnxruntime being unhappy with BatchNormalization containing multiple outputs while …
This PR appears to be out of date; please feel free to reopen it if this is not the case. As part of the new year we are attempting to triage the project's open pull requests to ensure that code which … Thanks again for your contribution, and feel free to reach out to discuss these changes.
In newer versions of ONNX (e.g. torch.onnx outputs), BatchNormalization has multiple outputs when training mode is enabled. This is a quick fix similar to the one done for Dropout.