[ONNX] update default opset_version to 13 #73898
Conversation
💊 CI failures summary (Dr. CI): as of commit 74ec262, there are no failures so far.
What's your opinion on …?
I think we should restrict usage of diff tests to only those cases where we can't figure out meaningful explicit assertions about the expected output.
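The distinction argued for in this comment can be sketched with a toy, framework-free example. Everything below is hypothetical and only illustrates the trade-off: a "diff test" compares serialized output byte-for-byte against a stored expectation, while an explicit assertion checks only the properties that actually matter.

```python
# Hypothetical sketch of the two test styles discussed above; none of
# these names come from the PyTorch test suite.

def export_graph(op_name):
    # Stand-in for an exporter: returns a tiny graph description.
    return {"nodes": [op_name], "opset": 13}

def diff_test(graph, golden):
    # Brittle: any incidental change (key ordering, metadata, opset
    # bump) breaks the comparison even when behavior is unchanged.
    return repr(graph) == golden

def explicit_test(graph):
    # Robust: asserts only the meaningful invariants of the output.
    return "Relu" in graph["nodes"] and graph["opset"] >= 9

graph = export_graph("Relu")
assert diff_test(graph, "{'nodes': ['Relu'], 'opset': 13}")
assert explicit_test(graph)
```

The explicit test keeps passing across harmless changes (e.g. a newer default opset), whereas the diff test has to be regenerated each time, which is exactly the maintenance burden the comment objects to.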
We have always overwritten tests with newer opsets, which is great for newer versions, but we also lose coverage on older versions.
Have you considered a way of having tests for different opset versions as opposed to replacing them? Does it make sense? Maybe a certain op has one behavior in opset 7 and a slightly different one in another opset. By keeping both, we could make sure both are guarded.
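One way to realize the suggestion above is to run the same check under every supported opset version instead of only the newest one. This is a minimal sketch under assumed names: `ONNX_STABLE_OPSETS` and `export_supports` are illustrative, not PyTorch internals.

```python
# Sketch: keep coverage on all opset versions rather than overwriting
# tests, so per-version behavior differences stay guarded.
# The range and the minimum-opset table are assumptions for illustration.

ONNX_STABLE_OPSETS = list(range(9, 14))  # e.g. opsets 9..13

def export_supports(op_name, opset_version):
    # Stand-in for exporting a model at `opset_version`; pretend "Celu"
    # only exists from opset 12 onward.
    minimum = {"Relu": 7, "Celu": 12}
    return opset_version >= minimum.get(op_name, 7)

def run_for_all_opsets(op_name):
    # Exercise the check under every stable opset, recording passes.
    return [v for v in ONNX_STABLE_OPSETS if export_supports(op_name, v)]

assert run_for_all_opsets("Relu") == [9, 10, 11, 12, 13]
assert run_for_all_opsets("Celu") == [12, 13]
```

In a real suite this loop would typically be a `unittest` `subTest` or a pytest parametrization over the opset range, so a failure reports which version regressed.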
I don't know what you mean by this. The operator diff tests (that diff the output against the …) … Most of the other tests (e.g. …) … The operator diff tests are pretty annoying for many reasons, and I think we should reduce their scope rather than (or before) expanding them to all opset versions. See the discussion with Bowen above. I'm not sure I understood the concern, so does this comment address it?
Yes, it does. My point was that for …
And add a new tool to update it in the future, which follows the policy of using "latest as of 18 months ago". This policy is meant to balance:

* recent enough to increase the odds of being able to successfully export
* old enough to increase the odds of the exported model being runnable by different ONNX implementations

Also minor clean-up of related code in symbolic_helper:

* Remove a misleading comment
* Remove unnecessary check in `_set_opset_version`
* Use a range to define `_onnx_stable_opsets`
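The "latest as of 18 months ago" policy can be sketched as follows. This is not the PR's actual tool; the helper name and the opset release dates below are approximate assumptions for illustration only.

```python
# Hedged sketch of the "latest as of 18 months ago" policy: pick the
# newest opset released at least ~18 months before the given date.
import datetime

# Approximate ONNX opset release dates (assumed, for illustration).
OPSET_RELEASE_DATES = {
    12: datetime.date(2020, 5, 8),
    13: datetime.date(2020, 10, 30),
    14: datetime.date(2021, 3, 25),
    15: datetime.date(2021, 7, 30),
}

def default_opset(today, cutoff_months=18):
    # Approximate a month as 30 days; precision doesn't matter here.
    cutoff = today - datetime.timedelta(days=cutoff_months * 30)
    eligible = [v for v, d in OPSET_RELEASE_DATES.items() if d <= cutoff]
    return max(eligible)

# Around the time this PR landed (spring 2022), the policy yields 13.
assert default_opset(datetime.date(2022, 5, 1)) == 13
```

The two bullets above are exactly the tension this computes: a larger cutoff favors portability across ONNX runtimes, a smaller one favors export coverage.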
@malfet @msaroufim this needs import for the one-line change in test/quantization/eager/test_quantize_eager_ptq.py.
Closed and reopened to trigger GitHub pipelines after conflict resolution for …
@msaroufim has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
@malfet has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Summary: And add a new tool to update it in the future, which follows the policy of using "latest as of 18 months ago". This policy is meant to balance:

* recent enough to increase the odds of being able to successfully export
* old enough to increase the odds of the exported model being runnable by different ONNX implementations

Related changes:

* test_models.py: explicitly fix opset_version to 9 rather than relying on the default. Caffe2 doesn't support newer versions.
* symbolic_helper.py:
  * Remove a misleading comment
  * Remove unnecessary check in `_set_opset_version`
  * Use a range to define `_onnx_stable_opsets`
* test_pytorch_common.py:
  * Rename a variable from min -> max. I think it was a copy-paste error.
  * Make skip-test messages more informative.
  * Remove unused `skipIfONNXShapeInference`. More on that below.
* test_pytorch_onnx_onnxruntime.py:
  * Make all the `TestCase` classes explicitly specify the opset version.
  * Make `test_unsupported_pad` respect `opset_version` by using `run_test`.
  * Unrelated simplification: make it obvious that all tests run with `onnx_shape_inference=True`. AFAICT this was already the case.
  * There was one test (test_tolist) that was entirely disabled because it asked to be skipped whenever `onnx_shape_inference=True`, but it was always True. I changed the model being tested so as to preserve the intended test coverage but still have the test actually pass.

X-link: pytorch/pytorch#73898
Reviewed By: msaroufim
Differential Revision: D35264615
Pulled By: malfet
fbshipit-source-id: cda8fbdffe4cc8210d8d96e659e3a9adf1b5f1d2
Summary: As a follow-up of D35264615 (b2ef1f3) (pytorch/pytorch#73898). Differential Revision: D35472745. fbshipit-source-id: 3ed9501088f22301a91c8d8e585557368ec225fa
Sorry if I am asking in the wrong thread. Signal processing operators were recently added to ONNX and are available in opset 17. Is there any estimate of when they will become available in PyTorch so that I can invoke the export API? According to the policy, do I have to wait 18 months for this?
@stonelazy please open an issue requesting support for the specific ATen / torch op that you need, with an example model.