
[Tracking] ONNX 1.9 work items #3225

Closed
askhade opened this issue Jan 19, 2021 · 10 comments

askhade (Contributor) commented Jan 19, 2021

Creating this issue to collect and track all the work items for ONNX 1.9 release.
The release is planned for March 2021.

@onnx/sig-archinfra , @onnx/sig-operators , @onnx/steering-committee , @sveta-levitan, @wschin , @chinhuang007 and community members, please add work items here so that we can scope the release.

Thanks!

Currently planned work items for 1.9:

  1. Removing Optimizers from onnx packages @daquexian
  2. Updates to checker and shape inference exception handling @askhade
  3. Selective schema loading @jcwchen
  4. Opset 14 updates
  5. Updates to external data helpers (add more options to control which tensors are serialized; add a new API to convert and serialize a model in one step, as opposed to the two steps required now) @annajung
  6. Enhance ONNX type-checker: currently it does not check type-constraints specified in the op-schema spec @jcwchen
  7. Enhance the sparse-tensor constants to allow block-sparse format? <if time permits; needs approval from community>
  8. Should we migrate sparse-tensor type from ONNXML to ONNX?
spandantiwari (Contributor) commented

For opset 14, two possibilities to consider are:

  1. Adding trilu op to spec.
  2. Standardize specification of optional inputs/outputs.

askhade (Contributor, Author) commented Jan 19, 2021

@spandantiwari: Thanks for the input. Can you elaborate on your second bullet? I don't understand what exact work needs to go in for this item.

gramalingam (Contributor) commented

Some possible enhancements (though I'm not sure whether they should be in 1.9; it depends on workload and prioritization):

  1. Enhance ONNX type-checker: currently it does not check type-constraints specified in the op-schema spec (unlike ORT).
  2. Enhance the sparse-tensor constants to allow block-sparse format?
  3. Should we migrate sparse-tensor type from ONNXML to ONNX?

jcwchen (Member) commented Jan 21, 2021

There is still an IR gap issue for the version_converter: #2873. Anyone interested is welcome to pick it up.

yan12125 pushed a commit to EMCLab-Sinica/Stateful-CNN that referenced this issue Jan 26, 2021
onnx.optimizer is going to be removed in March 2021 [1]. I may not have
time to finish the onnxoptimizer package before then.

[1] onnx/onnx#3225
daquexian (Member) commented

Is there any plan to support Python 3.9 in the ONNX 1.9 release?

neginraoof (Contributor) commented

Adding Trilu operator: #3291
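For readers unfamiliar with the proposal, Trilu keeps the upper or lower triangular part of a matrix and zeroes the rest, analogous to numpy's triu/tril. A plain-Python sketch of that semantics (an illustration, not the official op spec in #3291):

```python
def trilu(matrix, k=0, upper=False):
    """Keep the upper (j - i >= k) or lower (j - i <= k) triangle of a
    row-major matrix, zeroing everything else; mirrors numpy.triu/tril
    with diagonal offset k."""
    keep = (lambda i, j: j - i >= k) if upper else (lambda i, j: j - i <= k)
    return [[v if keep(i, j) else 0 for j, v in enumerate(row)]
            for i, row in enumerate(matrix)]

m = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]
print(trilu(m))              # [[1, 0, 0], [4, 5, 0], [7, 8, 9]]
print(trilu(m, upper=True))  # [[1, 2, 3], [0, 5, 6], [0, 0, 9]]
```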

BowenBao (Contributor) commented

Extending type support for math ops (Add, Sub, Mul, Div): #3334 (for HuggingFace/transfo_xl)

jcwchen (Member) commented Mar 16, 2021

@daquexian According to the discussion with the release manager of ONNX 1.9, Python 3.9 support is in the plan. Thanks.

@postrational postrational added this to the 1.9 milestone Mar 29, 2021
etusien commented Mar 31, 2021

Key Updates

  • Opset version 14
  • API
  • Infrastructure
  • Bug fixes

Notes

  • Be aware of the protobuf version gap issue (e.g., onnx built with protobuf>=3.12 is not compatible with older protobuf versions)

etusien commented Mar 31, 2021

In the comment above, I proposed draft notes for the upcoming 1.9 release. Please let me know about any mistakes or missing scope items. Thanks!

@askhade askhade closed this as completed Jun 29, 2021