Add bfloat16 type to a few ops missing it #3960
Conversation
Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>
LGTM. Thanks!
LeakyRelu takes input data (Tensor&lt;T&gt;) and an argument alpha, and produces one
output data (Tensor&lt;T&gt;) where the function `f(x) = alpha * x for x < 0`,
`f(x) = x for x >= 0`, is applied to the data tensor elementwise.

**History**
- Version 16 adds bfloat16 to the types allowed.
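The piecewise definition above can be sketched in NumPy. This is only an illustrative sketch of the formula, not the ONNX reference implementation; the function name and the default alpha of 0.01 are assumptions for the example.

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # Elementwise: f(x) = alpha * x for x < 0, f(x) = x for x >= 0
    return np.where(x >= 0, x, alpha * x)

# Negative entries are scaled by alpha; non-negative entries pass through.
out = leaky_relu(np.array([-2.0, 0.0, 3.0]), alpha=0.1)
```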
I like providing a history section, which makes the change clearer. Just out of curiosity: why do only LeakyRelu and PRelu get a history section?
Good question! For the control-flow ops especially, I was thinking about another issue: they have very long documentation, and we end up copy-pasting it when we update the op even though the documentation doesn't change much, which presumably also increases binary size. A slightly better solution might be needed if we want to avoid that kind of growth. Something to think about. Thanks!
Oh my goodness, a history section in the actual op would be so useful. I spend so much time each ONNX update digging through dozens of PRs and comparing Changelog.md entries to see what actually changed between ops for a given version.
@fdwr : I agree it is cumbersome to find out which ops were updated. In the short term, you can refer to the release wiki pages that we create to track op validation. This one is from the last release: https://github.com/onnx/onnx/wiki/Logistics-for-ONNX-Release-1.10#changelog
We will create a similar one for 1.11 soon. The table has a short description of what changed and a link to the onnx PR that made the update.
@askhade : 👍 That wiki looks nice. I had read the generic notes on the Release page, but the wiki has more detail beyond just the PR title and a link.
Description
Add bfloat16 type to a few ops missing it (LessOrEqual, GreaterOrEqual, Loop, If, Scan, LeakyRelu, PRelu).
See issue 3842: #3842