
Add bfloat16 type to a few ops missing it #3960

Merged (4 commits into onnx:main, Jan 26, 2022)

Conversation

@gramalingam (Contributor) commented Jan 21, 2022

Description
Add bfloat16 type to a few ops missing it (LessOrEqual, GreaterOrEqual, Loop, If, Scan, LeakyRelu, PRelu).

See issue 3842: #3842
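For background (not part of the PR itself): a bfloat16 value keeps float32's 8-bit exponent but truncates the significand to 7 bits, so it is essentially the upper 16 bits of the IEEE-754 float32 encoding. A minimal round-trip sketch in pure Python; function names are illustrative, and the round-to-nearest-even step is a common convention rather than anything specified by this PR:

```python
import struct

def float32_to_bfloat16_bits(x: float) -> int:
    """Encode x as bfloat16 by keeping the upper 16 bits of its float32 encoding."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    lower = bits & 0xFFFF
    upper = bits >> 16
    # Round to nearest, ties to even, on the discarded lower half.
    if lower > 0x8000 or (lower == 0x8000 and (upper & 1)):
        upper += 1
    return upper & 0xFFFF

def bfloat16_bits_to_float32(b: int) -> float:
    """Decode bfloat16 bits by zero-extending them back to a float32."""
    (x,) = struct.unpack("<f", struct.pack("<I", (b & 0xFFFF) << 16))
    return x
```

Values whose float32 encoding already fits in the upper 16 bits (powers of two, small integers) round-trip exactly; everything else loses low significand bits.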

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>
@gramalingam gramalingam requested review from a team as code owners January 21, 2022 23:42
@gramalingam gramalingam added this to the 1.11 milestone Jan 21, 2022
@gramalingam gramalingam added the operator Issues related to ONNX operators label Jan 21, 2022
@jcwchen (Member) left a comment

LGTM. Thanks!

LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one
output data (Tensor<T>) where the function `f(x) = alpha * x for x < 0`,
`f(x) = x for x >= 0`, is applied to the data tensor elementwise.

**History**
- Version 16 adds bfloat16 to the types allowed.
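The definition quoted above maps directly to an elementwise function. A minimal pure-Python sketch for illustration only, operating on a flat list rather than a tensor (the default `alpha` of 0.01 matches the op's documented default):

```python
def leaky_relu(data, alpha=0.01):
    """Apply f(x) = alpha * x for x < 0, f(x) = x for x >= 0, elementwise."""
    return [x if x >= 0 else alpha * x for x in data]
```

For example, `leaky_relu([-2.0, 0.0, 3.0], alpha=0.1)` yields `[-0.2, 0.0, 3.0]`.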
Member

I like that you provide a history section; it makes the change clearer. Just out of curiosity: why do only LeakyRelu and PRelu add a history section?

Contributor (Author)

Good question! For the control-flow ops especially, I was thinking about another issue: they have very long documentation, and we end up copy-pasting it whenever we update the op even though the documentation barely changes, which presumably also increases binary size. A slightly better solution might be needed if we want to avoid that kind of growth. Something to think about. Thanks!

@fdwr (Contributor)

Oh my goodness, a history section in the actual op would be so useful. Each ONNX update I spend so much time digging through dozens of PRs and comparing Changelog.md entries to see what actually changed in a given op between versions.

@askhade (Contributor) commented Jan 26, 2022

@fdwr: I agree it is cumbersome to find out about op updates. In the short term, you can refer to the release wiki pages we create to track op validation. This one is from the last release: https://github.com/onnx/onnx/wiki/Logistics-for-ONNX-Release-1.10#changelog
We will create a similar one for 1.11 soon. That table has a short description of what changed and a link to the ONNX PR that made the update.

@fdwr (Contributor)

@askhade : 👍 That wiki looks nice. I read the generic notes on the Release page, but the wiki has more notes beyond just the PR title and a link.

@gramalingam gramalingam linked an issue Jan 26, 2022 that may be closed by this pull request
@askhade askhade enabled auto-merge (squash) January 26, 2022 19:40
@askhade askhade merged commit 50a1981 into onnx:main Jan 26, 2022
@gramalingam gramalingam deleted the bfloat-ops branch April 15, 2022 19:49
Labels: operator (Issues related to ONNX operators)
Project status: Done
Merging this pull request may close: Add bfloat16 to all relevant ops (#3842)
4 participants