Generalize MinMax monotonic optimizer #25330
Conversation
This PR adds `Asin`, `Atan`, `Softsign` and `Softplus` to the collection of elementwise monotonic operations. This allows the arithmetic optimizer to optimize more cases. Continuation of tensorflow#25330
PiperOrigin-RevId: 231880254
This PR adds `Acos`, `Acosh`, `Asin`, `Atan`, `QuantizedRelu`, `QuantizedRelu6`, `QuantizedReluX`, `Softsign` and `Softplus` to the collection of elementwise monotonic operations. This allows the arithmetic optimizer to optimize more cases. Continuation of tensorflow#25330
This PR adds `Acos`, `Acosh`, `Asin`, `Atan`, `Softsign` and `Softplus` to the collection of elementwise monotonic operations. This allows the arithmetic optimizer to optimize more cases. Continuation of tensorflow#25330
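The idea behind the elementwise monotonic rewrite can be sketched in plain Python (this is a conceptual illustration, not the Grappler C++ implementation): for an elementwise, monotonically increasing function `f`, `Max(f(x)) == f(Max(x))`, so the reduction can be hoisted in front of the function and `f` then only has to be applied to a single value instead of the whole tensor.

```python
import math

def softplus(x):
    # Softplus log(1 + e^x) is monotonically increasing, so it qualifies
    # for the rewrite. (Numerically naive version, for illustration only.)
    return math.log1p(math.exp(x))

xs = [-2.0, 0.5, 1.0, 3.0]

# Unoptimized graph: apply softplus elementwise, then reduce with Max.
unoptimized = max(softplus(x) for x in xs)

# Rewritten graph: reduce first, then apply softplus once to the result.
optimized = softplus(max(xs))

assert math.isclose(unoptimized, optimized)
```

For monotonically decreasing functions the same idea applies with `Max` and `Min` swapped.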
Thanks for the fast merge! You might want to check out #25332 too, which adds support for more monotonic ops.
This PR adds support for max pooling operations to the algorithmic optimizer for monotonic functions. This follows up on tensorflow#25330 /cc @ezhulenev
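The max pooling case follows the same identity: since max pooling only takes window-wise maxima, a monotonically increasing elementwise function commutes with it, and applying the function after pooling touches far fewer elements. A minimal 1-D sketch in plain Python (hypothetical helper names; the real optimizer rewrites `MaxPool` nodes in the graph):

```python
def relu(x):
    # ReLU is monotonically non-decreasing, so it commutes with max pooling.
    return max(x, 0.0)

def max_pool_1d(xs, window):
    # Simple 1-D max pooling with stride equal to the window size.
    return [max(xs[i:i + window]) for i in range(0, len(xs), window)]

xs = [-1.0, 2.0, 0.5, -3.0, 4.0, 1.0]

# Unoptimized: elementwise ReLU over the full input, then pooling.
a = max_pool_1d([relu(x) for x in xs], window=2)

# Rewritten: pool first, then apply ReLU to the (smaller) pooled output.
b = [relu(x) for x in max_pool_1d(xs, window=2)]

assert a == b
```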
@ezhulenev Can you tell me the reason why this PR was reverted in 0507dba? It would be great to get feedback so I can fix it.
@lgeiger it failed in https://gist.github.com/ezhulenev/0b8ab39e1533933197a61f23675e719b
Sorry, I didn't have time to dig into it myself; I had to roll it back to unblock other teams. Probably something simple.
No worries, thanks for the link. I'll take a look.
This PR generalizes the MinMax monotonic optimizer to support `SegmentMax`, `UnsortedSegmentMax` and `ArgMax` operations. This will improve the performance of such operations when they follow a monotonic function.
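The `ArgMax` case is slightly stronger than the reduction case: because a monotonically increasing function preserves the ordering of its inputs, it also preserves the position of the maximum, so `ArgMax(f(x))` can be rewritten as `ArgMax(x)` and `f` dropped entirely. A plain-Python sketch of the identity (not the Grappler implementation):

```python
import math

def argmax(xs):
    # Index of the largest element, mirroring ArgMax over a 1-D tensor.
    return max(range(len(xs)), key=lambda i: xs[i])

xs = [0.1, -2.0, 3.5, 1.2]

# atan is monotonically increasing, so it preserves the argmax position:
# the elementwise function can be removed from the graph entirely.
assert argmax([math.atan(x) for x in xs]) == argmax(xs)
```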