Add algorithmic optimizer to convert Log(Softmax(x)) to LogSoftmax(x) #25455
Conversation
Force-pushed from 265b477 to d8797a5
@rmlarsen Could you PTAL and approve or suggest changes (if required)?
Force-pushed from d8797a5 to 8fb50c8
This PR adds an algorithmic optimizer which converts `Log(Softmax(x))` to `LogSoftmax(x)`. [`LogSoftmax`](https://www.tensorflow.org/api_docs/cc/class/tensorflow/ops/log-softmax) is numerically more stable and may be a bit faster in some cases. This could be expanded in the future to also optimize `log(softmax(x) * y) = logsoftmax(x) + log(y)` and `log(softmax(x) / y) = logsoftmax(x) - log(y)`.
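The stability gain comes from computing `logsoftmax(x)` as `x - logsumexp(x)` instead of literally exponentiating and then taking the log. A minimal NumPy sketch (not the PR's Grappler rewrite itself, just an illustration of why the fused form is preferable; function names are made up for the example):

```python
import numpy as np

def naive_log_softmax(x):
    # Direct composition Log(Softmax(x)): exp overflows for large inputs,
    # producing inf/inf = nan before the log is even taken.
    e = np.exp(x)
    return np.log(e / e.sum())

def stable_log_softmax(x):
    # Fused form x - logsumexp(x), with the usual max-shift so that
    # no exponential ever overflows.
    m = x.max()
    return x - (m + np.log(np.exp(x - m).sum()))

x = np.array([1000.0, 1001.0, 1002.0])
naive_log_softmax(x)    # all-nan for inputs this large
stable_log_softmax(x)   # finite, correct log-probabilities
```

For moderate inputs both forms agree to floating-point precision; the fused form only starts to win once `exp(x)` leaves the representable range, which is exactly the case the graph rewrite guards against.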
Force-pushed from 8fb50c8 to 597f02c
Rebased since #25300 introduced some merge conflicts.
@hgadig Any updates on this?
Waiting for the review. Also, this is assigned to rthadur and he will be taking care of this PR (FYI). Thanks!
Thanks! Sorry for the long review cycle.
PiperOrigin-RevId: 237148047
Thanks @ezhulenev, you've been really helpful with getting my fixes merged.