
ENH: improve scipy.special.log_softmax accuracy in edge cases by a factor of 2**126 to 2**1022 #19549

Open
wants to merge 2 commits into main

Commits on Jan 19, 2024

  1. ENH: improve scipy.special.log_softmax accuracy

    By taking advantage of the fact that `x - x_max` is 0 at the
    maximum and that `exp(0)` is 1, we can use `log1p` instead of `log` to
    increase the accuracy of `log_softmax` at the maximum index by a factor
    of about `2**126` (for float32) or about `2**1022` (for float64). A
    sketch of this idea follows the commit list below.
    
    Fixes scipy#19521
    JasonGross committed Jan 19, 2024
    e573855
  2. 1c0dca0
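
The first commit's trick can be sketched as follows. This is a hypothetical NumPy illustration of the `log1p` idea, not the actual diff from this PR; the function name `log_softmax_log1p` and the tie-breaking via `argmax`/`put_along_axis` are assumptions made for the example.

```python
import numpy as np

def log_softmax_log1p(x, axis=-1):
    """Hypothetical sketch of the log1p trick described above (not the PR's diff)."""
    x = np.asarray(x, dtype=float)
    x_max = np.max(x, axis=axis, keepdims=True)
    shifted = x - x_max                    # exactly 0 at the maximum entry
    exp_shifted = np.exp(shifted)          # the maximum contributes exp(0) == 1

    # Remove exactly one maximum term from the sum; log1p supplies its "1"
    # implicitly, so a sum very close to 1 keeps full relative precision.
    idx = np.expand_dims(np.argmax(shifted, axis=axis), axis)
    np.put_along_axis(exp_shifted, idx, 0.0, axis=axis)

    sum_others = np.sum(exp_shifted, axis=axis, keepdims=True)
    return shifted - np.log1p(sum_others)

# At the maximum index the naive log(sum(exp(shifted))) rounds to exactly 0.0,
# while log1p keeps the tiny (correct) negative value:
print(log_softmax_log1p(np.array([0.0, -50.0])))   # ≈ [-1.93e-22, -50.0]
```

Only one of possibly several tied maxima is dropped from the sum, since `log1p` adds back exactly one implicit `1`; any other tied entries still contribute their `exp(0) == 1` terms to `sum_others`.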