Summary: We can improve the numerical stability/accuracy of the `calc_loss` method.
The current implementation uses the following:
```python
def calc_loss(y, t):
    y = torch.nn.functional.log_softmax(y)
    loss = torch.nn.functional.nll_loss(
        y, t, weight=None, reduction='mean')
    return loss
```
PyTorch includes a single functional, `cross_entropy`, that is numerically more stable. It would also simplify the above code to:
```python
def calc_loss(y, t):
    loss = torch.nn.functional.cross_entropy(
        y, t, weight=None, reduction="mean")
    return loss
```
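As a quick sanity check (a sketch added here, not from the issue; the shapes and seed are assumed), the two formulations agree on ordinary logits:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
y = torch.randn(8, 5)          # logits: batch of 8, 5 classes (assumed shapes)
t = torch.randint(0, 5, (8,))  # integer class targets

old = F.nll_loss(F.log_softmax(y, dim=1), t, reduction="mean")
new = F.cross_entropy(y, t, reduction="mean")
print(torch.allclose(old, new))  # True: same value on well-scaled logits
```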
Fix nimarb#23: Improve `calc_loss`'s numerical stability using cross entropy. (commit `8c7d277`)
I thought `cross_entropy` just combines `log_softmax` and `nll_loss`.
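That matches the PyTorch docs: `cross_entropy` combines `log_softmax` and `nll_loss`, so the stability gain is over a naive `log(softmax(x))`, not over that composition. A minimal illustration (mine, with made-up extreme logits):

```python
import torch
import torch.nn.functional as F

y = torch.tensor([[1000.0, 0.0]])  # extreme logits (illustrative values)
t = torch.tensor([1])

# Naive log(softmax(x)): softmax underflows to 0 for the small class,
# so log(0) = -inf and the loss blows up.
naive = F.nll_loss(torch.log(torch.softmax(y, dim=1)), t)

# The log-sum-exp path inside cross_entropy stays finite.
fused = F.cross_entropy(y, t)

print(naive)  # tensor(inf)
print(fused)  # tensor(1000.)
```

Note that the issue's current `log_softmax` + `nll_loss` path also stays finite here, which is the point of the comment above; the fused call is mainly a simplification.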