Using Python 3.7.6 and PyTorch 1.4.0, loss is always NaN #2
My environment: Python 3.6.9 (default, Nov 7 2019, 10:44:02)
>>> torch.__version__
'1.2.0'
I will close for now, feel free to reopen.
I also met the same problem on one server (…).
I added a fix 4 days ago to improve the stability. Can you try again now?
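For context, the usual way to stabilize this kind of softmax-style contrastive loss is to subtract the per-row maximum logit before exponentiating. The sketch below illustrates that technique; it is an assumption about the nature of the fix, not necessarily the exact change that was pushed.

```python
import torch

def masked_log_prob(anchor_dot_contrast, logits_mask):
    # Subtract the per-row max so the largest exponent is exactly 0,
    # keeping exp() away from overflow (inf) that later turns into NaN.
    logits_max, _ = torch.max(anchor_dot_contrast, dim=1, keepdim=True)
    logits = anchor_dot_contrast - logits_max.detach()

    # logits_mask is assumed to be a {0, 1} tensor that zeroes out
    # self-contrast terms, as in a supervised contrastive loss.
    exp_logits = torch.exp(logits) * logits_mask
    return logits - torch.log(exp_logits.sum(1, keepdim=True))
```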
@HobbitLong Thank you for your kindness. The problem vanished several days ago, even though I do not know why. When I use the newest version of your code, it also runs well. By the way, I think for the newest version, …
Ah, nice catch! Just pushed a fix. Thanks for spotting this!
To me, the problem lies in these two lines:
exp_logits = torch.exp(logits) * logits_mask
log_prob = logits - torch.log(exp_logits.sum(1, keepdim=True))
my …
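To make the failure mode concrete: if exp_logits.sum(1) underflows to 0 the log is -inf, and if the raw logits are large enough for exp() to overflow, the inf * 0 from the mask is already NaN. A minimal, self-contained illustration with made-up values (not code from the repository):

```python
import torch

logits_mask = torch.tensor([[0.0, 1.0, 1.0]])  # self-contrast term masked out

# Underflow: exp() of very negative logits is 0, so the masked row sum is 0
# and log(0) = -inf; the resulting inf turns into NaN as soon as it meets a
# 0-valued entry of the positive-pair mask later in the loss.
logits = torch.tensor([[-1000.0, -1000.0, -1000.0]])
exp_logits = torch.exp(logits) * logits_mask
print(logits - torch.log(exp_logits.sum(1, keepdim=True)))  # [[inf, inf, inf]]

# Overflow: exp() of very large logits is inf, and inf * 0 from the mask is
# already NaN, so the whole row becomes NaN.
logits = torch.tensor([[1000.0, 500.0, 200.0]])
exp_logits = torch.exp(logits) * logits_mask
print(logits - torch.log(exp_logits.sum(1, keepdim=True)))  # [[nan, nan, nan]]
```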
Thanks man, you saved my day!
One method to avoid exp_logits becoming 0, or the loss becoming NaN, is to normalize your feature vectors before passing them to the loss function. Hope it helps you guys!
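A minimal sketch of that suggestion, assuming features shaped [batch, n_views, dim] and using torch.nn.functional.normalize for the L2 normalization (the criterion call is only a placeholder):

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes: batch of 8 samples, 2 views, 128-dim features.
features = torch.randn(8, 2, 128)

# L2-normalize along the feature dimension so every pairwise dot product
# lies in [-1, 1]; divided by the temperature, the logits stay bounded and
# exp() stays away from both 0 and inf.
features = F.normalize(features, dim=-1)

# loss = criterion(features, labels)  # placeholder for the contrastive loss call
```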
Haven't tested on Python 2