Loss explosion question #3
Comments
Hello, sure. Try decreasing the LR scheduler gamma a bit first, from 0.95 to 0.9, with the default initial LR (1e-3). I used 0.95, and it was borderline between normal training and training with loss explosion. Please report whether it helps.
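For context, the gamma discussed here is the decay factor of an exponential LR schedule (in PyTorch, `torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=...)`). A minimal pure-Python sketch of why a lower gamma helps; the epoch count and helper name are illustrative, not from the repo:

```python
def exponential_lr(initial_lr: float, gamma: float, epoch: int) -> float:
    """Learning rate after `epoch` decay steps of an exponential schedule."""
    return initial_lr * gamma ** epoch

initial_lr = 1e-3  # default initial LR mentioned in the thread

# After 100 epochs, gamma=0.9 yields a far smaller LR than gamma=0.95,
# which is why a lower gamma can tame late-training loss explosions.
lr_095 = exponential_lr(initial_lr, 0.95, 100)
lr_090 = exponential_lr(initial_lr, 0.90, 100)
print(lr_095, lr_090)
```

The decay compounds per epoch, so even a small change in gamma makes a large difference late in training.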
Thank you for your advice, I will try it and report back!
Any updates, @junkeon?
Following your advice, I decreased gamma over a range from 0.9 down to 0.5, and the loss exploded in every case.
Thank you for your feedback. And what was your batch size?
Sorry for the late reply to the previous feedback...
No, no, thank you very much, that's enough information. I'm just trying to figure out why I had no loss explosion problems with the default settings on my server. However, I am glad that you finally managed to finish training. Since the issue is solved, I'm closing it.
Hello, thanks for your work. It has been very helpful to me.
I have a question about loss explosion.
I tried to train on the LJSpeech data you used with the default settings (lr = 1e-3), and I ran into NaN losses.
So I reduced the lr to 5e-4; then there was no NaN loss, but the loss still exploded (normal loss: < 0.07, exploded loss: > 732M).
I know there is code to prevent loss explosion, such as the LR schedule and gradient clipping, but it is not working.
Can you help me?
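The safeguards mentioned in the question (handling NaN losses and clipping gradients) can be sketched in plain Python. The helpers below are illustrative only; `clip_grad_norm` mirrors what `torch.nn.utils.clip_grad_norm_` does for parameter tensors, and `safe_step` is a hypothetical guard, not code from the repo:

```python
import math

def clip_grad_norm(grads, max_norm):
    """Rescale gradients so their global L2 norm is at most max_norm.

    This mirrors torch.nn.utils.clip_grad_norm_; here `grads` is just a
    flat list of floats for illustration.
    """
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / (total_norm + 1e-6)
        grads = [g * scale for g in grads]
    return grads, total_norm

def safe_step(loss, grads, max_norm=1.0):
    """Skip the update entirely when the loss is non-finite (NaN/inf)."""
    if not math.isfinite(loss):
        return None  # skip this batch instead of poisoning the weights
    clipped, _ = clip_grad_norm(grads, max_norm)
    return clipped
```

For example, gradients `[3.0, 4.0]` have norm 5.0, so with `max_norm=1.0` they are scaled down by roughly 5x, while a NaN loss causes the step to be skipped. Note that clipping bounds each step but cannot rescue training once the weights already contain NaN or inf values.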