Handle None gradients in nn.utils.clip_grad_norm #5650
Comments
What this says is
Well, I understand that; however, this seems like a problem with PyTorch, as the people who have used the repo by following the commands provided didn't hit this error. It's possibly a recent problem with PyTorch.
The last commit on that page is from Jan 24, 2017. PyTorch has definitely changed a lot since then. If you have specific questions about how to use PyTorch, please ask on our forums: https://discuss.pytorch.org/
Closed via @zou3519's comment.
I think the error is still legitimate. We should handle None gradients.
@apaszke I think we do handle None. The traceback @monajalal posted implies that the code uses its own clip_grad_norm implementation rather than the one in torch.nn.utils.
@monajalal If you replace
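The behavior discussed above, skipping parameters whose gradient is None instead of crashing, can be sketched in plain Python. The `Param` class and `clip_grad_norm` function below are hypothetical stand-ins for illustration, not PyTorch's actual implementation:

```python
import math

class Param:
    """Minimal stand-in for a framework parameter (hypothetical, for illustration)."""
    def __init__(self, grad):
        # grad is a list of floats, or None if the parameter was unused
        # in the forward pass and never received a gradient.
        self.grad = grad

def clip_grad_norm(parameters, max_norm):
    """Scale gradients in place so their global L2 norm is at most max_norm.

    Parameters whose .grad is None are skipped, rather than raising an error.
    Returns the total norm computed before clipping.
    """
    grads = [p.grad for p in parameters if p.grad is not None]
    total_norm = math.sqrt(sum(g * g for grad in grads for g in grad))
    clip_coef = max_norm / (total_norm + 1e-6)
    if clip_coef < 1:
        for grad in grads:
            for i in range(len(grad)):
                grad[i] *= clip_coef
    return total_norm

# The second parameter has no gradient; with the filter above this is fine.
params = [Param([3.0, 4.0]), Param(None)]
norm = clip_grad_norm(params, max_norm=1.0)
print(norm)  # total norm before clipping: 5.0; grads now scaled to unit norm
```

The key line is the list comprehension with `if p.grad is not None`: without that filter, iterating over all parameters would fail on the unused one.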
I get this error for the train.py file in https://github.com/vanzytay/pytorch_sentiment_rnn. I have followed all the steps in the README up to this point. What do you think should be fixed?