
Getting NaN after a few iterations #8

Closed
bernardohenz opened this issue Dec 30, 2017 · 3 comments

Comments

@bernardohenz

Hello,

I am using your SSIM implementation as a part of my total objective for performing denoising. Unfortunately, after a few iterations, I started getting NaN in the objective (this does not happen if I remove the SSIM from the loss).

I am wondering whether there is any case where your implementation might divide by zero, or anything else that could produce NaN. Just from reading the code, I can't see why this would happen (since C1 and C2 are there to avoid division by zero).

Any thoughts?
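For reference, the stabilizing role of C1 and C2 mentioned above can be seen in a scalar sketch of the standard SSIM formula (this is an illustration, not the repository's actual code; the constants assume the usual K1=0.01, K2=0.03 and a dynamic range of 1):

```python
def ssim_scalar(mu1, mu2, var1, var2, cov, C1=0.01 ** 2, C2=0.03 ** 2):
    """Scalar SSIM between two patches given their means, variances,
    and covariance. C1 and C2 keep both factors of the denominator
    strictly positive."""
    num = (2 * mu1 * mu2 + C1) * (2 * cov + C2)
    den = (mu1 ** 2 + mu2 ** 2 + C1) * (var1 + var2 + C2)
    return num / den

# Two identical constant patches: every variance/covariance term is zero,
# so without C1 and C2 this would be 0/0; with them it is exactly 1.
print(ssim_scalar(mu1=0.5, mu2=0.5, var1=0.0, var2=0.0, cov=0.0))
```

So the forward pass itself is guarded against division by zero, which is consistent with the NaN coming from somewhere else in the pipeline.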

@Po-Hsun-Su
Owner

Can you post a snippet that reproduces the problem? The NaN objective could be caused by an exploding gradient, which may or may not come from SSIM.
Discussion on debugging NaNs: https://discuss.pytorch.org/t/solved-debugging-nans-in-gradients/10532
Also try running the latest PyTorch; there could be a bug in PyTorch itself, like this one: pytorch/pytorch#2421.
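As a minimal illustration of how an exploding value turns into NaN rather than staying infinite (plain Python floats, no PyTorch; the same IEEE-754 rules apply to tensor math):

```python
import math

# A value that grows without bound overflows to inf...
x = 1e308 * 10          # overflows: x is inf
# ...and once two infinities meet in a subtraction, as happens inside
# many loss terms, the result is NaN, which then poisons every later step.
diff = x - x            # inf - inf is NaN, not 0
print(x, diff)
```

This is why a NaN loss often points to an exploding quantity upstream rather than a literal division by zero.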

@bernardohenz
Author

Hello, I've solved it. The problem was that I was using several InstanceNormalization layers, and when the network got a patch of constant color (whose pixel variance is close to 0), the normalization (dividing by the standard deviation of the colors) returned very large numbers, which eventually exploded.

So the problem was really due to some bad data in the training set. The hard part of debugging this was that it was not a straightforward division-by-zero case, but rather operations returning exploding values.
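A pure-Python sketch of the failure mode described above (not the actual network code; the patch values are made up for illustration). Dividing by the standard deviation of a nearly constant patch amplifies a tiny deviation by a huge factor, and an exactly constant patch with no numerical guard divides by zero outright:

```python
import math

def instance_norm_1d(xs, eps=0.0):
    """Normalize a flat list of pixel values: (x - mean) / sqrt(var + eps).
    eps=0 mimics a normalization with no numerical guard; real layers such
    as torch.nn.InstanceNorm2d use a small eps (1e-5 by default)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return [(x - mean) / math.sqrt(var + eps) for x in xs]

# A nearly constant 4x4 patch: 15 pixels at 0.5, one off by 1e-7.
patch = [0.5] * 15 + [0.5 + 1e-7]
normed = instance_norm_1d(patch)

# The 1e-7 deviation is blown up to order 1: an amplification of ~1e7,
# and gradients through the layer scale by the same 1/std factor.
amplification = max(abs(v) for v in normed) / 1e-7
print(f"amplification ~ {amplification:.2e}")

# With an *exactly* constant patch, var == 0 and eps == 0 divides by zero.
try:
    instance_norm_1d([0.5] * 16)
except ZeroDivisionError:
    print("constant patch -> division by zero")
```

Even with a nonzero eps, the 1/std factor on near-constant patches can make gradients large enough to blow up training, which matches the behavior reported here.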

@Po-Hsun-Su
Owner

Cool! I'm closing this issue since you solved it.
