
The loss is NaN #14

Open
cena001plus opened this issue Jun 15, 2021 · 4 comments


@cena001plus
cena001plus commented Jun 15, 2021

I have tried different learning rates, from 0.00001 to 0.1, but the loss is still NaN. Why?

@LuoXubo

LuoXubo commented Aug 16, 2021

Same problem here. Have you solved it?

@GHGluck

GHGluck commented Mar 31, 2022

I also had the same problem.

@hli1221
Owner

hli1221 commented Apr 4, 2022

Hi, thank you very much for reporting this problem.
Normally, the pixel loss should decrease as the number of iterations increases. Here, however, the pixel loss was increasing. I suggest you check the input and the output; this problem can occur when they are not in the same domain (for example, one in [0, 1] and the other in [0, 255]).
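
For later readers, here is a minimal sketch of that check, assuming a PyTorch training loop. The function name `check_same_domain` and the dummy tensors are illustrative, not part of this repository; adapt the arguments to your own batch, network output, and loss.

```python
import torch

def check_same_domain(inputs, outputs, loss):
    # The network input and its output should share one value range
    # (e.g. both in [0, 1] or both in [0, 255]); if they differ, the
    # pixel loss can grow during training instead of shrinking.
    print(f"input  range: [{inputs.min().item():.3f}, {inputs.max().item():.3f}]")
    print(f"output range: [{outputs.min().item():.3f}, {outputs.max().item():.3f}]")
    # Fail fast as soon as the loss stops being finite, while the
    # offending batch is still available for inspection.
    if not torch.isfinite(loss):
        raise RuntimeError(f"loss became non-finite: {loss.item()}")

# Dummy tensors standing in for a real batch; note the range mismatch.
check_same_domain(torch.rand(2, 1, 64, 64),          # input in [0, 1]
                  torch.rand(2, 1, 64, 64) * 255.0,  # output in [0, 255]
                  torch.tensor(0.5))
```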

@GHGluck

GHGluck commented Apr 5, 2022

Thank you for your reply. I tried modifying the input data and found that training sometimes succeeds, but when I add more data the loss becomes NaN again. I don't know whether it is because the network cannot fit my training data.
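
If the NaN only appears once the dataset grows, one way to localize it is a sketch like the one below. `torch.autograd.set_detect_anomaly` is a real PyTorch utility; the `nn.Linear` model and the injected NaN are stand-ins for the actual fusion network and a hypothetical corrupted sample.

```python
import torch
import torch.nn as nn

# Makes backward() raise at the first operation that produces NaN,
# with a traceback pointing into the forward pass.
torch.autograd.set_detect_anomaly(True)

model = nn.Linear(4, 1)            # stand-in for the actual fusion network
criterion = nn.MSELoss()

x = torch.randn(8, 4)
x[0, 0] = float("nan")             # simulate one corrupted training sample

loss = criterion(model(x), torch.zeros(8, 1))
if not torch.isfinite(loss):
    # Report which samples in the batch are non-finite before backward().
    bad = (~torch.isfinite(x).all(dim=1)).nonzero().flatten().tolist()
    print("non-finite inputs at batch indices:", bad)
else:
    loss.backward()
```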
