
Loss cannot converge #32

Open
longpeng2008 opened this issue Mar 21, 2017 · 4 comments

@longpeng2008

Using all the default settings, the loss does not converge at all.

@tfzhou

tfzhou commented Oct 13, 2017

The loss does not converge for me either. However, the results keep getting better during training. Why is that?

@mhl20110088

I think the most significant cause is the network's loss layer, so try modifying it.
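For anyone who wants to experiment with the loss layer: a common fix in per-pixel segmentation/edge-detection setups is to weight the cross-entropy by inverse class frequency, so the loss is not dominated by the (much larger) background class. The sketch below is only an illustration of that idea, not this repo's actual layer; the function name, shapes, and the NumPy implementation are all my own assumptions.

```python
import numpy as np

def balanced_sigmoid_xent(logits, labels, eps=1e-12):
    """Class-balanced per-pixel sigmoid cross-entropy (hypothetical sketch).

    logits: raw network outputs, any shape
    labels: binary ground truth of the same shape (1 = positive pixel)
    """
    probs = 1.0 / (1.0 + np.exp(-logits))   # element-wise sigmoid
    num_pos = labels.sum()
    num_neg = labels.size - num_pos
    beta = num_neg / labels.size            # weight on the rarer positive class
    # Weighted binary cross-entropy, averaged over all pixels.
    loss = -(beta * labels * np.log(probs + eps)
             + (1.0 - beta) * (1.0 - labels) * np.log(1.0 - probs + eps))
    return loss.mean()
```

Averaging (rather than summing) over pixels also keeps the loss magnitude comparable across images of different sizes, which makes the training curve easier to read.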

@chaobeiying

How did you solve the non-convergence problem? I ran into the same issue. Thank you.

@falreis

falreis commented Nov 9, 2018

I'm facing the same problem. Does the loss eventually converge?
