How do I adjust the learning rate? Setting it in the yml file has no effect; it stays at 0.0005 #72
Comments
Adam is an adaptive optimization algorithm; the learning rate is adjusted internally, and the value shown is only the base learning rate. To use SGD with momentum instead: (1) define a piecewise decay schedule, e.g. `def PiecewiseDecay(params):`; (2) then modify the optimizer configuration in the yml file.
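The piecewise decay mentioned above can be sketched as a small pure-Python helper (hypothetical names and boundary values, not the actual PaddlePaddle implementation): the learning rate drops to the next value each time training passes a step boundary.

```python
def piecewise_decay(step, boundaries, values):
    """Return the learning rate for a given global step.

    boundaries: sorted step counts where the LR changes, e.g. [10000, 20000]
    values: one more entry than boundaries, e.g. [0.0005, 0.0001, 0.00002]
    """
    # Walk the boundaries in order; the first boundary the step has not
    # yet reached determines the current learning rate.
    for boundary, value in zip(boundaries, values):
        if step < boundary:
            return value
    # Past the last boundary, stay at the final value.
    return values[-1]

# Example: base LR 0.0005, decayed at steps 10000 and 20000.
print(piecewise_decay(500, [10000, 20000], [0.0005, 0.0001, 0.00002]))    # 0.0005
print(piecewise_decay(15000, [10000, 20000], [0.0005, 0.0001, 0.00002]))  # 0.0001
print(piecewise_decay(30000, [10000, 20000], [0.0005, 0.0001, 0.00002]))  # 0.00002
```

Note that with this scheme the log will print the base value (0.0005) until the first boundary is reached, which may explain why no change is visible early in training.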
Can the Adam base learning rate really not be adjusted? No matter what I set, the log always shows 0.0005.
Is `parameter_list` passed in as None at initialization?
… support hybrid CPU and NPU (PaddlePaddle#72) test=develop
If the adjustment is dynamic, why is the LR still 0.0005 after running for a full day and night? I don't see any change in the log.