The hyper parameters of `paddle.optimizer` do not work in the v2 API. #2042
Comments
This bug is hard to fix because we split the model configuration into two parts: the topology configuration and the optimizer settings. When we configure and parse the topology, no optimizer information has been set yet. Here is a step-by-step solution to this issue.
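For context, a minimal sketch of how the v2 API splits these two phases, assuming the usual sentiment-demo style (the dictionary size and layer sizes are illustrative). Note that `paddle.parameters.create()` parses the topology and builds the parameter configs before the optimizer object even exists:

```python
import paddle.v2 as paddle

paddle.init(use_gpu=False, trainer_count=1)

# Phase 1: topology configuration. paddle.parameters.create() parses the
# network and builds the parameter protos from the topology alone, so no
# optimizer settings (momentum, decay_rate, ...) are available here.
data = paddle.layer.data(
    name='word', type=paddle.data_type.integer_value_sequence(5147))
emb = paddle.layer.embedding(input=data, size=128)
pool = paddle.layer.pooling(input=emb, pooling_type=paddle.pooling.Max())
output = paddle.layer.fc(input=pool, size=2,
                         act=paddle.activation.Softmax())
label = paddle.layer.data(name='label',
                          type=paddle.data_type.integer_value(2))
cost = paddle.layer.classification_cost(input=output, label=label)
parameters = paddle.parameters.create(cost)

# Phase 2: optimizer settings, created only after the parameter protos
# already exist. The hyper parameters set here never reach them.
optimizer = paddle.optimizer.Momentum(momentum=0.9, learning_rate=2e-3)
trainer = paddle.trainer.SGD(cost=cost, parameters=parameters,
                             update_equation=optimizer)
```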
The second step is a little bit hard to implement and may require changing the C++ core of Paddle.
It's not just ...
The solution might be as follows. The global ... And then we can get the global ...
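This comment is truncated, but it appears to propose keeping the optimizer settings in a module-level global that the topology parser can read. A minimal sketch of that pattern; every name in it (`_g_optimizer_settings`, `set_default_optimizer`, `parse_topology`) is hypothetical:

```python
# All names here are hypothetical; this only illustrates the
# "global settings" pattern the truncated comment seems to describe.
_g_optimizer_settings = None  # module-level global holding the settings


def set_default_optimizer(optimizer):
    # Record the optimizer settings before the topology is parsed.
    global _g_optimizer_settings
    _g_optimizer_settings = optimizer


def parse_topology(cost):
    # At parse time the parameter configs can now read hyper parameters
    # (momentum, decay_rate, ...) from the global.
    if _g_optimizer_settings is None:
        raise RuntimeError("set the optimizer before parsing the topology")
    # ... fill each parameter's config from _g_optimizer_settings ...
```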
This problem has been fixed by PR #2288.
This problem is not solved yet, so I am reopening it.
Closed, since the v2 API has fixed this issue.
The hyper parameters in `paddle.optimizer` do not work in the v2 API. For example, use the momentum optimizer in the sentiment demo as follows.
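The original snippet is not preserved in this copy; a configuration along these lines, in the v2 sentiment-demo style, reproduces the report (the concrete rates are illustrative):

```python
import paddle.v2 as paddle

# Momentum with L2 regularization, in the style of the v2 sentiment
# demo; the rates here are illustrative, not the original ones.
optimizer = paddle.optimizer.Momentum(
    momentum=0.9,
    learning_rate=2e-3,
    regularization=paddle.optimizer.L2Regularization(rate=8e-4))
```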
Then print the proto-string of the config before this line in `python/paddle/v2/trainer.py`; you will find that the proto-string of the parameters does not contain the hyper parameters, such as L2 regularization and momentum. The momentum is 0 if you print it before this line in `paddle/parameter/FirstOrderOptimizer.h`. The proto-string of the parameters is as follows. But the correct proto-string of the parameters should contain `decay_rate` and `momentum`, as follows.
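Neither dump is preserved in this copy of the report. Based on the fields named above, the contrast would have looked roughly like the following, where the parameter name and all values are illustrative:

```
# Buggy dump: no hyper parameters attached (illustrative values)
name: "embedding.w0"
size: 658816
initial_mean: 0.0
initial_std: 0.01

# Expected dump: decay_rate and momentum carried over from the optimizer
name: "embedding.w0"
size: 658816
initial_mean: 0.0
initial_std: 0.01
decay_rate: 0.0008
momentum: 0.9
```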