Fix v2 optimizer define order #4972

Merged: 3 commits merged into PaddlePaddle:develop on Oct 23, 2017

Conversation

typhoonzero (Contributor) commented Oct 20, 2017:

Fix #2621
Fix #2563

parameter.initial_smart = cp.g_default_initial_smart
if parameter.num_batches_regularization == 1 and cp.g_default_num_batches_regularization:
    parameter.num_batches_regularization = cp.g_default_num_batches_regularization
if parameter.gradient_clipping_threshold == 0.0 and cp.g_default_gradient_clipping_threshold:
Contributor:
Line exceeds 80 characters.

Contributor (author):
Done.
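For context, the fix re-applies the config parser's global defaults to parameters that were already created before the optimizer settings were parsed (i.e. when the optimizer is defined after the layers). A minimal sketch of that pattern, assuming `cp` is the config parser module holding the `g_default_*` globals and that the sentinel values (1, 0.0) mark fields the user never set explicitly; the helper name is hypothetical:

    # Sketch only: copy parser defaults onto parameters created before the
    # optimizer settings were seen. `cp` stands for the config parser module.
    def apply_parser_defaults(model_config, cp):
        for parameter in model_config.parameters:
            # A sentinel value means the field was never set explicitly,
            # so the global default (if any) should take effect.
            if parameter.num_batches_regularization == 1 and \
                    cp.g_default_num_batches_regularization:
                parameter.num_batches_regularization = \
                    cp.g_default_num_batches_regularization
            if parameter.gradient_clipping_threshold == 0.0 and \
                    cp.g_default_gradient_clipping_threshold:
                parameter.gradient_clipping_threshold = \
                    cp.g_default_gradient_clipping_threshold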

# are defined after layers, or between layers.
# Must be called from trainer.__init__()
for parameter in self.__model_config__.parameters:
    print "####", parameter.decay_rate, cp.g_default_decay_rate
Contributor:
Is this print necessary? It looks like a debug print.

CHECK_LE((row + 1) * width_ * sizeof(real), preallocatedBuf_->getSize());
// CHECK_LE((row + 1) * width_ * sizeof(real),
// preallocatedBuf_->getSize());
CHECK_LE((row)*width_ * sizeof(real), preallocatedBuf_->getSize());
Contributor:
I am not sure what these lines are for. Were they accidentally added?

Contributor (author):
Sorry, that was for my local testing; I will revert this file.

@lcy-seso (Contributor) left a comment:
LGTM, thank you very much.

@typhoonzero typhoonzero merged commit 154e1d0 into PaddlePaddle:develop Oct 23, 2017
@typhoonzero typhoonzero deleted the fix_v2_optimizer_order branch December 22, 2017 05:45