
fix demo fit_a_line #410

Merged: 1 commit merged into PaddlePaddle:develop on Oct 18, 2017
Conversation

@putcn commented on Oct 18, 2017

Fixes #409.
With the original fc config, the default learning rate leads to gradient explosion. Fixed by adding learning_rate=1e-3.
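For context, a minimal sketch of how this works in the legacy paddle.v2 API: the learning_rate in paddle.attr.Param acts as a per-parameter multiplier on the optimizer's global rate. The optimizer settings below are assumptions for illustration, not taken from the demo:

import paddle.v2 as paddle

paddle.init(use_gpu=False, trainer_count=1)

# Assumed global optimizer settings, for illustration only.
optimizer = paddle.optimizer.Momentum(momentum=0, learning_rate=0.1)

# The UCI housing data in fit_a_line has 13 features.
x = paddle.layer.data(name='x', type=paddle.data_type.dense_vector(13))

# The per-parameter learning_rate multiplies the optimizer's global
# rate, so these weights update at an effective 0.1 * 1e-3 = 1e-4,
# small enough to avoid the exploding gradients.
y_predict = paddle.layer.fc(
    input=x,
    size=1,
    act=paddle.activation.Linear(),
    param_attr=paddle.attr.Param(learning_rate=1e-3))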

@helinwang (Collaborator) left a comment

LGTM++

@helinwang merged commit f92c7ac into PaddlePaddle:develop on Oct 18, 2017
@putcn deleted the fix_fit_a_line branch on Oct 18, 2017 at 23:32
y_predict = paddle.layer.fc(
    input=x,
    size=1,
    act=paddle.activation.Linear(),
    param_attr=paddle.attr.Param(learning_rate=1e-3))
@typhoonzero (Collaborator) commented on the diff:
bias_attr also needs to be set. Anyway, this is due to the incorrect conversion from the old optimizer configuration to the new optimizer configuration of Paddle.
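A sketch of what that follow-up could look like; applying the same reduced rate to the bias is an assumption here, not code merged in this PR:

# Hypothetical follow-up: give the bias the same reduced learning
# rate as the weights (the 1e-3 for bias_attr is an assumed value).
y_predict = paddle.layer.fc(
    input=x,
    size=1,
    act=paddle.activation.Linear(),
    param_attr=paddle.attr.Param(learning_rate=1e-3),
    bias_attr=paddle.attr.Param(learning_rate=1e-3))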

@putcn (Author) replied:
Got it, thanks @typhoonzero. Will check and update later.

A collaborator replied:
> this is due to the incorrect conversion from the old optimizer configuration to the new optimizer configuration of Paddle.

@typhoonzero Can you create an issue with a link indicating the line number that has the problem? That would provide better documentation, so that if we run into this again in the future, we can look at the issue and know what happened. Thanks!
