Adding the interface for the momentum optimizer #4919

Merged
abhinavarora merged 4 commits into PaddlePaddle:develop from momentum_python on Oct 20, 2017

Conversation

abhinavarora (Contributor):

Fixes #4898 (Write Momentum Op Python Wrapper)

Args:
block: the block in which the loss variable is present
"""
pass
Member:

raise NotImplementedError()

@jacquesqiao (Member), Oct 19, 2017:

Just keep using pass, because sometimes the user does not need to implement this method.
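
(A minimal sketch of the two options under discussion; Optimizer and the method names here are placeholders, not code from this PR.)

class Optimizer(object):
    def optional_hook(self):
        # Option 1: a harmless no-op default; subclasses that do not
        # need this step can simply leave it alone.
        pass

    def required_hook(self):
        # Option 2: fail loudly, so every subclass must override this.
        raise NotImplementedError()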

pass
# Dictionary of accumulators.
# {accum_name : { parameter_name : accumulator_for_parameter, ...}, ...}
self._accumulators = defaultdict(lambda: dict())
Member:

Should add a comment explaining what an accumulator is.

abhinavarora (Author):

Fixed in latest commit
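
(To make the accumulator idea concrete: an accumulator is an auxiliary per-parameter variable the optimizer carries across update steps, such as momentum's velocity. A minimal sketch; the helper _get_accumulator is an assumed name, not necessarily this PR's API.)

from collections import defaultdict

class Optimizer(object):
    def __init__(self):
        # Outer key: accumulator name (e.g. "velocity");
        # inner key: parameter name -> that parameter's accumulator.
        self._accumulators = defaultdict(lambda: dict())

    def _get_accumulator(self, name, param_name):
        # Hypothetical helper: fetch one parameter's accumulator.
        return self._accumulators[name][param_name]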

"""
Create and add gradient Operators in BlockDesc to compute gradients of `loss`
for parameters in parameter_list
pass
Member:

raise NotImplementedError()

Member:

same as above
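
(For context, a hypothetical shape of the method this docstring belongs to; the name create_backward_pass and its parameters are assumptions, not taken from the diff.)

def create_backward_pass(self, loss, parameter_list=None):
    """Create and add gradient Operators in BlockDesc to compute
    gradients of `loss` for the parameters in parameter_list.
    """
    # Per the discussion above, the base class keeps a no-op default.
    pass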

class MomentumOptimizer(Optimizer):
"""Simple Momentum optimizer with velocity state
"""
_velocity_acc_str = "velocity"
Member:

I am not familiar with this usage of Python. Should we put this value into

def __init__(self):
    self._velocity_acc_str = "velocity"

Not sure which is better.

abhinavarora (Author):

This way _velocity_acc_str is like a static member of the class MomentumOptimizer. However, doing self._velocity_acc_str = "velocity" in __init__ would make it an instance member. Since this is a class-wide constant, I preferred to make it a static constant.
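
(A tiny illustration of the class-attribute versus instance-attribute distinction, using a throwaway class:)

class A(object):
    tag = "shared"             # class attribute: one value for the class

a, b = A(), A()
assert a.tag is b.tag          # both instances resolve to the class attribute
a.tag = "mine"                 # assignment creates an instance attribute on a
assert b.tag == "shared"       # b and the class itself are unaffected
assert A.tag == "shared"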

Member:

ok
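
(For reference, the velocity accumulator stores the running update direction. A NumPy sketch of the classic heavy-ball momentum update, not the Paddle op itself:)

import numpy as np

def momentum_step(param, grad, velocity, lr=0.01, mu=0.9):
    # The velocity is a decayed running sum of gradients; the parameter
    # then moves along the accumulated direction.
    velocity = mu * velocity + grad
    param = param - lr * velocity
    return param, velocity

p, v = np.zeros(3), np.zeros(3)
p, v = momentum_step(p, np.ones(3), v)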

@jacquesqiao (Member) left a comment:

Great job, LGTM!

abhinavarora merged commit 0e31d7d into PaddlePaddle:develop on Oct 20, 2017
abhinavarora deleted the momentum_python branch on October 20, 2017 04:05