new backward #6741

Merged · 20 commits merged into PaddlePaddle:develop from dev_new_backward on Dec 27, 2017
Conversation

@JiayiFeng JiayiFeng (Collaborator) commented Dec 19, 2017

fixes #6600

return name + core.grad_var_suffix()


def _append_backward_ops_(target,
Contributor

This function is too long to read.

@JiayiFeng JiayiFeng (Collaborator, Author) commented Dec 26, 2017

Yes. It's only a draft. I'm refactoring it.
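
For readers without the full diff context: the first excerpt above derives a gradient variable's name by appending a fixed suffix to the forward variable's name. A minimal sketch of that convention, assuming the commonly used "@GRAD" value for core.grad_var_suffix() (the helper name here is illustrative):

```python
# Minimal sketch of the gradient-name convention shown in the diff excerpt
# above. The "@GRAD" suffix value and the helper name _append_grad_suffix_
# are assumptions for illustration; the real suffix comes from
# core.grad_var_suffix().

GRAD_VAR_SUFFIX = "@GRAD"  # assumed return value of core.grad_var_suffix()


def _append_grad_suffix_(name):
    """Derive a gradient variable's name from its forward variable's name."""
    return name + GRAD_VAR_SUFFIX


assert _append_grad_suffix_("fc_0.w_0") == "fc_0.w_0@GRAD"
```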

@reyoung reyoung (Collaborator) left a comment

LGTM, let's merge this for now.

@JiayiFeng JiayiFeng merged commit b775b6c into PaddlePaddle:develop Dec 27, 2017
@JiayiFeng JiayiFeng changed the title [WIP] new backward new backward Dec 28, 2017

@emailweixu (Collaborator)

Does this mean that the previous append_backward implemented in C++ is no longer needed?
A related question: if we later decide to add a different language binding, will this logic have to be re-implemented in that language?

@JiayiFeng JiayiFeng deleted the dev_new_backward branch January 30, 2018 05:59

@JiayiFeng JiayiFeng (Collaborator, Author) commented Jan 30, 2018

Sorry for the delayed reply.

Yes, the code in C++ is no longer needed.
Yes, this logic has to be re-implemented if we decide to have another language binding.

The purpose of this refactor is to make backward support callback functions, which are required by error clipping and by gradient reduction in distributed training.
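
To make the callback motivation concrete, here is a minimal sketch of what a callback-aware backward builder could look like; the op model and every name in it (Op, append_backward, error_clip_callback) are simplified assumptions for illustration, not Paddle's actual API:

```python
# Minimal sketch of a callback-aware backward pass. The op model and the
# names below are simplified assumptions, not Paddle's actual API.

class Op:
    def __init__(self, type, inputs, outputs):
        self.type, self.inputs, self.outputs = type, inputs, outputs


def append_backward(forward_ops, callbacks=None):
    """Append a gradient op for every forward op, in reverse order.

    Each callback is invoked once per appended gradient op; this is the
    hook that error clipping and distributed gradient reduction can use
    to insert extra ops (clip, all-reduce, ...) right after each grad op.
    """
    callbacks = callbacks or []
    backward_ops = []
    for fwd_op in reversed(forward_ops):
        grad_op = Op(
            type=fwd_op.type + "_grad",
            inputs=[out + "@GRAD" for out in fwd_op.outputs],
            outputs=[inp + "@GRAD" for inp in fwd_op.inputs],
        )
        backward_ops.append(grad_op)
        for callback in callbacks:
            callback(grad_op)
    return backward_ops


def error_clip_callback(grad_op):
    # Stand-in for an error-clip pass: report where a clip op would go.
    print("would clip outputs of", grad_op.type, "->", grad_op.outputs)


fwd = [Op("mul", ["x", "w"], ["y"]), Op("relu", ["y"], ["z"])]
append_backward(fwd, callbacks=[error_clip_callback])
```

In this shape, error clipping and the gradient reduction needed for distributed training are just different callbacks passed to the same backward builder.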


Successfully merging this pull request may close these issues: Problems in current backward implementation (#6600)

4 participants