
Update the demo code and the doc of varbase.backward. #26506

Merged

merged 6 commits from the fix_backward_doc branch into PaddlePaddle:develop on Aug 27, 2020

Conversation

@wzzju (Contributor) commented on Aug 20, 2020

PR types

Others

PR changes

Docs

Describe

Update the demo code and the doc of varbase.backward. There is a corresponding FluidDoc PR for the backward documentation.

Variable.backward
[screenshot of the rendered Variable.backward documentation]

paddle.grad
[screenshot of the rendered paddle.grad documentation (documentation/docs_en/api/dygraph/grad.html)]
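
For readers without the screenshots, the sketch below shows roughly how the two documented APIs are exercised in dygraph mode. It is illustrative only and assumes the 2.0-style entry points of that period (paddle.disable_static, paddle.to_tensor, paddle.sum, paddle.grad); it is not the merged demo code.

import numpy as np
import paddle

paddle.disable_static()  # dygraph (imperative) mode

x = paddle.to_tensor(np.ones([2, 2], np.float32))
x.stop_gradient = False

# Tensor.backward(): run autograd from a scalar loss and accumulate
# the gradients onto the leaf tensors.
loss = paddle.sum(paddle.scale(x, scale=3.0))
loss.backward()
print(x.gradient())  # every entry is 3.0

# paddle.grad(): return the gradients instead of accumulating them.
x.clear_gradient()
loss = paddle.sum(paddle.scale(x, scale=3.0))
grads = paddle.grad(outputs=loss, inputs=x)
print(grads[0])      # same values as above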

@paddle-bot-old commented:

Thanks for your contribution!
Please wait for the result of CI first. See the Paddle CI Manual for details.

@wzzju force-pushed the fix_backward_doc branch 2 times, most recently from 7a5e807 to 2990d29 on August 21, 2020 07:15
@@ -513,28 +513,30 @@ void BindImperative(py::module *m_ptr) {
Attribute:
**sort_sum_gradient**:

If framework will sum the gradient by the reverse order of trace. eg. x_var ( :ref:`api_guide_Variable` ) will be the input of multiple OP such as :ref:`api_fluid_layers_scale` , this attr will decide if framework will sum gradient of `x_var` by the reverse order.
If framework will sum gradients by the reverse order of the forward execution sequence.
Member commented:

"If True, framework will ..."

Otherwise this sentence doesn't make sense.

@wzzju replied:

Done.

@@ -513,28 +513,30 @@ void BindImperative(py::module *m_ptr) {
Attribute:
**sort_sum_gradient**:

If framework will sum the gradient by the reverse order of trace. eg. x_var ( :ref:`api_guide_Variable` ) will be the input of multiple OP such as :ref:`api_fluid_layers_scale` , this attr will decide if framework will sum gradient of `x_var` by the reverse order.
If framework will sum gradients by the reverse order of the forward execution sequence.
For instance, x_var ( :ref:`api_guide_Variable` ) will be the input of multiple OP such as :ref:`api_fluid_layers_scale` ,
Member commented:

"For instane, when/if x_var ( :ref:api_guide_Variable ) is the input of multiple OPs such as :ref:api_fluid_layers_scale, whether the framework sums the gradient of x_var by the reverse order is determined by this attr".

There are two grammar points here:

  1. For conditional statements, or things that happen commonly, we don't use the future tense. For example: "if x_var is XXX", not "if x_var will be XXX".

  2. "this attr will decide XXX" is fine to me, but I feel my wording is better.

@wzzju replied on Aug 21, 2020:

Thanks. Done.
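
To make the attribute concrete, the scenario the rewritten sentence describes looks roughly like the sketch below: one tensor feeding several scale OPs, whose partial gradients get summed during backward. The paddle.sums call and the fluid.dygraph.BackwardStrategy usage mirror the diff in this PR; everything else (variable names, the final paddle.sum reduction) is illustrative.

import numpy as np
import paddle
import paddle.fluid as fluid

paddle.disable_static()

x_var = paddle.to_tensor(np.ones([2, 2], np.float32))
x_var.stop_gradient = False

# x_var is the input of ten scale OPs, so its gradient is the sum of
# the partial gradients coming back from each of them.
sums_inputs = [paddle.scale(x_var) for _ in range(10)]
loss = paddle.sum(paddle.sums(sums_inputs))

# sort_sum_gradient controls whether that sum is accumulated strictly
# in the reverse order of the forward execution sequence.
strategy = fluid.dygraph.BackwardStrategy()
strategy.sort_sum_gradient = True
loss.backward(strategy)

print(x_var.gradient())  # every entry is 10.0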

# x_var will be multi-scales' input here
for _ in range(10):
    sums_inputs.append(paddle.scale(x_var))
ret2 = paddle.sums(sums_inputs)
Member commented:

Why are they ret2, loss2 when there is no ret1, loss1? Can we just write ret, loss?

@wzzju replied:

Done.

backward_strategy(BackwardStrategy, optional): The backward strategy to run the backward pass. If you would
like to sum gradients by the reverse order of the forward execution sequence, please set the value of
:code:`BackwardStrategy.sort_sum_gradient` to True. Refer to :ref:`api_fluid_dygraph_BackwardStrategy` for
more details. Defaults to None, which meant that the value of :code:`BackwardStrategy.sort_sum_gradient` is False.
Member commented:

"which meant" -> "which means"

@wzzju replied:

Done.

backward_strategy = fluid.dygraph.BackwardStrategy()
backward_strategy.sort_sum_gradient = True
loss2.backward(backward_strategy)
inputs2 = []
Member commented:

Same as the comment above: why the trailing 2 on these variables?

@wzzju replied:

Done.

backward_strategy = fluid.dygraph.BackwardStrategy()
backward_strategy.sort_sum_gradient = True
loss2.backward(backward_strategy)
x_var = paddle.to_variable(x)
Contributor commented:

paddle.to_variable->paddle.to_tensor

@wzzju replied:

Done.
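
The change requested here is only the entry-point rename; a minimal before/after sketch (assuming the 2.0-style paddle.to_tensor, which supersedes paddle.to_variable):

import numpy as np
import paddle

paddle.disable_static()
x = np.ones([2, 2], np.float32)

# x_var = paddle.to_variable(x)   # spelling used in the draft demo
x_var = paddle.to_tensor(x)       # spelling requested in the review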

@@ -134,38 +134,41 @@ def backward(self, backward_strategy=None, retain_graph=False):
**Notes**:
**This API is ONLY available in Dygraph mode**

Run backward of current Graph which starts from current Variable
Run backward of current Graph which starts from current Variable.

Contributor commented:

Variable->Tensor

@wzzju replied:

Done.

loss2.backward(backward_strategy)
inputs2 = []
for _ in range(10):
    tmp = paddle.to_variable(x)
Contributor commented:

paddle.to_variable->paddle.to_tensor

@wzzju replied on Aug 21, 2020:

Done.
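
Putting the review comments together (paddle.to_tensor instead of paddle.to_variable, no trailing 2 in variable names, the clarified sort_sum_gradient wording), the updated demo presumably reads roughly like the sketch below. It is reconstructed from the diff fragments in this conversation, not copied from the merged file, and the paddle.sum reduction is an assumption.

import numpy as np
import paddle
import paddle.fluid as fluid

paddle.disable_static()

x = np.ones([2, 2], np.float32)
inputs = []
for _ in range(10):
    tmp = paddle.to_tensor(x)   # was paddle.to_variable(x)
    tmp.stop_gradient = False
    inputs.append(tmp)

ret = paddle.sums(inputs)       # was ret2
loss = paddle.sum(ret)          # was loss2

backward_strategy = fluid.dygraph.BackwardStrategy()
backward_strategy.sort_sum_gradient = True
loss.backward(backward_strategy)

print(loss.gradient())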

@TCChenlong (Contributor) previously approved these changes on Aug 21, 2020 and left a comment:
LGTM

@XiaoguangHu01 (Contributor) left a comment:

LGTM

@phlrain self-requested a review on August 27, 2020 03:44
@TCChenlong (Contributor) left a comment:

LGTM

@raindrops2sea (Collaborator) left a comment:

LGTM

@wzzju merged commit f9066e6 into PaddlePaddle:develop on Aug 27, 2020