
fix paddle.nn.loss.L1Loss OP, add paddle.nn.functional.l1_loss OP for API2.0, test=develop #26040

Merged: 2 commits into PaddlePaddle:develop, Aug 12, 2020

Conversation

@LutaoChu (Contributor) commented Aug 7, 2020

PR types

New features

PR changes

APIs

Describe

fix paddle.nn.loss.L1Loss OP, add paddle.nn.functional.l1_loss OP for API2.0
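For context, a minimal usage sketch of the functional API this PR adds (written against the public Paddle 2.x interface; the sample tensors and the printed values are illustrative assumptions, not taken from the PR diff):

import paddle
import paddle.nn.functional as F

# small toy inputs, purely illustrative
x = paddle.to_tensor([[1.5, 0.8], [0.2, 1.3]])
label = paddle.to_tensor([[1.7, 1.0], [0.2, 0.5]])

mean_loss = F.l1_loss(x, label)                    # reduction='mean' (default)
none_loss = F.l1_loss(x, label, reduction='none')  # elementwise |x - label|
print(mean_loss.numpy(), none_loss.numpy())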

paddle-bot-old (bot) commented Aug 7, 2020

Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

paddle-bot-old (bot) commented Aug 7, 2020

✅ This PR's description meets the template requirements!
Please wait for other CI results.

This operator computes the L1 Loss of Tensor ``x`` and ``label`` as follows.

If :attr:`reduction` set to ``'none'``, the loss is:

Contributor:

Should there be one more blank line here? Both the 'none' and 'mean' cases have a blank line below them.

Contributor Author:

done
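As a quick numeric illustration of the L1 loss the quoted docstring describes, under each reduction setting (plain NumPy, not the operator under review; the sample values are made up):

import numpy as np

x = np.array([[1.5, 0.8], [0.2, 1.3]])
label = np.array([[1.7, 1.0], [0.2, 0.5]])

unreduced = np.abs(x - label)   # reduction == 'none': elementwise L1 loss
print(unreduced)                # [[0.2 0.2] [0.  0.8]]
print(unreduced.mean())         # reduction == 'mean' -> ~0.3
print(unreduced.sum())          # reduction == 'sum'  -> ~1.2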


if reduction == 'sum':
    unreduced = paddle.elementwise_sub(x, label, act='abs')
    return paddle.reduce_sum(unreduced, name=name)
Contributor:

In 2.0, suggest changing paddle.reduce_sum to paddle.sum and paddle.reduce_mean to paddle.mean.

Contributor Author:

done
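Applying that suggestion, the quoted static-graph branch would look roughly like this (a sketch only; the wrapper function name is hypothetical, and paddle.elementwise_sub with act='abs' is kept as in the quoted 2.0-era code):

import paddle

def l1_loss_static_sketch(x, label, reduction='mean', name=None):
    # elementwise |x - label|, as in the quoted branch
    unreduced = paddle.elementwise_sub(x, label, act='abs')
    if reduction == 'sum':
        return paddle.sum(unreduced, name=name)    # was paddle.reduce_sum
    elif reduction == 'mean':
        return paddle.mean(unreduced, name=name)   # was paddle.reduce_mean
    return unreduced                               # reduction == 'none'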


elif reduction == 'mean':
    unreduced = paddle.elementwise_sub(x, label, act='abs')
    return paddle.reduce_mean(unreduced, name=name)
else:
Contributor:

In dygraph mode, core.ops should be used.

Contributor Author:

done

output = l1_loss(input, label)
print(output.numpy())
"""
if reduction not in ['sum', 'mean', 'none']:
Contributor:

The code at the very beginning is the dygraph path; use core.ops there.

Contributor Author:

done
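Following the two review comments above, the dygraph fast path would typically call the C++ operator bindings directly instead of going through the layer-helper machinery. A rough sketch of that pattern, assuming the 2.0-era core.ops bindings (the helper name, the abs/elementwise_sub split, and the flattened attribute arguments are assumptions; the merged code may differ):

from paddle.fluid import core
from paddle.fluid.framework import in_dygraph_mode

def _l1_loss_dygraph_sketch(x, label, reduction='mean'):
    assert in_dygraph_mode()
    # elementwise |x - label| via the raw C++ op bindings
    unreduced = core.ops.abs(core.ops.elementwise_sub(x, label))
    if reduction == 'mean':
        return core.ops.mean(unreduced)
    elif reduction == 'sum':
        # attrs are passed as flattened name/value pairs to core.ops calls
        return core.ops.reduce_sum(unreduced, 'reduce_all', True)
    return unreduced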

@LutaoChu force-pushed the l1loss-op branch 5 times, most recently from 2c8fc55 to 23214ac on August 10, 2020 03:32
yaoxuefeng6 previously approved these changes Aug 10, 2020

@yaoxuefeng6 (Contributor) left a comment:

LGTM

If :attr:`reduction` set to ``'mean'``, the reduced mean loss is:
    Out = MEAN(\lvert x - label\rvert)

If :attr:`reduction` set to ``'sum'``, the loss is:
Collaborator:

There should be a blank line before the formula below this line.

Contributor Author:

done

@@ -279,63 +275,53 @@ class L1Loss(fluid.dygraph.Layer):
If :attr:`reduction` is ``'mean'``, the reduced mean loss is returned.
If :attr:`reduction` is ``'sum'``, the reduced sum loss is returned.
Default is ``'mean'``.
Returns:
A callable object of L1Loss.
Collaborator:

It looks like there should also be a blank line between each section.

@LutaoChu (Contributor Author) commented Aug 11, 2020:

done
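For reference, the layer form documented in this hunk is used along these lines (a minimal sketch against the current paddle.nn.L1Loss name; at the time of this PR the class lived at paddle.nn.loss.L1Loss, and the sample tensors are made up):

import paddle

x = paddle.to_tensor([[1.5, 0.8], [0.2, 1.3]])
label = paddle.to_tensor([[1.7, 1.0], [0.2, 0.5]])

l1_loss = paddle.nn.L1Loss(reduction='mean')  # "a callable object of L1Loss"
loss = l1_loss(x, label)
print(loss.numpy())                           # ~0.3 for these inputs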

@wawltor (Contributor) left a comment:

LGTM

@yaoxuefeng6 (Contributor) left a comment:

LGTM

@@ -279,63 +276,55 @@ class L1Loss(fluid.dygraph.Layer):
If :attr:`reduction` is ``'mean'``, the reduced mean loss is returned.
Contributor:

Suggest changing "If :attr:`reduction` is 'mean'" to "If reduction is 'mean'".

Contributor Author:

After checking, both render the same way.

@TCChenlong (Contributor) left a comment:

LGTM

@wawltor wawltor merged commit 1d870c4 into PaddlePaddle:develop Aug 12, 2020