【PaddlePaddle Hackathon 2】16、doc for rrelu #4725

Merged
merged 7 commits on May 31, 2022
2 changes: 2 additions & 0 deletions docs/api/paddle/nn/Overview_cn.rst
@@ -127,6 +127,7 @@ Padding层
" :ref:`paddle.nn.LogSoftmax <cn_api_nn_LogSoftmax>` ", "LogSoftmax激活层"
" :ref:`paddle.nn.Maxout <cn_api_nn_Maxout>` ", "Maxout激活层"
" :ref:`paddle.nn.PReLU <cn_api_nn_PReLU>` ", "PReLU激活层"
" :ref:`paddle.nn.RReLU <cn_api_nn_RReLU>` ", "RReLU激活层"
" :ref:`paddle.nn.ReLU <cn_api_nn_ReLU>` ", "ReLU激活层"
" :ref:`paddle.nn.ReLU6 <cn_api_nn_ReLU6>` ", "ReLU6激活层"
" :ref:`paddle.nn.SELU <cn_api_nn_SELU>` ", "SELU激活层"
@@ -381,6 +382,7 @@ Padding相关函数
" :ref:`paddle.nn.functional.log_softmax <cn_api_nn_cn_log_softmax>` ", "log_softmax激活函数"
" :ref:`paddle.nn.functional.maxout <cn_api_nn_cn_maxout>` ", "maxout激活函数"
" :ref:`paddle.nn.functional.prelu <cn_api_nn_cn_prelu>` ", "prelu激活函数"
" :ref:`paddle.nn.functional.rrelu <cn_api_nn_cn_rrelu>` ", "rrelu激活函数"
" :ref:`paddle.nn.functional.relu <cn_api_nn_cn_relu>` ", "relu激活函数"
" :ref:`paddle.nn.functional.relu_ <cn_api_nn_cn_relu_>` ", "Inplace 版本的 :ref:`cn_api_nn_cn_relu` API,对输入 x 采用 Inplace 策略"
" :ref:`paddle.nn.functional.relu6 <cn_api_nn_cn_relu6>` ", "relu6激活函数"
51 changes: 51 additions & 0 deletions docs/api/paddle/nn/RReLU_cn.rst
@@ -0,0 +1,51 @@
.. _cn_api_nn_RReLU:

RReLU
-------------------------------
.. py:class:: paddle.nn.RReLU(lower=1./8., upper=1./3., name=None)

RReLU activation layer, which applies the randomized rectified linear unit to the neuron activations. Reference paper:
`Empirical Evaluation of Rectified Activations in Convolutional Network <https://arxiv.org/abs/1505.00853>`_ 。

During training, the negative slope is randomly sampled from a uniform distribution:

.. math::

    rrelu(x)=
        \begin{cases}
        x, & \text{if } x \geq 0 \\
        a * x, & \text{otherwise}
        \end{cases}

where :math:`x` is the input Tensor and :math:`a` is a random value drawn from the uniform distribution over (:math:`lower`, :math:`upper`).

During inference, the negative slope is the average of the uniform distribution's bounds (:math:`lower` and :math:`upper`):

.. math::

    rrelu(x)=
        \begin{cases}
        x, & \text{if } x \geq 0 \\
        (lower + upper) * 0.5 * x, & \text{otherwise}
        \end{cases}

where :math:`x` is the input Tensor, and :math:`lower` and :math:`upper` are the lower and upper bounds of the uniform distribution.
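The training-mode rule above can be sketched in plain Python (an illustrative sketch only, not Paddle's implementation; the helper name ``rrelu_train`` is hypothetical): each negative element is scaled by its own slope drawn from the uniform distribution over (lower, upper), while non-negative elements pass through unchanged.

```python
import random

def rrelu_train(xs, lower=1.0 / 8.0, upper=1.0 / 3.0, rng=None):
    """Training-mode RReLU sketch: each negative element gets a
    fresh slope a drawn uniformly from (lower, upper)."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    out = []
    for x in xs:
        if x >= 0:
            out.append(x)           # non-negative values pass through
        else:
            a = rng.uniform(lower, upper)
            out.append(a * x)       # negative values are scaled by a
    return out

y = rrelu_train([-2.0, 0.0, 3.0])
# y[1] and y[2] pass through unchanged; y[0] = a * (-2.0) lies in (-2/3, -1/4)
```

Because a fresh slope is sampled per element and per call, repeated calls on the same input generally produce different outputs during training, which is the regularizing effect the paper describes.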

Parameters
::::::::::
- **lower** (float, optional) - Lower bound of the random negative-slope range; ``lower`` is included in the range. Supported data type: float. Default: 0.125.
Collaborator: Parameter names need to be bolded; for reference:

  • **lower** (float, optional) - Lower bound of the random negative-slope range; lower is included in the range. Supported data type: float. Default: 0.125.

- **upper** (float, optional) - Upper bound of the random negative-slope range; ``upper`` is included in the range. Supported data type: float. Default: 0.333.
- **name** (str, optional) - Name of the operation (optional; default: None). For details, see :ref:`api_guide_Name`.

Shape
::::::::::
- **x** (Tensor) – Tensor of arbitrary shape; the default data type is float32.
- **out** (Tensor) – Tensor with the same shape as x.

Code example
:::::::::
COPY-FROM: paddle.nn.RReLU:RReLU-example
53 changes: 53 additions & 0 deletions docs/api/paddle/nn/functional/rrelu_cn.rst
@@ -0,0 +1,53 @@
.. _cn_api_nn_cn_rrelu:

rrelu
-------------------------------

.. py:function:: paddle.nn.functional.rrelu(x, lower=1. / 8., upper=1. / 3., training=True, name=None)

rrelu activation function, which applies the randomized rectified linear unit to the neuron activations. Reference paper:
`Empirical Evaluation of Rectified Activations in Convolutional Network <https://arxiv.org/abs/1505.00853>`_ 。

During training, the negative slope is randomly sampled from a uniform distribution:

.. math::

    rrelu(x)=
        \begin{cases}
        x, & \text{if } x \geq 0 \\
        a * x, & \text{otherwise}
        \end{cases}

where :math:`x` is the input Tensor and :math:`a` is a random value drawn from the uniform distribution over (:math:`lower`, :math:`upper`).

During inference, the negative slope is the average of the uniform distribution's bounds (:math:`lower` and :math:`upper`):

.. math::

    rrelu(x)=
        \begin{cases}
        x, & \text{if } x \geq 0 \\
        (lower + upper) * 0.5 * x, & \text{otherwise}
        \end{cases}

where :math:`x` is the input Tensor, and :math:`lower` and :math:`upper` are the lower and upper bounds of the uniform distribution.
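As a numeric check of the inference-mode rule (a minimal sketch, assuming nothing about Paddle's internals; the helper name ``rrelu_eval`` is hypothetical): negative values are scaled by the single fixed slope (lower + upper) / 2, so inference is deterministic.

```python
def rrelu_eval(xs, lower=1.0 / 8.0, upper=1.0 / 3.0):
    """Inference-mode RReLU sketch: the slope is the mean of the
    uniform distribution's bounds (lower + upper) / 2."""
    slope = (lower + upper) * 0.5
    return [x if x >= 0 else slope * x for x in xs]

out = rrelu_eval([-4.0, 2.0])
# with the default bounds, slope = (1/8 + 1/3) / 2 = 11/48
```

Using the mean slope at inference matches the expected value of the random slope sampled during training, so the layer's average behavior is preserved between the two phases.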

Collaborator: The Chinese and English docs need to stay consistent; the English version attaches the reference paper.

Contributor (author): Done.

Parameters
::::::::::
Collaborator: Is the description of x missing from the parameters? The source code has 5 parameters; see: https://github.com/PaddlePaddle/docs/blob/develop/docs/templates/common_docs.py#L40

Contributor (author): Done.

- **x** (Tensor) - The input ``Tensor``; supported data types: float16, float32, float64.
- **lower** (float, optional) - Lower bound of the random negative-slope range; ``lower`` is included in the range. Supported data type: float. Default: 0.125.
- **upper** (float, optional) - Upper bound of the random negative-slope range; ``upper`` is included in the range. Supported data type: float. Default: 0.333.
- **training** (bool, optional) - Whether this is the training phase. Default: True.
- **name** (str, optional) - Name of the operation (optional; default: None). For details, see :ref:`api_guide_Name`.

Returns
::::::::::
``Tensor``, with the same data type and shape as ``x``.

Code example
:::::::::
COPY-FROM: paddle.nn.functional.rrelu:rrelu-example