modify doc for paddle.nn.Layer #27624
Conversation
Thanks for your contribution!
@@ -62,10 +62,6 @@ def remove(self):

class Layer(core.Layer):
    """
    :alias_main: paddle.nn.Layer
Why is this alias being removed?
This is required by the 2.0 API documentation guidelines. They said an auto-generation tool will handle the aliases, so we were asked to delete it here.
layer_list = list(model.children())
print(layer_list)
It would be better to add an explanatory comment here.
done,thx!
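For reference, a minimal sketch of what a commented children() example could look like, assuming the 2.0-style paddle.nn.Sequential and paddle.nn.Linear APIs (the exact example committed in the PR may differ):

```python
import paddle

linear1 = paddle.nn.Linear(10, 3)
linear2 = paddle.nn.Linear(3, 10, bias_attr=False)
model = paddle.nn.Sequential(linear1, linear2)

# children() yields the immediate sublayers of the model,
# here the two Linear layers in registration order.
layer_list = list(model.children())
print(layer_list)
```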
fc1 = paddle.nn.Linear(10, 3)
model = fluid.dygraph.Sequential(fc1, fc2)
layer_list = list(model.children())
The variable would be better named linear1, since the class it instantiates is Linear.
done,thx!
    'alpha': 1,
}
tmp = self.create_variable(name="linear_tmp_0", dtype=self._dtype)
paddle.fluid.default_main_program().current_block().append_op(
Do not use append_op in the example; ordinary users do not need to know about that usage.
The example has been rewritten as a whole.
self.weight = self.create_parameter(
    shape=[in_features, out_features],
    is_bias=False)
self.bias = self.create_parameter(
forward does not use bias, so bias can be left out.
The example has been rewritten as a whole.
Example::

    .. code-block:: python
This train example could use the same MyLayer as the eval example below.
done,thx!
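For illustration, a hedged sketch of a single MyLayer shared by the train and eval examples; the layer composition and shapes are assumptions, not necessarily what the final docstring uses:

```python
import paddle

class MyLayer(paddle.nn.Layer):
    def __init__(self):
        super(MyLayer, self).__init__()
        self._linear = paddle.nn.Linear(1, 1)
        self._dropout = paddle.nn.Dropout(p=0.5)

    def forward(self, input):
        temp = self._linear(input)
        temp = self._dropout(temp)
        return temp

x = paddle.randn([10, 1], 'float32')
mylayer = MyLayer()

mylayer.train()   # dropout is active in train mode
out_train = mylayer(x)

mylayer.eval()    # dropout acts as identity in eval mode
out_eval = mylayer(x)
```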
inputs = {'X': [input], 'Y': [self.weight]}
attrs = {
    'transpose_X': False,
    'transpose_Y': False,
In this forward you can just write paddle.matmul(input, self.weight) directly, without appending an op.
The example has been rewritten as a whole.
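As a sketch of the direction suggested above (a custom layer whose forward calls paddle.matmul instead of appending an op; names and shapes are assumed, and the concrete example committed in the PR may differ):

```python
import paddle

class MyLinear(paddle.nn.Layer):
    def __init__(self, in_features, out_features):
        super(MyLinear, self).__init__()
        # create_parameter registers a trainable weight on the layer.
        self.weight = self.create_parameter(
            shape=[in_features, out_features],
            dtype='float32',
            is_bias=False)

    def forward(self, input):
        # Plain functional API instead of append_op on the main program.
        return paddle.matmul(input, self.weight)

mylinear = MyLinear(10, 3)
x = paddle.randn([4, 10], 'float32')
out = mylinear(x)   # shape [4, 3]
```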
import paddle

linear = paddle.nn.Linear(1, 1)
This add_parameter call should be written inside a class; it is not recommended to use it directly outside a class.
done,thx!
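A minimal sketch of add_parameter used from inside a Layer subclass, as requested (names are assumed and not necessarily the exact example added in the PR):

```python
import paddle

class MyLayer(paddle.nn.Layer):
    def __init__(self):
        super(MyLayer, self).__init__()
        self._linear = paddle.nn.Linear(1, 1)
        # Create an extra parameter and register it under the name "w_tmp".
        w_tmp = self.create_parameter(shape=[1, 1], dtype='float32')
        self.add_parameter("w_tmp", w_tmp)

    def forward(self, input):
        return self._linear(input)

mylayer = MyLayer()
print(mylayer.parameters())   # includes both the Linear weights and w_tmp
```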
state_dict = emb.state_dict()
fluid.save_dygraph(state_dict, "paddle_dy")
state_dict = emb.state_dict()
paddle.save(state_dict, "paddle_dy")
"paddle_dy" 建议改成 "paddle_dy.pdparams"
done,thx!
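A hedged sketch of the suggested naming, using paddle.save/paddle.load with the ".pdparams" suffix (the layer choice here is illustrative):

```python
import paddle

emb = paddle.nn.Embedding(10, 10)

# Save the parameters with an explicit ".pdparams" suffix.
state_dict = emb.state_dict()
paddle.save(state_dict, "paddle_dy.pdparams")

# Load them back into the layer later.
loaded_dict = paddle.load("paddle_dy.pdparams")
emb.set_state_dict(loaded_dict)
```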
def forward(self, input):
    return self._linear(input)

x = paddle.randn([10, 1], 'float32')
This line does not seem to be used.
done,thx!
LGTM
@@ -77,13 +73,13 @@ class Layer(core.Layer):
    dtype(str or core.VarDesc.VarType, optional): data type of this parameter.
        If set str, it can be "bool", "float16", "float32", "float64",
        "int8", "int16", "int32", "int64", "uint8" or "uint16".
        Default: ``core.VarDesc.VarType.FP32``
        Default: "float32"
Line 73: core.VarDesc.VarType should not be exposed to users.
done,thx!
paddle.disable_static()
Line 201: use paddle.full.
done,thx!
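The referenced line 201 is not shown here, but the suggested replacement pattern is roughly the following (shape and value are illustrative assumptions):

```python
import paddle

# paddle.full(shape, fill_value, dtype) builds a constant tensor and is the
# 2.0-style replacement for fluid.layers.fill_constant in doc examples.
value = paddle.full(shape=[2, 3], fill_value=1.0, dtype='float32')
print(value.numpy())
```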
@@ -305,6 +361,26 @@ def create_parameter(self,

    Returns:
        :ref:`api_guide_Variable_en` : created parameter.
- Line 353: :ref:`api_fluid_ParamAttr` should now be :ref:`api_paddle_ParamAttr`.
- Line 354: core.VarDesc.VarType should not be exposed, and "str" is written twice.
- Line 359: :ref:`api_fluid_initializer_XavierInitializer` and :ref:`api_fluid_initializer_ConstantInitializer` should be updated to the APIs under paddle.nn.initializer.
done,thx!
@@ -326,11 +402,32 @@ def create_variable(self,
    dtype(str or core.VarDesc.VarType, optional): data type of this parameter.
- core.VarDesc.VarType should not be exposed to users.
done,thx!
@@ -326,11 +402,32 @@ def create_variable(self,
    dtype(str or core.VarDesc.VarType, optional): data type of this parameter.
        If set str, it can be "bool", "float16", "float32", "float64",
        "int8", "int16", "int32", "int64", "uint8" or "uint16".
        If set None, it will be ``core.VarDesc.VarType.FP32``. Default: None
        If set None, it will be "float32". Default: None
    type(core.VarDesc.VarType, optional): type of the variable. No need to set this parameter. Default: ``core.VarDesc.VarType.LOD_TENSOR``
Can this parameter simply be removed from the parameter list? There is no LoDTensor in 2.0 for now.
Done. After discussing with 红雨, that parameter was removed from this part of the interface.
@@ -349,6 +446,15 @@ def parameters(self, include_sublayers=True):

    Returns:
        list of :ref:`api_guide_Variable_en` : a list of Parameters.
The "list of :ref:`api_guide_Variable_en` :" part should be removed for now.
done,thx!
@@ -536,16 +660,15 @@ def register_buffer(self, name, variable, persistable=True):
    .. code-block:: python

        import numpy as np
        import paddle.fluid as fluid
        import paddle
Line 641: should this say non-trainable parameters?
Done. Also changed all occurrences of Variable to Tensor.
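For context, a small sketch of how a buffer differs from a trainable parameter (names here are assumptions; the actual docstring example may differ):

```python
import paddle

linear = paddle.nn.Linear(10, 3)

# A buffer is a non-trainable tensor attached to the layer; when
# persistable=True it is also saved in state_dict().
global_step = paddle.to_tensor([0], dtype='int64')
linear.register_buffer("global_step", global_step, persistable=True)

print(linear.buffers())                        # includes the registered buffer
print("global_step" in linear.state_dict())    # True, because it is persistable
```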
@@ -585,6 +708,20 @@ def buffers(self, include_sublayers=True):

    Returns:
        list of :ref:`api_guide_Variable_en` : a list of buffers.
The "list of :ref:`api_guide_Variable_en` :" part should be removed here as well.
done,thx!
@@ -609,25 +746,24 @@ def named_buffers(self, prefix='', include_sublayers=True):
    .. code-block:: python

        import numpy as np
        import paddle.fluid as fluid
        import paddle
In the named_buffers API doc, should "variable" be called tensor or buffer? "variable" does not quite fit.
Done. All occurrences of Variable have been changed to Tensor.
    parameters=linear.parameters())
out = linear(a)
out.backward()
adam.minimize(out)
adam.step() is now the recommended way.
done,thx!
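A hedged sketch of the recommended 2.0-style update loop with adam.step() (tensor names and shapes are assumptions):

```python
import paddle

linear = paddle.nn.Linear(10, 3)
adam = paddle.optimizer.Adam(learning_rate=0.01,
                             parameters=linear.parameters())

x = paddle.randn([4, 10], 'float32')
loss = paddle.mean(linear(x))

loss.backward()
adam.step()         # apply the gradients (preferred over adam.minimize)
adam.clear_grad()   # reset gradients for the next iteration
```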
c80b84a
@@ -198,7 +198,7 @@ def apply(self, fn):
    def init_weights(layer):
        if type(layer) == nn.Linear:
            print('before init weight:', layer.weight.numpy())
            new_weight = paddle.fill_constant(layer.weight.shape, layer.weight.dtype, value=0.9)
            new_weight = paddle.fill(layer.weight.shape, layer.weight.dtype, value=0.9)
Paddle only has paddle.full; there is no paddle.fill.
Sorry, fixed.
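The corrected hook would look roughly like this; using set_value to write the new weight back is an assumption on my part, as that step is not shown in the diff:

```python
import paddle
import paddle.nn as nn

def init_weights(layer):
    if type(layer) == nn.Linear:
        print('before init weight:', layer.weight.numpy())
        # paddle.full, not paddle.fill, creates the constant tensor.
        new_weight = paddle.full(shape=layer.weight.shape,
                                 dtype=layer.weight.dtype,
                                 fill_value=0.9)
        layer.weight.set_value(new_weight)
        print('after init weight:', layer.weight.numpy())

net = nn.Sequential(nn.Linear(2, 2), nn.Linear(2, 2))
net.apply(init_weights)
```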
    If set str, it can be "bool", "float16", "float32", "float64",
        "int8", "int16", "int32", "int64", "uint8" or "uint16". Default: "float32".
    is_bias(bool, optional): if this is a bias parameter. Default: False.
    default_initializer(Initializer, optional): the default initializer for this parameter.
        If set None, default initializer will be set to :ref:`api_fluid_initializer_XavierInitializer` and :ref:`api_fluid_initializer_ConstantInitializer`
        If set None, default initializer will be set to :ref:`_api_paddle_fluid_initializer_Xavier` and :ref:`_api_paddle_fluid_initializer_Constant`
This was not fixed correctly.
After discussion, we will use paddle.nn.initializer.Constant and paddle.nn.initializer.Xavier for now.
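For example, a sketch of create_parameter with an explicit initializer from paddle.nn.initializer (using Constant here; a Xavier-style initializer would be analogous; names and shapes are assumptions):

```python
import paddle

class MyLayer(paddle.nn.Layer):
    def __init__(self):
        super(MyLayer, self).__init__()
        # default_initializer drawn from paddle.nn.initializer instead of
        # the fluid initializer classes referenced in the old docstring.
        self.w = self.create_parameter(
            shape=[2, 3],
            dtype='float32',
            default_initializer=paddle.nn.initializer.Constant(value=1.0))

    def forward(self, input):
        return paddle.matmul(input, self.w)

mylayer = MyLayer()
print(mylayer.w.numpy())   # all entries equal to 1.0
```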
lgtm
LGTM
PR types
Others
PR changes
Docs
Describe
Per the 2.0 API requirements, revise the code examples of paddle.nn.Layer.
Add code examples for the methods that lacked them. forward and backward have no standalone meaning, so no examples were added for them.
Change the default value of dtype in the constructor to "float32".
Preview: