
4.2.2 Not modified for PyTorch #40

Closed
zooltd opened this issue Oct 21, 2019 · 2 comments

Comments

zooltd (Contributor) commented Oct 21, 2019

No description provided.

zooltd (Contributor, Author) commented Oct 21, 2019

Original text:
"If we only want to initialize a particular parameter, we can call the initialize function of the Parameter class; its usage is the same as that of the initialize function provided by the Block class."
Could be revised to:
If we only want to initialize a particular parameter, then besides the parameter-initialization functions provided in torch.nn.init, we can also call the Tensor class's in-place fill methods (such as normal_(...), random_(...), etc.).
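A minimal sketch of both options, assuming a small Sequential net; the layer sizes and constants are only illustrative, not taken from the book:

```python
import torch
from torch import nn
from torch.nn import init

net = nn.Sequential(nn.Linear(4, 3), nn.ReLU(), nn.Linear(3, 1))

# Option 1: initialization functions from torch.nn.init
init.normal_(net[0].weight, mean=0.0, std=0.01)
init.constant_(net[0].bias, val=0.0)

# Option 2: the Tensor class's in-place fill methods (note the trailing underscore)
with torch.no_grad():  # keep these in-place writes out of autograd
    net[2].weight.normal_(mean=0.0, std=0.01)
    net[2].bias.fill_(0.0)
```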

zooltd changed the title 4.2.2 MXNet → 4.2.2 Modified for PyTorch on Oct 21, 2019
zooltd changed the title 4.2.2 Modified for PyTorch → 4.2.2 Not modified for PyTorch on Oct 21, 2019
ShusenTang added a commit that referenced this issue Oct 24, 2019
ShusenTang (Owner) commented

Original text:
"If we only want to initialize a particular parameter, we can call the initialize function of the Parameter class; its usage is the same as that of the initialize function provided by the Block class."
Could be revised to:
If we only want to initialize a particular parameter, then besides the parameter-initialization functions provided in torch.nn.init, we can also call the Tensor class's in-place fill methods (such as normal_(...), random_(...), etc.).

Section 4.2.3 already explains how to define custom initialization methods, so I simply deleted that passage.
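For reference, a rough, hypothetical sketch of what such a custom initialization can look like; the specific rule (uniform fill, then zeroing small values) is an illustration, not necessarily the book's exact code from 4.2.3:

```python
import torch
from torch import nn

def custom_init_(tensor):
    """Fill `tensor` in place with U(-10, 10) values, zeroing entries with |x| < 5."""
    with torch.no_grad():
        tensor.uniform_(-10, 10)
        tensor *= (tensor.abs() >= 5).float()

net = nn.Linear(4, 3)
for name, param in net.named_parameters():
    if 'weight' in name:
        custom_init_(param)
```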
