
Implement a bilinear initializer for transposed convolution to do upsampling. #11404

Merged
3 commits merged into PaddlePaddle:develop on Jun 15, 2018

Conversation

qingqing01
Contributor

Fix #11403

@qingqing01 qingqing01 changed the title Implement a bilinear initializer for transposed convolution. Implement a bilinear initializer for transposed convolution to do upsampling. Jun 12, 2018
     'init_on_cpu', 'ConstantInitializer', 'UniformInitializer',
-    'NormalInitializer', 'XavierInitializer'
+    'NormalInitializer', 'XavierInitializer', 'BilinearInitializer'
Contributor


Should the formatting here be changed to a single column?


Raises:
ValueError: If type of `var` and `block` is not right.
If the shape of `var` size is not 4 and
Contributor


and -> or ?

class BilinearInitializer(Initializer):
"""Implements the bilinear initializer.

    This initializer can be used in transposed convolution operator to
    do upsampling.
Contributor


Isn't BilinearInitializer tightly bound to the "upsampling with conv_transpose" use case here?
What if I want to apply bilinear initialization to an ordinary parameter?
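For context on what such an initializer computes: a bilinear upsampling kernel is the outer product of two 1-D triangular filters. Below is a minimal NumPy sketch of the standard construction (the function name `bilinear_kernel` is illustrative, not the API added in this PR):

```python
import numpy as np

def bilinear_kernel(size):
    """Build a size x size bilinear interpolation kernel.

    The kernel is the outer product of two 1-D triangle filters
    centered on the kernel, the standard weight used to initialize
    a transposed convolution for bilinear upsampling.
    """
    factor = (size + 1) // 2
    # Center falls on a pixel for odd sizes, between pixels for even sizes.
    if size % 2 == 1:
        center = factor - 1
    else:
        center = factor - 0.5
    og = np.ogrid[:size, :size]
    # Triangle filter in each axis; outer product via broadcasting.
    return ((1 - abs(og[0] - center) / factor) *
            (1 - abs(og[1] - center) / factor))
```

For example, for an upsampling factor of 2 one typically uses a 4x4 kernel (`kernel_size = 2 * factor - factor % 2`) with `stride = 2`; replicating this kernel across channels turns the transposed convolution into bilinear interpolation.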

@qingqing01 qingqing01 merged commit 566a940 into PaddlePaddle:develop Jun 15, 2018
@qingqing01 qingqing01 deleted the bilinear_init branch November 14, 2019 05:26