
ParametricRelU after pixelshuffler #57

Closed
mahmed0509 opened this issue Mar 20, 2019 · 2 comments

@mahmed0509

Hello,
In the Generator, I am trying to understand why we need parametric ReLU activations after each pixelshuffler layer. Also, what is the use of the final Conv layer in the generator? Could you please explain?

@brade31919
Owner

Hi @mahmed0509 ,

This repo is just a re-implementation (there is no official release from the authors), so I follow the settings from the paper.
For PReLU, I think the reason is simply that it is a more expressive activation function. I don't think you would get very different results by replacing it with ReLU or LeakyReLU (though LeakyReLU tends to be more suitable for networks with batchnorm).
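For reference, PReLU is just ReLU with a learnable slope on the negative side. A rough NumPy sketch (not this repo's code; in the actual network `alpha` is a trainable per-channel parameter):

```python
import numpy as np

def prelu(x, alpha):
    # Identity for x > 0; learned slope `alpha` for x <= 0.
    # alpha == 0 recovers ReLU; a small fixed alpha (e.g. 0.2)
    # recovers LeakyReLU.
    return np.maximum(x, 0.0) + alpha * np.minimum(x, 0.0)
```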

For the last convolution, I think it is just for refinement: it maps the upsampled feature maps back to a 3-channel image. You can remove it, but you will likely find that it makes the optimization process more difficult.
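To make the structure concrete, here is a minimal Keras-style sketch of the generator tail we are discussing. This is illustrative only; the filter counts and kernel sizes here are assumptions, not values copied from this repo:

```python
import tensorflow as tf
from tensorflow.keras import layers

def upsample_block(x, filters=64, scale=2):
    # Sub-pixel upsampling: a conv expands channels by scale^2, then
    # depth_to_space (the "pixelshuffler") rearranges channels into space.
    x = layers.Conv2D(filters * scale ** 2, 3, padding="same")(x)
    x = tf.nn.depth_to_space(x, scale)
    # PReLU after the shuffle, as in the SRGAN paper.
    return layers.PReLU(shared_axes=[1, 2])(x)

def generator_tail(x):
    x = upsample_block(x)  # 2x upsampling
    x = upsample_block(x)  # another 2x, 4x total
    # Final conv maps the features back to a 3-channel image
    # (the SRGAN paper uses a 9x9 kernel here).
    return layers.Conv2D(3, 9, padding="same")(x)
```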

Best,
brade31919

@mahmed0509
Author

Thanks a lot @brade31919 for the explanation. That was really helpful.
