
Leaky ReLU in ResNet #43

Closed
LeeDoYup opened this issue Oct 2, 2020 · 2 comments

Comments

@LeeDoYup

LeeDoYup commented Oct 2, 2020

Standard ResNets are known to use the ReLU activation function,
but I found that your implementation uses Leaky ReLU instead of ReLU.

Does replacing ReLU with Leaky ReLU affect the results?

@kihyuks

kihyuks commented Nov 5, 2020

Yes, you are right that ReLU is more conventional for ResNet.
Our preliminary results show that the performance does not change much with the specific choice of activation between ReLU and Leaky ReLU, but this is not based on a comprehensive study. You can easily convert Leaky ReLU to ReLU to conduct additional experiments.
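For reference, here is a minimal PyTorch-style sketch (not the repository's actual code) of a pre-activation residual block that takes the activation as a constructor argument, so the Leaky ReLU vs. ReLU ablation becomes a one-line change. The block structure and the 0.1 negative slope are assumptions for illustration:

```python
import torch.nn as nn

class BasicBlock(nn.Module):
    """Hypothetical pre-activation residual block; the activation is injected
    so Leaky ReLU and ReLU can be compared with a one-line change."""

    def __init__(self, in_ch, out_ch, stride=1,
                 activation=lambda: nn.LeakyReLU(0.1)):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(in_ch)
        self.act1 = activation()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.act2 = activation()
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, stride=1, padding=1, bias=False)
        # 1x1 projection on the shortcut when the shape changes
        self.shortcut = (
            nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False)
            if stride != 1 or in_ch != out_ch else nn.Identity()
        )

    def forward(self, x):
        out = self.conv1(self.act1(self.bn1(x)))
        out = self.conv2(self.act2(self.bn2(out)))
        return out + self.shortcut(x)

# Leaky ReLU variant (as in the repo) vs. plain ReLU for the ablation
leaky_block = BasicBlock(16, 32, stride=2)                      # LeakyReLU(0.1)
relu_block  = BasicBlock(16, 32, stride=2, activation=nn.ReLU)  # standard ReLU
```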

@LeeDoYup
Author

LeeDoYup commented Nov 6, 2020

Thank you for the reply!
I conducted an ablation experiment on ReLU vs. Leaky ReLU in my PyTorch implementation (in my repo),
and I confirmed there is no change. Thank you!

@LeeDoYup LeeDoYup closed this as completed Nov 6, 2020