Yes, you are right that ReLU is more conventional for ResNet.
Our preliminary results show that performance does not change much between ReLU and Leaky ReLU, but this is not based on a comprehensive study. You can easily convert Leaky ReLU into ReLU to run additional experiments.
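If it helps, here is a minimal PyTorch sketch of how one might do that swap; the `replace_leaky_relu_with_relu` helper and the `model` argument are illustrative, not part of this repo:

```python
import torch.nn as nn

def replace_leaky_relu_with_relu(model: nn.Module) -> nn.Module:
    """Recursively replace every nn.LeakyReLU in `model` with nn.ReLU."""
    for name, child in model.named_children():
        if isinstance(child, nn.LeakyReLU):
            # Preserve the in-place flag so memory behaviour is unchanged.
            setattr(model, name, nn.ReLU(inplace=child.inplace))
        else:
            replace_leaky_relu_with_relu(child)
    return model
```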
Thank you for the reply!
I ran the ReLU/Leaky ReLU ablation in my PyTorch implementation (in my repo) and confirmed that there is no change in the results. Thank you!
Standard ResNets are known to use the ReLU activation function,
but I found that your implementation uses Leaky ReLU instead.
Does replacing ReLU with Leaky ReLU affect the results?