Hello, thanks for your excellent paper and code!
I have some concerns about the non-linear functions used in ResNet and MobileNet. Could you please provide more details?
In Section 4 of the paper, the authors state that they "use RPReLU [32] as non-linear function":
However, in this code, it seems that you use PReLU for ResNet and ReLU6 for MobileNet. Could this difference seriously affect the accuracy of the quantized models?
OK, I've got it. The authors implement RPReLU with two additional LearnableBias parameters, as follows:
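For reference, here is a minimal PyTorch sketch of that construction. It follows the ReActNet-style convention of wrapping a standard `nn.PReLU` between two per-channel `LearnableBias` modules; the exact class and attribute names (`LearnableBias`, `move1`, `move2`) are an assumption based on common implementations, not necessarily this repo's code.

```python
import torch
import torch.nn as nn


class LearnableBias(nn.Module):
    """Adds a learnable per-channel bias to the input (assumed NCHW layout)."""
    def __init__(self, out_chn):
        super().__init__()
        self.bias = nn.Parameter(torch.zeros(1, out_chn, 1, 1))

    def forward(self, x):
        return x + self.bias


class RPReLU(nn.Module):
    """RPReLU(x) = PReLU(x - gamma) + zeta, with gamma and zeta learned
    per channel. The first LearnableBias learns the input shift (-gamma),
    the second learns the output shift (zeta)."""
    def __init__(self, out_chn):
        super().__init__()
        self.move1 = LearnableBias(out_chn)  # input shift
        self.prelu = nn.PReLU(out_chn)       # per-channel learnable slope
        self.move2 = LearnableBias(out_chn)  # output shift

    def forward(self, x):
        return self.move2(self.prelu(self.move1(x)))
```

At initialization both biases are zero, so the module behaves exactly like a plain per-channel PReLU; the shifts are learned during training.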