About BN parameters #12
Yes, we observed the same degradation without fixing the BN parameters. This is why we fix the Batch Normalization parameters after 10000 episodes. We don't know the true cause of this phenomenon yet; we suspect BN affects the generalization performance of the model. Research on BN is in fact very active at the moment. Thanks.
Thanks a lot for your insightful analysis. Do you think other metric-based few-shot classification models, e.g., Protonet, could also benefit from this fixed-parameter setting?
You are welcome. Yes, we reimplemented Protonet within the same framework, and we found that this setting gives about a 1% performance improvement.
In your code I see that you fix the parameters of Batch Normalization after 1 epoch (10000 episodes), but when I remove this constraint (i.e., call model.train() before training and model.eval() before val/test), the performance drops sharply.
Have you observed the same degradation without fixing the BN params? And why does it happen?
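For readers following along, the "fix the BN parameters" step being discussed can be sketched in PyTorch roughly as follows. This is a minimal illustration, not the repository's actual code: the model, layer sizes, and the `freeze_batchnorm` helper are hypothetical, and the exact episode count at which the authors freeze BN comes from the thread (after 10000 episodes).

```python
import torch
import torch.nn as nn

def freeze_batchnorm(model: nn.Module) -> None:
    """Put every BN layer in eval mode (freezing its running statistics)
    and stop gradient updates to its affine parameters."""
    for m in model.modules():
        if isinstance(m, nn.modules.batchnorm._BatchNorm):
            m.eval()  # running mean/var are no longer updated
            if m.affine:
                m.weight.requires_grad_(False)
                m.bias.requires_grad_(False)

# Hypothetical embedding network with a BN layer.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())
model.train()
_ = model(torch.randn(4, 3, 16, 16))  # BN running stats update here

# After the warm-up phase (e.g. 10000 episodes), freeze BN.
freeze_batchnorm(model)
stats_before = model[1].running_mean.clone()
_ = model(torch.randn(4, 3, 16, 16))  # running stats no longer change
assert torch.equal(model[1].running_mean, stats_before)
```

One caveat: a later call to `model.train()` flips the BN layers back into training mode, so in a real training loop you would need to re-apply `freeze_batchnorm` after each `model.train()` call (or override `train()` in the model class).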