When `model.eval()` is set, the BN layers' weights and biases are fixed, but will the running mean and var still change during fine-tuning? Is there any effect if they change, or should they be kept fixed by setting momentum=0?
In my model, I have set `requires_grad = False` for all batch-norm parameters, so the weights of all BN layers remain fixed. `model.eval()` is also used, which keeps the running mean and var fixed.
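A minimal PyTorch sketch of this approach (the small model here is a hypothetical example, not the issue author's network). It freezes the BN affine parameters with `requires_grad_(False)` and puts the BN modules in eval mode, then checks that a forward pass leaves the running statistics untouched:

```python
import torch
import torch.nn as nn

# Hypothetical example model containing a BatchNorm layer.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8),
    nn.ReLU(),
)

# Freeze BN: requires_grad_(False) stops weight/bias updates,
# .eval() stops running_mean / running_var from being updated.
for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        m.eval()
        for p in m.parameters():
            p.requires_grad_(False)

bn = model[1]
mean_before = bn.running_mean.clone()
model(torch.randn(4, 3, 16, 16))  # forward pass as during fine-tuning
# Running stats are unchanged because the BN module is in eval mode.
assert torch.equal(bn.running_mean, mean_before)
```

Note that a later call to `model.train()` would put the BN modules back into training mode and the running stats would start updating again, so the `.eval()` freeze has to be re-applied after each `model.train()` call.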