
model.eval() will change mean and var #23

Closed
luhc15 opened this issue Dec 18, 2017 · 2 comments

Comments

luhc15 commented Dec 18, 2017

When model.eval() is set, the BN layers' weights and bias are fixed, but the running mean and var still change during fine-tuning. Does it matter if the mean and var change, or should they be kept fixed by setting momentum=0?

omkar13 commented Apr 18, 2018

When using model.eval(), the running mean and variance are fixed to their pretrained values. The BN layers' weights and bias can be fixed separately by setting requires_grad=False on them.
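A minimal sketch of that combination, assuming standard PyTorch BatchNorm layers (the freeze_bn helper name is illustrative, not from this repo):

```python
import torch
import torch.nn as nn

def freeze_bn(model):
    # Put every BatchNorm layer in eval mode so its running mean/var
    # stop updating, and set requires_grad=False on its affine
    # weight/bias so the optimizer cannot change them either.
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            m.eval()
            for p in m.parameters():
                p.requires_grad = False

# Demo: running stats move in train mode, then stay fixed after freezing.
bn = nn.BatchNorm2d(3)
x = torch.randn(4, 3, 8, 8)

bn.train()
bn(x)  # training mode: running_mean/running_var are updated
mean_after_train = bn.running_mean.clone()

freeze_bn(bn)
bn(x + 1.0)  # eval mode: running stats no longer change
assert torch.equal(bn.running_mean, mean_after_train)
```

Note that freeze_bn must be called again after any later model.train() call, since train() flips the BN layers back into updating mode.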

isht7 (Owner) commented Apr 22, 2018
In my model, I have set requires_grad = False for all batch-norm parameters, so the weights of all BN layers remain fixed. model.eval() is also used, which keeps the running mean and var fixed.
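Both steps are needed because they freeze different things. A small sketch (plain PyTorch, not code from this repo) showing that eval() alone does not stop gradients reaching the BN affine parameters:

```python
import torch
import torch.nn as nn

x = torch.randn(2, 3, 4, 4)

# eval() freezes only the running-stat updates ...
bn = nn.BatchNorm2d(3)
bn.eval()
bn(x).sum().backward()
# ... the affine weight still receives a gradient:
assert bn.weight.grad is not None

# Adding requires_grad=False freezes the parameters too.
bn2 = nn.BatchNorm2d(3)
bn2.eval()
for p in bn2.parameters():
    p.requires_grad = False
out = bn2(x).sum()
# Nothing in the graph requires grad any more:
assert not out.requires_grad
```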

isht7 closed this as completed Apr 22, 2018