
Why not change the training mode to 'False' in the line # 339? #9

Closed
zem007 opened this issue Oct 16, 2018 · 1 comment

Comments

zem007 commented Oct 16, 2018

Hi, thanks for sharing. I learned a lot!
I am a little confused about the training mode in Model.py (line 339). To get the dice score for the validation set, shouldn't we change the training mode from 'True' to 'False'? Since the weights have already been trained on the training set, we should not train any weights on the validation set, right?

I am a new learner. I would be very grateful for a reply.
Thanks a lot!

HasnainRaz (Owner) commented Oct 16, 2018

Thanks for the interest. The variable "training" only determines the mode in which the dropout and batch_norm layers operate; it does not enable or disable training of the weights. If it is set to True, batch_norm computes its statistics from each batch; if it is set to False, the statistics learned during training are used. The authors of the original paper say that they keep this set to True even when evaluating.

You can of course set it to False, and you are correct that strictly speaking it should be False during evaluation. I set it to True because the authors of the paper did so, simply because it gave them better results, which might not be the case for you. Feel free to experiment with it.
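For illustration, here is a minimal TensorFlow 1.x sketch of what such a flag typically controls, assuming the tf.layers API; the function and placeholder names are hypothetical and not taken from Model.py:

```python
import tensorflow as tf

def conv_block(inputs, filters, training):
    """Illustrative block: `training` only switches dropout/batch-norm behavior.

    training=True  -> batch_norm uses per-batch statistics, dropout is active.
    training=False -> batch_norm uses the moving averages accumulated during
                      training, dropout becomes a no-op.
    Neither setting updates or freezes the trainable weights by itself;
    weights only change when an optimizer's train op is run.
    """
    x = tf.layers.conv2d(inputs, filters, kernel_size=3, padding="same")
    x = tf.layers.batch_normalization(x, training=training)
    x = tf.nn.relu(x)
    x = tf.layers.dropout(x, rate=0.2, training=training)
    return x

# Feed the flag at run time so the same graph serves both modes
# (names here are illustrative, not from the repo).
training_ph = tf.placeholder(tf.bool, name="training")
images = tf.placeholder(tf.float32, [None, 128, 128, 1], name="images")
features = conv_block(images, filters=16, training=training_ph)

# sess.run(dice, feed_dict={images: val_batch, training_ph: False})  # conventional eval
# sess.run(dice, feed_dict={images: val_batch, training_ph: True})   # what the paper's authors report using
```

Either feed value runs the same validation graph; the only difference is whether batch_norm normalizes with the current batch's statistics or with the moving averages it learned during training.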
