I haven't encountered this kind of value updating in other projects. Does it originate from PSPNet, is this really the standard way to use momentum, or is it something else entirely?
Thanks,
Tamme
Hey @Tamme, let me explain the batch normalization layer first. A batch normalization layer has four variables: moving_mean, moving_variance, gamma, and beta. moving_mean and moving_variance are not trainable variables, so we need to update them with the update ops that are collected in tf.GraphKeys.UPDATE_OPS (you can take a look at the TensorFlow docs). That's why I use the flag --update-mean-var to decide whether to update the mean and variance: updating them works better with a large batch size, so when training with mini-batches we can freeze these two variables for better results.
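To make the mechanics concrete, here is a minimal NumPy sketch of what those update ops do: the moving statistics are an exponential moving average of the per-batch statistics, controlled by a momentum (decay) factor. The function name `batchnorm_forward` and the `update_mean_var` argument (mirroring the repo's --update-mean-var flag) are illustrative, not the actual implementation:

```python
import numpy as np

def batchnorm_forward(x, moving_mean, moving_var, gamma, beta,
                      momentum=0.9, update_mean_var=True,
                      training=True, eps=1e-5):
    """Minimal batch-norm sketch.

    moving_mean / moving_var are the non-trainable statistics;
    gamma / beta are the trainable scale and shift.
    """
    if training:
        # Normalize with the statistics of the current batch.
        batch_mean = x.mean(axis=0)
        batch_var = x.var(axis=0)
        if update_mean_var:
            # Exponential moving average -- this is what the ops
            # collected in tf.GraphKeys.UPDATE_OPS compute.
            moving_mean = momentum * moving_mean + (1 - momentum) * batch_mean
            moving_var = momentum * moving_var + (1 - momentum) * batch_var
        x_hat = (x - batch_mean) / np.sqrt(batch_var + eps)
    else:
        # At inference time, use the accumulated moving statistics.
        x_hat = (x - moving_mean) / np.sqrt(moving_var + eps)
    return gamma * x_hat + beta, moving_mean, moving_var
```

With `update_mean_var=False` the moving statistics are left untouched (frozen), which is the mini-batch case described above; with a large batch size the per-batch statistics are reliable enough that letting the EMA track them pays off.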