
Batch Normalization Momentum? #695

Closed
n3011 opened this Issue Mar 7, 2016 · 3 comments

n3011 commented Mar 7, 2016

For the batch normalization layer, what does the line "The running sum is kept with a default momentum of 0.1" mean? How is momentum used here to update the mean and variance?

Contributor

colesbury commented Mar 7, 2016

The running mean and variance are computed using an exponential moving average with smoothing factor 0.1, i.e.

running_mean[t] = running_mean[t-1] * 0.9 + batch_mean[t] * 0.1
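The update above can be sketched in a few lines. This is an illustrative sketch only, not the library's actual source; the function name and signature are made up for the example:

```python
# Smoothing factor ("momentum") as described in the answer above.
MOMENTUM = 0.1

def update_running_stats(running_mean, running_var, batch_mean, batch_var,
                         momentum=MOMENTUM):
    """Blend the current batch statistics into the running estimates
    via an exponential moving average."""
    new_mean = (1 - momentum) * running_mean + momentum * batch_mean
    new_var = (1 - momentum) * running_var + momentum * batch_var
    return new_mean, new_var

# Example: starting from a running mean of 0.0, a batch mean of 1.0
# moves the running estimate 10% of the way toward the batch value.
m, v = update_running_stats(0.0, 1.0, 1.0, 1.0)
print(m)  # 0.1
```

At inference time it is these running estimates, not per-batch statistics, that are used to normalize the input.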

n3011 commented Mar 7, 2016

Thanks @colesbury, it's really helpful.

n3011 closed this Mar 7, 2016

Contributor

Atcold commented Mar 7, 2016

@n3011, perhaps you'd like to improve the documentation with @colesbury's answer 😬
