
Support real global batch normalisation #281

Closed
luomai opened this issue May 7, 2020 · 2 comments
luomai commented May 7, 2020

TensorFlow 2.2 supports the following feature:

Support added for global sync BatchNormalization by using the newly added tf.keras.layers.experimental.SyncBatchNormalization layer. This layer will sync BatchNormalization statistics every step across all replicas taking part in sync training.

It would be great if KungFu can support it as well.
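To make the requested behaviour concrete, here is a minimal sketch (not KungFu's or TensorFlow's actual implementation) of what "global" sync batch normalisation computes: each replica contributes its local sum, sum of squares, and sample count, the partial statistics are combined with an all-reduce, and every replica then normalises with the same global mean and variance. The `all_reduce_sum` helper below is a hypothetical stand-in for a real collective-communication primitive.

```python
def all_reduce_sum(values):
    """Hypothetical stand-in for a cross-replica all-reduce (sum).

    In a real distributed setting each replica holds one entry and the
    collective returns the global sum to every replica.
    """
    total = sum(values)
    return [total for _ in values]


def sync_batch_norm(replica_batches, eps=1e-5):
    """Normalise each replica's batch using statistics over ALL replicas."""
    # Per-replica partial statistics.
    sums = [sum(b) for b in replica_batches]
    sq_sums = [sum(x * x for x in b) for b in replica_batches]
    counts = [len(b) for b in replica_batches]

    # All-reduce so every replica sees the same global statistics.
    global_sum = all_reduce_sum(sums)[0]
    global_sq_sum = all_reduce_sum(sq_sums)[0]
    global_count = all_reduce_sum(counts)[0]

    mean = global_sum / global_count
    var = global_sq_sum / global_count - mean * mean

    # Every replica normalises with the global mean/variance, so the result
    # matches single-worker batch norm over the full global batch.
    return [[(x - mean) / (var + eps) ** 0.5 for x in b]
            for b in replica_batches]
```

This is exactly the step that plain per-replica `BatchNormalization` skips: without the all-reduce, each replica would normalise with statistics of its local shard only.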


luomai commented May 7, 2020

@lgarithm @marwage @fertakis

@luomai luomai self-assigned this May 7, 2020

luomai commented Jun 26, 2020

I recently had a discussion with a friend at Google. It seems that people are moving to Group Normalisation and other normalisation approaches instead of Batch Norm, because Batch Norm is indeed difficult to synchronise in a distributed setting. Batch Norm is also difficult to apply in an RL scenario because it over-weights the samples within an online trajectory.
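The key property behind this argument is that Group Normalisation computes statistics per sample (over groups of channels), not over the batch, so replicas never need to exchange statistics. A minimal sketch, with illustrative names and a flat channel vector standing in for a real feature map:

```python
def group_norm(sample, num_groups, eps=1e-5):
    """Normalise one sample's channel vector within each channel group.

    No batch dimension is involved, so nothing needs to be synchronised
    across replicas in distributed training.
    """
    channels = len(sample)
    assert channels % num_groups == 0, "channels must divide evenly into groups"
    group_size = channels // num_groups

    out = []
    for g in range(num_groups):
        group = sample[g * group_size:(g + 1) * group_size]
        mean = sum(group) / group_size
        var = sum((x - mean) ** 2 for x in group) / group_size
        out.extend((x - mean) / (var + eps) ** 0.5 for x in group)
    return out
```

Because each sample is normalised independently, the result is identical whether the batch is processed on one worker or sharded across many, which is what removes the need for the cross-replica sync that Batch Norm requires.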

@luomai luomai closed this as completed Jul 4, 2020