TensorFlow 2.2 adds support for global sync BatchNormalization via the newly added tf.keras.layers.experimental.SyncBatchNormalization layer. At every step, this layer synchronises BatchNormalization statistics across all replicas taking part in sync training.
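To illustrate what the layer does under the hood, here is a minimal NumPy sketch of synchronous batch normalisation. The function name and the list-of-batches representation of replicas are my own assumptions for illustration; a real implementation would combine the per-replica statistics with an all-reduce rather than a local sum.

```python
import numpy as np

def sync_batch_norm(replica_batches, eps=1e-5):
    """Normalise each replica's batch with *global* batch statistics.

    replica_batches: list of (num_samples_i, num_features) arrays, one
    per replica. In a real distributed setting the three local
    quantities below would be combined via an all-reduce; here we
    simulate that by summing over the list.
    """
    counts = [b.shape[0] for b in replica_batches]
    sums = [b.sum(axis=0) for b in replica_batches]
    sq_sums = [(b * b).sum(axis=0) for b in replica_batches]

    n = sum(counts)                      # "all-reduce" of sample counts
    mean = sum(sums) / n                 # "all-reduce" of feature sums
    var = sum(sq_sums) / n - mean ** 2   # global var = E[x^2] - E[x]^2

    # Every replica normalises with the same global mean/variance.
    return [(b - mean) / np.sqrt(var + eps) for b in replica_batches]
```

The key point is that only three small tensors (count, sum, sum of squares) need to cross the network, after which every replica normalises identically, as if the whole global batch had been processed on one device.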
It would be great if KungFu can support it as well.
I recently had a discussion with a friend at Google. It seems people are moving to Group Normalisation or other normalisation approaches instead of Batch Norm, because Batch Norm is indeed difficult to synchronise in a distributed setting. Batch Norm is also difficult to apply in an RL scenario, because it over-weights the samples in an online trajectory.
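To make the contrast concrete, here is a minimal NumPy sketch of Group Normalisation (the function name and 2-D input layout are my own assumptions). Its statistics are computed per sample over groups of channels, so unlike Batch Norm it needs no cross-replica communication at all:

```python
import numpy as np

def group_norm(x, num_groups, eps=1e-5):
    """Group Normalisation over a (batch, channels) input.

    Channels are split into num_groups groups; mean and variance are
    computed per sample and per group, so the result is independent of
    batch size and of how the batch is sharded across replicas.
    """
    n, c = x.shape
    assert c % num_groups == 0, "channels must divide evenly into groups"
    g = x.reshape(n, num_groups, c // num_groups)
    mean = g.mean(axis=2, keepdims=True)
    var = g.var(axis=2, keepdims=True)
    return ((g - mean) / np.sqrt(var + eps)).reshape(n, c)
```

Because each sample is normalised on its own, the per-replica result is identical whether a batch is processed on one device or split across many, which is why no sync step is required.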