Hi,
Thank you for uploading the code.
I understand that EMAM is implemented in the momentum_update(self, cur_iter, max_iter) method of the BYOLEMAM class.
Since the minibatch mean (mu) and variance (sigma^2) are not trainable parameters of the BN layer in PyTorch, do they get the 'momentum update' via the running_mean and running_var buffers of the BatchNorm layer?
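For context, here is a minimal sketch (not the repository's actual code) of what I mean. Since running_mean and running_var are registered as buffers rather than parameters, an EMA loop over model.parameters() would skip them; one common choice (e.g. in BYOL-style implementations) is to copy the buffers directly, as sketched below:

```python
import torch
import torch.nn as nn

def ema_update(online: nn.Module, target: nn.Module, m: float = 0.99) -> None:
    # Trainable parameters (weights, BN affine gamma/beta) get the momentum update.
    for p_online, p_target in zip(online.parameters(), target.parameters()):
        p_target.data.mul_(m).add_(p_online.data, alpha=1.0 - m)
    # BN running_mean / running_var (and num_batches_tracked) are buffers,
    # not parameters, so parameters() does not yield them. Here they are
    # copied directly rather than momentum-averaged.
    for b_online, b_target in zip(online.buffers(), target.buffers()):
        b_target.data.copy_(b_online.data)

online = nn.Sequential(nn.Linear(4, 4), nn.BatchNorm1d(4))
target = nn.Sequential(nn.Linear(4, 4), nn.BatchNorm1d(4))
online(torch.randn(8, 4))  # forward pass updates the online BN running stats
ema_update(online, target)
print(torch.allclose(target[1].running_mean, online[1].running_mean))  # True
```

Is this (direct copy of the BN buffers) what happens in your implementation, or are the running statistics also momentum-averaged?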
Regards