Does it make sense to do spectral normalization before batch normalization? #9
Comments
@richardwth Did you end up experimenting with any other order?
But I think
No... std(x * c) = std(x) * c.
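The identity std(x * c) = std(x) * c is what the cancellation argument below rests on, so a quick numerical sanity check (a minimal numpy sketch, not from the project's code) may help:

```python
import numpy as np

# Standard deviation is homogeneous: scaling every sample by a constant c
# scales the std by the same constant. This is why a constant factor on the
# pre-activations cancels inside batch normalization.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
c = 3.7
assert np.isclose(np.std(x * c), np.std(x) * c)
```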
And why were the two used at the same time? I don't understand why spectral norm and batch norm are used together. Can someone explain?
They are orthogonal in their effect: SN changes the weights of the layer, while BN changes the activations.
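To make that contrast concrete, here is a minimal numpy sketch (function names are mine, not from the paper's code; BN is shown with gamma = 1, beta = 0 and no eps): SN divides the weight matrix by its largest singular value, estimated by power iteration, while BN standardizes the resulting activations per feature over the batch.

```python
import numpy as np

def spectral_norm(w, n_iter=50):
    """Estimate the largest singular value of w via power iteration."""
    rng = np.random.default_rng(1)
    u = rng.normal(size=w.shape[0])
    v = w.T @ u
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        u = w @ v
        u /= np.linalg.norm(u)
        v = w.T @ u
        v /= np.linalg.norm(v)
    return float(u @ w @ v)  # Rayleigh estimate of sigma_max(w)

rng = np.random.default_rng(0)
w = rng.normal(size=(16, 8))   # layer weight matrix
x = rng.normal(size=(32, 16))  # a batch of inputs to the layer

# SN acts on the weights...
w_sn = w / spectral_norm(w)
# ...while BN acts on the activations (gamma = 1, beta = 0, no eps).
a = x @ w_sn
a_bn = (a - a.mean(axis=0)) / a.std(axis=0)
```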
Hello! Thank you for your contribution to generative adversarial network research and for sharing your code! I am from China, and it is Chinese New Year now, so I wish you a happy Chinese New Year! I am very interested in your paper. When I try to add spectral normalization to my new network, the program gives the following error: I have searched for a lot of information but could not solve it, so I wanted to ask you. I wish you a happy life and look forward to your reply!
Would the spectral norm get canceled out, because it appears in both the numerator and denominator of the batch normalization equation?
I mean, writing s = sn(w), both mean(x*w/s) = mean(x*w)/s and std(x*w/s) = std(x*w)/s, so the factor s cancels:
bn(x*w/sn(w)) = gamma * (x*w/sn(w) - mean(x*w/sn(w))) / std(x*w/sn(w)) + beta = gamma * (x*w - mean(x*w)) / std(x*w) + beta = bn(x*w)
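The cancellation above is easy to verify numerically. A minimal numpy check (with gamma = 1, beta = 0 and eps = 0; in practice a nonzero eps makes the equality only approximate):

```python
import numpy as np

def batch_norm(a):
    # BN with gamma = 1, beta = 0, eps = 0, matching the argument above.
    return (a - a.mean(axis=0)) / a.std(axis=0)

rng = np.random.default_rng(0)
x = rng.normal(size=(64, 10))
w = rng.normal(size=(10, 4))
sigma = np.linalg.norm(w, 2)  # sn(w): the largest singular value of w

# Dividing the weights by sn(w) scales the per-feature mean and std of the
# pre-activations by the same factor, so the BN output is unchanged.
assert np.allclose(batch_norm(x @ (w / sigma)), batch_norm(x @ w))
```

Note that this only says BN erases SN's effect on the *forward output* of that layer; SN still constrains the weights themselves and hence the gradients flowing through them.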