Gradient Penalty is very high in the start #46
Comments
@inspirit hey, so i also noticed that. however, i believe when i turned off the multi-scale (one of the main contributions of the paper), the gradients resembled stylegan's, so i attributed it to that. did you close the issue because your training was successful?
hey! i closed it because the results were fine, but i did not train it for long; i'm borrowing different ideas from various GAN implementations for some experiments. the GigaGAN discriminator is definitely not the most stable, and D/MSD/GP fluctuate a lot
@inspirit yes, i think it is the multi-scale causing it, from the last time i was training on my toy datasets. if you turn it off by setting the loss weight to 0, the training resembles stylegan a lot (could be remembering wrong)
so what's the main purpose of having the multi-scale discriminator in this case?
@inspirit i think it is necessary to achieve the paper's results
@inspirit are you a phd student btw? we run into each other a lot
Haha, I'm long past that age, I believe :) Just a freelance researcher and technical adviser for CV/ML stuff
@inspirit ah got it :) well, glad to have you reviewing my code from time to time!
in my opinion adaptive mod-conv is more useful and can be fitted into lots of ideas...
@inspirit yea, i think both may be important
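The "turn it off by setting the loss weight to 0" suggestion above can be sketched generically. Note this is a minimal illustration of the weighting idea, not the repository's actual API: the parameter names (`multiscale_loss_weight`, `gp_weight`) and the default values are hypothetical.

```python
def total_discriminator_loss(d_loss, multiscale_d_loss, gp,
                             multiscale_loss_weight=1.0, gp_weight=10.0):
    # Hypothetical weighting scheme: setting multiscale_loss_weight to 0
    # zeroes out the multi-scale discriminator term, so the remaining
    # objective resembles a plain StyleGAN-style discriminator loss.
    return d_loss + multiscale_loss_weight * multiscale_d_loss + gp_weight * gp

# With the multi-scale term switched off, only d_loss and the gradient
# penalty contribute: 1.5 + 0 * 3.0 + 10 * 0.02 = 1.7
loss = total_discriminator_loss(d_loss=1.5, multiscale_d_loss=3.0, gp=0.02,
                                multiscale_loss_weight=0.0)
```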
Hi!
i was running a few experiments and noticed that GP is extremely high in the first few hundred steps:
GP > 60000, then it gradually goes down to around GP = 20.
is this normal behaviour? In my previous experience with StyleGAN, GP was small from the beginning
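One plausible intuition for the huge early values, sketched as a toy example (an assumption on my part, not the repository's actual penalty code): R1-style gradient penalties are the squared norm of D's input gradient at real samples, so a freshly initialized discriminator whose gradients are poorly scaled can produce enormous penalties that shrink as regularization takes effect. For a linear D(x) = w·x, the input gradient is just w, so the penalty reduces to ||w||²:

```python
def r1_penalty_linear(w):
    # For a linear discriminator D(x) = sum(w_i * x_i), the gradient of D
    # with respect to the input x is w itself, so the R1 penalty
    # ||grad_x D(x)||^2 is simply the squared norm of w, independent of x.
    return sum(wi * wi for wi in w)

# Large effective weights at initialization -> huge penalty
early = r1_penalty_linear([100.0] * 8)   # 100^2 * 8 = 80000.0

# Training shrinks the gradient norm -> small penalty
late = r1_penalty_linear([1.5] * 8)      # 1.5^2 * 8 = 18.0
```

This is only an illustration of the scaling effect, not an explanation specific to the multi-scale discriminator discussed above.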