About ContraGAN 2C loss #85
Comments
Hi, the additional temperature (t) is multiplied in to balance adversarial training and class conditioning. However, as you can see in src/configs, all temperatures in ContraGAN.json are fixed at 1.0. Sorry for the late reply. Thank you.
Thanks for your reply, but I still don't understand.
Hello. I noticed that multiplying by both the temperature and self.contrastive_lambda is redundant. I will revise the code in the main branch (it is actually already reflected in the renew_cfgs branch, worker.py line 293, which will officially be released at the end of this month). Thank you.
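To see why the extra factor is redundant, note that -log(t · x) = -log(x) - log(t): a constant temperature inside the log only shifts the loss by a constant and leaves the gradient untouched, while the overall scale is already set by contrastive_lambda outside the loss. A minimal, dependency-free check (the values of r and t here are hypothetical):

```python
import math

def loss_plain(r):
    # -log of the contrastive ratio
    return -math.log(r)

def loss_scaled(r, t):
    # same ratio, but with the extra temperature factor inside the log
    return -math.log(t * r)

r, t, eps = 0.7, 0.5, 1e-6

# Finite-difference gradients w.r.t. the ratio r
g_plain = (loss_plain(r + eps) - loss_plain(r - eps)) / (2 * eps)
g_scaled = (loss_scaled(r + eps, t) - loss_scaled(r - eps, t)) / (2 * eps)

# Gradients agree; the losses differ only by the constant -log(t)
print(abs(g_plain - g_scaled) < 1e-5)                                  # True
print(abs((loss_scaled(r, t) - loss_plain(r)) + math.log(t)) < 1e-9)   # True
```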
OK. Thanks a lot!
This is the 2C loss. When I read the code, I found that it multiplies by an extra temperature:
```python
denomerator = torch.cat([torch.unsqueeze(inst2proxy_positive, dim=1), instance_zone], dim=1).sum(dim=1)
criterion = -torch.log(temperature * (numerator / denomerator)).mean()
```
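For context, here is a runnable sketch of the quoted computation. The tensor names follow the snippet, but the shapes and values are hypothetical stand-ins for the exponentiated similarity terms (positive instance-to-proxy pair and the same-class "instance zone"); this is not the full ContraGAN loss:

```python
import torch

torch.manual_seed(0)
batch, zone = 4, 3

# Hypothetical pre-computed terms, already exponentiated:
# numerator     ~ exp(sim(instance, class proxy) / t)
# instance_zone ~ exp(sim(instance, same-class instances) / t)
numerator = torch.rand(batch) + 0.1
inst2proxy_positive = numerator          # positive pair also enters the denominator
instance_zone = torch.rand(batch, zone) + 0.1
temperature = 1.0                        # fixed at 1.0 in ContraGAN.json

# Quoted code: positive term concatenated with the zone, then summed per sample
denomerator = torch.cat([torch.unsqueeze(inst2proxy_positive, dim=1),
                         instance_zone], dim=1).sum(dim=1)
criterion = -torch.log(temperature * (numerator / denomerator)).mean()

# Dropping the extra factor changes nothing when temperature == 1.0
criterion_no_t = -torch.log(numerator / denomerator).mean()
print(torch.allclose(criterion, criterion_no_t))  # True
```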