
Incorrect discriminator update for opt.use_same_D #93

Closed
SanderGielisse opened this issue Dec 6, 2020 · 1 comment

@SanderGielisse
Hi, I believe there is a small mistake in the way the use_same_D option is processed. When this option is enabled, in bicycle_gan_model.py, self.backward_D is called twice, both times on self.netD. Since backward() is then called twice without retain_graph=True, the first backward pass is overwritten, so D is only updated on self.real_data_random and self.fake_data_random, not on self.real_data_encoded and self.fake_data_encoded. Either retain_graph=True should be used, or the backward pass should incorporate both losses, e.g. (self.loss_D + self.loss_D2).backward().

if self.opt.use_same_D:

@SanderGielisse (Author)

My bad; it turns out that when backward() is called a second time, the new gradients accumulate into .grad rather than overwriting it, so the implementation is correct. Closing this issue.
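The accumulation behavior can be checked with a minimal PyTorch sketch (the tensors and losses here are illustrative, not from the BicycleGAN code). Two losses built from the same parameter in separate forward passes each get their own graph, so no retain_graph=True is needed, and the second backward() adds to .grad instead of replacing it:

```python
import torch

# A single parameter standing in for the weights of self.netD.
w = torch.tensor(2.0, requires_grad=True)

# Two independent forward passes -> two separate graphs,
# analogous to the two self.backward_D calls on self.netD.
loss1 = 3 * w  # d(loss1)/dw = 3
loss2 = 5 * w  # d(loss2)/dw = 5

loss1.backward()  # w.grad is now 3
loss2.backward()  # gradients accumulate: w.grad is now 3 + 5 = 8

print(w.grad.item())  # → 8.0
```

This is equivalent to calling (loss1 + loss2).backward() once, which is why the existing two-call implementation updates D on both the encoded and the random pairs as intended (as long as optimizer.zero_grad() is only called once before both passes).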
