
Please explain trick 2 #11

Closed
mjdietzx opened this issue Jan 10, 2017 · 3 comments

Comments

@mjdietzx commented Jan 10, 2017

https://github.com/soumith/ganhacks#2-a-modified-loss-function

When training the generator, the standard way is to pass [batch_generated_imgs, batch_real_imgs] together with np.ones(shape=batch_size * 2), i.e. all images labelled 1 (real), to trick the discriminator.

If I understand this trick correctly, it is saying to pass [batch_generated_imgs, batch_real_imgs] and np.concatenate([np.ones(batch_size), np.zeros(batch_size)]), so that the labels are now flipped for fake and real?
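For concreteness, the two labelling schemes I mean, as a numpy sketch (batch_size is a placeholder):

```python
import numpy as np

batch_size = 32  # placeholder

# Standard generator update: everything labelled 1 ("real"),
# pushing the generator to make the discriminator call fakes real.
y_standard = np.ones(2 * batch_size)

# My reading of trick 2: flipped labels, so generated images get 1 ("real")
# and real images get 0 ("fake").
y_flipped = np.concatenate([np.ones(batch_size),    # batch_generated_imgs
                            np.zeros(batch_size)])  # batch_real_imgs
```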

@spurra commented Jan 11, 2017

Trick 2 states that when training the generator, instead of minimising log(1 - D(G(z))) you maximise log(D(G(z))), so that you get better gradients. This is because the discriminator usually performs better than the generator, so log(1 - D(G(z))) saturates and its gradient vanishes. This is most easily done, for example in torch with nn.CrossEntropyLoss, by assigning the synthetic samples the label 1 and training on that.
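In code, roughly (a sketch with toy stand-in models, using nn.BCELoss on a sigmoid output rather than nn.CrossEntropyLoss; all names are placeholders):

```python
import torch
import torch.nn as nn

# Toy stand-ins so the snippet runs; a real setup would use conv nets.
nz, batch_size = 100, 32
G = nn.Sequential(nn.Linear(nz, 784), nn.Tanh())    # generator: noise -> image
D = nn.Sequential(nn.Linear(784, 1), nn.Sigmoid())  # discriminator: image -> p(real)

criterion = nn.BCELoss()
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)

z = torch.randn(batch_size, nz)
labels = torch.ones(batch_size, 1)  # label the synthetic samples 1 ("real")

# BCE with target 1 is -log(D(G(z))); minimising it maximises log(D(G(z))),
# i.e. the non-saturating loss, instead of minimising log(1 - D(G(z))).
loss_G = criterion(D(G(z)), labels)

opt_G.zero_grad()
loss_G.backward()
opt_G.step()
```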

@soumith (Owner) commented Jan 12, 2017

@mjdietzx you are correct in your understanding

soumith closed this as completed Jan 12, 2017
@hellojialee

Hi @mjdietzx @soumith, could you explain this in more detail? Should we always flip the labels when training the Generator (if I understand correctly, the Discriminator is fixed at the same time)? In that case, doesn't trick 2's labelling of [batch_real_imgs] with np.zeros(shape=batch_size) actually destroy the Discriminator?
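For concreteness, here is the kind of generator step I am picturing (a self-contained sketch with placeholder models; the Discriminator's parameters are frozen, so only G is updated here):

```python
import torch
import torch.nn as nn

nz, batch_size = 100, 32  # placeholder sizes
G = nn.Sequential(nn.Linear(nz, 784), nn.Tanh())     # placeholder generator
D = nn.Sequential(nn.Linear(784, 1), nn.Sigmoid())   # placeholder discriminator
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
criterion = nn.BCELoss()

# Freeze D for the generator step: the flipped labels only shape the
# gradient that flows back into G; D's weights are not updated here.
for p in D.parameters():
    p.requires_grad_(False)

z = torch.randn(batch_size, nz)
loss_G = criterion(D(G(z)), torch.ones(batch_size, 1))  # fakes labelled "real"
opt_G.zero_grad()
loss_G.backward()
opt_G.step()

for p in D.parameters():
    p.requires_grad_(True)  # unfreeze D before its own update
```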
