I have a quick question on the GAN example. In my opinion the variable should be detached, as in `fake_images = G(z).detach()`, so that training the discriminator does not influence the generator's behaviour at this stage. Or do you have specific considerations here?
Thank you!
@dolaameng No, it doesn't matter, because we only train the discriminator with `d_optimizer.step()` and reset the gradient buffers before calling `g_loss.backward()`. Using `fake_images = Variable(G(z).detach().cuda())` saves memory and speeds up training, but in my experience the difference is trivial.
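The point above can be verified with a minimal sketch, assuming PyTorch is available. The tiny `nn.Linear` modules standing in for `G` and `D`, and the names `z`, `fake`, `d_loss`, are placeholders, not the tutorial's actual models. Without `.detach()`, the discriminator's backward pass does fill `G`'s gradient buffers, but since only the discriminator's optimizer steps and the buffers are zeroed before the generator step, the leaked gradients never affect `G`'s parameters:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical tiny generator and discriminator, just to illustrate gradient flow.
G = nn.Linear(4, 4)
D = nn.Linear(4, 1)

z = torch.randn(8, 4)

# --- Discriminator step WITHOUT detach ---
# d_loss.backward() backpropagates through G as well, filling G's .grad buffers.
fake = G(z)
d_loss = D(fake).mean()
d_loss.backward()
grads_leaked = G.weight.grad is not None  # gradients did flow into G

# Only d_optimizer.step() would run here, so G's parameters are unchanged.
# Zeroing the buffers before the generator step discards the leaked gradients.
G.zero_grad()

# --- Discriminator step WITH detach ---
# Backprop stops at the generator's output: the pass is cheaper in time and
# memory, and G's gradient buffers stay clean.
fake = G(z).detach()
d_loss = D(fake).mean()
d_loss.backward()
grads_clean = (G.weight.grad is None
               or G.weight.grad.abs().sum().item() == 0.0)

print(grads_leaked, grads_clean)  # → True True
```

Either way the generator's parameters end up the same; `.detach()` only skips the wasted backward computation through `G`.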