
GAN training #51

Closed
SimKarras opened this issue Aug 25, 2021 · 2 comments

Comments


SimKarras commented Aug 25, 2021

Hello, I have a question that has been puzzling me. In GAN training, referring to https://github.com/rosinality/stylegan2-pytorch/blob/master/train.py
when training the Discriminator, rosinality disables gradient updates for the Generator:

        requires_grad(generator, False)
        requires_grad(discriminator, True)

Similarly, when training the Generator, gradient updates for the Discriminator are also disabled:

        requires_grad(generator, True)
        requires_grad(discriminator, False)

In your code I only found gradient control for the Discriminator; I did not find any adjustment of the Generator's gradients:

        for p in self.net_d.parameters():
            p.requires_grad = False

and

        for p in self.net_d.parameters():
            p.requires_grad = True

1. Does this mean the Generator always receives gradients (even while the Discriminator is being trained)? If so, is that equivalent to back-propagating through the Generator twice for every batch of data?
2. If the Generator's gradients are also controlled somewhere, where is that implemented?
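For reference, the `requires_grad` helper mentioned above (from rosinality's train.py) is just a loop over model parameters; a minimal sketch with a stand-in module:

```python
import torch.nn as nn

def requires_grad(model, flag=True):
    # Enable/disable gradient tracking for every parameter of `model`.
    for p in model.parameters():
        p.requires_grad = flag

# Demonstration with a stand-in one-layer "network".
net = nn.Linear(3, 3)

requires_grad(net, False)
assert not any(p.requires_grad for p in net.parameters())

requires_grad(net, True)
assert all(p.requires_grad for p in net.parameters())
```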


xinntao commented Aug 29, 2021

When we update D, the output of G is detach()-ed, so no gradient propagates back to G.
[screenshot of the relevant code]
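To illustrate the point, here is a minimal self-contained sketch (stand-in one-layer models, not the actual networks) showing that `detach()` alone already blocks gradients to G during the D step:

```python
import torch
import torch.nn as nn

g = nn.Linear(4, 4)   # stand-in Generator
d = nn.Linear(4, 1)   # stand-in Discriminator

z = torch.randn(2, 4)
fake = g(z)

# D step: detach() cuts the graph, so backward() stops at `fake`.
d_loss = d(fake.detach()).mean()
d_loss.backward()
assert all(p.grad is None for p in g.parameters())       # G got no gradient
assert all(p.grad is not None for p in d.parameters())   # D did

# G step: no detach(), so gradients flow through D into G.
# (D's parameters also accumulate gradients here, which is why the
# requires_grad toggle on D is still useful in the real training loop.)
g_loss = d(fake).mean()
g_loss.backward()
assert all(p.grad is not None for p in g.parameters())
```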

SimKarras (Author) commented

I see now. Thank you very much for clearing that up!
