
Why does netG generate different data for same input? #29

Closed
arunpatro opened this issue Sep 3, 2018 · 1 comment

Comments

@arunpatro

In many places, the fake images are generated via:

fake_imgs, _, _, _ = netG(noise, sent_emb, words_embs, mask)

netG is an instance of the G_NET class defined in https://github.com/taoxugit/AttnGAN/blob/master/code/model.py#L397.

When I keep noise, sent_emb, words_embs and mask constant and rerun the generation, I get different fake images. Shouldn't the model produce the same output for the same input? Is there some stochastic behaviour in G_NET?

@taoxugit
Owner

taoxugit commented Oct 4, 2018

The randomness comes from the CA_NET class, which implements the Conditioning Augmentation proposed by this paper.

import torch
import torch.nn as nn
from torch.autograd import Variable

# cfg and GLU are defined elsewhere in the repo (code/miscc/config.py and code/model.py)


class CA_NET(nn.Module):
    # some code is modified from vae examples
    # (https://github.com/pytorch/examples/blob/master/vae/main.py)
    def __init__(self):
        super(CA_NET, self).__init__()
        self.t_dim = cfg.TEXT.EMBEDDING_DIM
        self.c_dim = cfg.GAN.CONDITION_DIM
        self.fc = nn.Linear(self.t_dim, self.c_dim * 4, bias=True)
        self.relu = GLU()

    def encode(self, text_embedding):
        x = self.relu(self.fc(text_embedding))
        mu = x[:, :self.c_dim]
        logvar = x[:, self.c_dim:]
        return mu, logvar

    def reparametrize(self, mu, logvar):
        # eps ~ N(0, I) is re-drawn on every forward pass, so c_code differs
        # between runs even when text_embedding is identical
        std = logvar.mul(0.5).exp_()
        if cfg.CUDA:
            eps = torch.cuda.FloatTensor(std.size()).normal_()
        else:
            eps = torch.FloatTensor(std.size()).normal_()
        eps = Variable(eps)
        return eps.mul(std).add_(mu)

    def forward(self, text_embedding):
        mu, logvar = self.encode(text_embedding)
        c_code = self.reparametrize(mu, logvar)
        return c_code, mu, logvar
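For anyone who wants repeatable generations, here is a minimal, self-contained sketch (not the AttnGAN code itself, just an illustration of the same reparameterization idea) showing why two calls with identical inputs differ, plus two possible workarounds that are assumptions on my part rather than anything the repo provides: reseeding PyTorch's RNG before each generation, or bypassing the sampling and using mu directly.

    # Illustration only: a standalone reparameterization like CA_NET.reparametrize
    import torch

    def reparametrize(mu, logvar):
        # sample c = mu + sigma * eps, with eps ~ N(0, I) re-drawn on every call
        std = logvar.mul(0.5).exp()
        eps = torch.randn_like(std)
        return eps.mul(std).add(mu)

    mu = torch.zeros(1, 4)
    logvar = torch.zeros(1, 4)  # sigma = 1

    # Two calls with identical inputs give different samples because eps is redrawn.
    print(reparametrize(mu, logvar))
    print(reparametrize(mu, logvar))

    # Workaround 1 (assumption): reseed the RNG before each generation.
    torch.manual_seed(0)
    a = reparametrize(mu, logvar)
    torch.manual_seed(0)
    b = reparametrize(mu, logvar)
    print(torch.equal(a, b))  # True

    # Workaround 2 (assumption): at inference time, skip the sampling and use mu as c_code.
    print(mu)  # deterministic, but drops the Conditioning Augmentation noise entirely

Note that using mu directly removes the noise the Conditioning Augmentation module is designed to add, so it changes the intended behaviour; reseeding keeps the sampling but makes it repeatable.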
