
Bugs in Discriminator (batch dimension and Linear layer output dim) #3

Open
itakatz opened this issue Oct 30, 2023 · 0 comments

itakatz commented Oct 30, 2023

It seems to me that the implementation of the Discriminator class has two issues:

  1. The RNN should have the input parameter batch_first set to True, but it is not set (and defaults to False):
    self.rnn = nn.GRU(input_size=opt.hidden_dim, hidden_size=opt.hidden_dim, num_layers=opt.num_layer)
  2. The output dimension of the Linear layer should be 1, since this is a binary classification block (predicting real vs. fake), but it is set to the hidden dimension:
    self.fc = nn.Linear(opt.hidden_dim, opt.hidden_dim)

    This does not produce an error, since the loss function later averages its input, but it means the calculation is not what it should be.
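For illustration, here is a minimal sketch of a Discriminator with both fixes applied. This is an assumption about the surrounding code: `hidden_dim` and `num_layer` stand in for the repo's `opt.hidden_dim` / `opt.num_layer`, and the forward pass shown here is only a plausible shape for such a block, not the repo's actual implementation.

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    """Hypothetical corrected discriminator sketch."""

    def __init__(self, hidden_dim: int, num_layer: int):
        super().__init__()
        # Fix 1: batch_first=True so inputs are (batch, seq_len, features)
        # instead of the default (seq_len, batch, features).
        self.rnn = nn.GRU(
            input_size=hidden_dim,
            hidden_size=hidden_dim,
            num_layers=num_layer,
            batch_first=True,
        )
        # Fix 2: output dimension 1 for a binary real/fake prediction,
        # rather than nn.Linear(hidden_dim, hidden_dim).
        self.fc = nn.Linear(hidden_dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.rnn(x)            # (batch, seq_len, hidden_dim)
        logits = self.fc(out)           # (batch, seq_len, 1)
        return torch.sigmoid(logits)
```

With this output shape, averaging the loss over the last dimension no longer silently hides the incorrect per-feature outputs, since each timestep yields a single fake/real probability.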