Models' forward passes are incorrect #2
Comments
Hi,
My bad regarding the receptive field.
If you push the changes to the repo, I'd love to check them out.
Thanks
…On Fri, Mar 17, 2017 at 8:14 AM, mrzhu ***@***.***> wrote:
Hey, I checked the code carefully today.
The model construction seems to be fine.
Its structure is the same as the original Lua Torch code.
One small mistake is that I forgot a leaky_relu layer after h4 in netD.
--
Kind regards
Sagar M Waghmare
Nice. Thanks. Let me know if the results improve.
Hey, just realized that the model forward/sequence is wrong.
E.g., current code:
input is (nc) x 256 x 256
This should be:
input is (nc) x 256 x 256
Hence both the discriminator and generator models are incorrect.
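For reference, the layer sequence being discussed can be traced with a small shape-walk sketch. This is only an illustration, not the repo's actual code: the names netD, h0–h4, and nc come from this thread, while the kernel/stride/padding values (4, 2, 1) and the channel multipliers are assumptions based on the usual DCGAN-style convention in the original Lua Torch code. The point it shows is the one made above: a leaky_relu should follow every conv block, including h4.

```python
# Hypothetical sketch of the netD forward sequence discussed in this issue.
# Layer names (h0..h4, netD, nc) follow the thread; kernel=4, stride=2,
# pad=1 and the ndf channel multipliers are ASSUMED, not taken from the repo.

def conv_out(size, kernel=4, stride=2, pad=1):
    """Spatial output size of one strided conv layer."""
    return (size + 2 * pad - kernel) // stride + 1

def netD_shapes(nc=3, ndf=64, size=256):
    """Trace (name, channels, spatial size) through the discriminator.
    Every conv, including h4, is followed by leaky_relu -- the fix
    mentioned in the comments above."""
    shapes = [("input", nc, size)]
    channels = [ndf, ndf * 2, ndf * 4, ndf * 8, ndf * 8]  # h0..h4 (assumed)
    for i, ch in enumerate(channels):
        size = conv_out(size)
        shapes.append((f"h{i} + leaky_relu", ch, size))
    return shapes

for name, ch, s in netD_shapes():
    print(f"{name}: ({ch}) x {s} x {s}")
```

Run against a 256 x 256 input, this prints `input: (3) x 256 x 256` followed by each halving step down through h4, which makes it easy to eyeball whether a ported forward sequence matches the original.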