Something seems wrong #2

Open
BaochangZhang opened this issue Nov 25, 2021 · 1 comment

Comments

@BaochangZhang

In your model, Generator_B is defined as:
self.netG_B = networks.define_G(opt.output_nc, opt.input_nc, opt.ngf, opt.netG, opt.norm,
                                not opt.no_dropout, opt.init_type, opt.init_gain, self.gpu_ids, thres=True)
so the output range of self.netG_B should be [0, 1].
I find that self.real_A, self.real_C, and self.real_B have been normalized to [-1, 1].
In the forward part:
self.fake_B = self.netG_A(self.real_A)*(self.real_A+1) + (1-self.real_A)*(self.real_C+1)/2 - 1  # G_A(A) tanh(-1,1)  # self.real_A*self.real_C
self.rec_A = self.netG_B(self.fake_B)   # G_B(G_A(A)) --> [0, 1]
self.fake_A = self.netG_B(self.real_B)  # G_B(B) --> [0, 1]
self.rec_B = self.netG_A(self.fake_A)*(self.fake_A+1) + (1-self.fake_A)*(self.real_B+1)/2 - 1  # G_A(G_B(B))  # self.real_B
The last line seems wrong: the value ranges of self.real_A and self.fake_A are different, yet the same blending formula is applied to both.
I cannot understand this. Looking forward to your reply.
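
To make the mismatch concrete, here is a minimal numeric sketch (random tensors stand in for the network outputs; shapes and the blend helper are hypothetical, not the repo's code) showing that the blending weights span [0, 2] when the conditioning image is in [-1, 1], but only [1, 2] and [0, 1] when it is in [0, 1]:

```python
# Minimal sketch of the range mismatch; random tensors stand in for network outputs.
import torch

def blend(g_out, cond, other):
    # Blending used in forward(): g_out*(cond+1) + (1-cond)*(other+1)/2 - 1
    return g_out * (cond + 1) + (1 - cond) * (other + 1) / 2 - 1

g_out  = torch.rand(1, 1, 4, 4) * 2 - 1  # stand-in for netG_A output (tanh range per the comment above)
real_A = torch.rand(1, 1, 4, 4) * 2 - 1  # normalized to [-1, 1]
real_C = torch.rand(1, 1, 4, 4) * 2 - 1
real_B = torch.rand(1, 1, 4, 4) * 2 - 1
fake_A = torch.rand(1, 1, 4, 4)          # netG_B output with thres=True, i.e. [0, 1]

# With real_A in [-1, 1]: weights (real_A+1) and (1-real_A) both span [0, 2].
fake_B = blend(g_out, real_A, real_C)
# With fake_A in [0, 1]: (fake_A+1) spans [1, 2] and (1-fake_A) spans [0, 1].
rec_B = blend(g_out, fake_A, real_B)

print("fake_B range:", fake_B.min().item(), fake_B.max().item())
print("rec_B range: ", rec_B.min().item(), rec_B.max().item())
```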

@Sam291998

@BaochangZhang Hello sir, did you get any updates regarding this issue?
