Add an Example: pix2pix #186

Closed
wants to merge 2 commits

Conversation

@us (Contributor) commented Apr 8, 2020

TODO:

  • Input pipeline
  • Generator and Discriminator Model
  • Calculate gradients
  • Jupyter notebook

I am using the TF pix2pix tutorial as a guide, and I am stuck on calculating the gradients in the train_step function. Can you help me?
Thank you!
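
For reference, the generic JAX pattern for computing gradients inside a train step, which the snippets later in this thread follow: `has_aux=True` lets the loss function return extra values alongside the scalar loss, matching the `(loss, aux), grad` unpacking. The toy model, parameters, and data below are purely illustrative, not code from this PR:

```python
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
  # Toy linear model standing in for a real network.
  pred = params['w'] * x + params['b']
  loss = jnp.mean((pred - y) ** 2)
  # Extra values returned alongside the scalar loss via has_aux=True.
  extras = {'abs_err': jnp.mean(jnp.abs(pred - y))}
  return loss, extras

params = {'w': jnp.array(1.0), 'b': jnp.array(0.0)}
x = jnp.arange(4.0)
y = 2.0 * x + 1.0

# value_and_grad returns ((loss, extras), grad) when has_aux=True.
(loss, extras), grad = jax.value_and_grad(loss_fn, has_aux=True)(params, x, y)
```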

@AlexeyG (Collaborator) commented Apr 9, 2020

Hi @us

I am not familiar with the pix2pix model, so I'm not sure I can be of much help here, but I can try. Can you be more specific about what exactly you're stuck on?

Are you envisioning this as an example that would become a part of the canonical Flax examples? If so, please check our guidelines regarding adding new examples in examples/README.md.

@us mentioned this pull request Apr 9, 2020
@us (Contributor, Author) commented Apr 9, 2020

Thank you! I read that and opened issue #193.

I need to take the gradient of the loss function, but I'm not sure I'm doing it the right way; I ended up duplicating code. The pix2pix loss function needs both models, so is taking the gradient of loss_fn like this correct? I am confused about how to get gradients for the two models. Please see the snippet below for the loss function and gradient computation of pix2pix.

```python
(gen_total_loss, disc_loss), disc_grad = disc_grad_fn(generator_opt.target,
                                                      discriminator_opt.target)

new_gen_opt = generator_opt.apply_gradient(gen_grad)
```
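
As an aside on getting gradients for two models from a single function: `jax.value_and_grad` accepts a tuple of `argnums` and returns one gradient per selected argument. A minimal sketch with toy stand-ins for the generator and discriminator (none of these names come from the PR):

```python
import jax
import jax.numpy as jnp

# argnums=(0, 1) differentiates w.r.t. both parameter sets at once,
# returning a tuple (gen_grad, disc_grad).
def toy_joint_loss(gen_params, disc_params, x):
  fake = gen_params['w'] * x        # stand-in generator
  score = disc_params['w'] * fake   # stand-in discriminator
  return jnp.mean((score - 1.0) ** 2)

gen_params = {'w': jnp.array(0.5)}
disc_params = {'w': jnp.array(2.0)}
x = jnp.arange(4.0)

loss, (gen_grad, disc_grad) = jax.value_and_grad(
    toy_joint_loss, argnums=(0, 1))(gen_params, disc_params, x)
```

In a GAN setting the two models usually minimize different losses, though, so the per-model structure sketched further below is the more common arrangement.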
Contributor commented on the diff:

This won't work as-is. You should have a single optimizer (because the optimizer keeps track of the model parameters). As you have it here, you're really not even optimizing the same model with these two calls to apply_gradient.
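
For concreteness, a minimal sketch of a train step arranged along these lines, keeping one optimizer per model so that each `apply_gradient` call receives gradients computed for the parameters that optimizer actually tracks. It uses the `flax.optim`-style `Optimizer` API seen in this thread; `generator_loss` and `discriminator_loss` are hypothetical helpers standing in for the pix2pix losses, not code from this PR:

```python
import jax

def train_step(generator_opt, discriminator_opt, input_image, target):
  # One optimizer per model: each Optimizer holds its own parameters
  # in .target and is updated only with gradients taken w.r.t. them.

  def gen_loss_fn(gen_params):
    # `generator_loss` is a hypothetical helper: applies the generator and
    # scores its output using the current (fixed) discriminator parameters,
    # which are closed over as constants here.
    return generator_loss(gen_params, discriminator_opt.target,
                          input_image, target)

  def disc_loss_fn(disc_params):
    # `discriminator_loss` is likewise hypothetical; the generator's
    # current parameters are treated as constants.
    return discriminator_loss(generator_opt.target, disc_params,
                              input_image, target)

  gen_loss, gen_grad = jax.value_and_grad(gen_loss_fn)(generator_opt.target)
  disc_loss, disc_grad = jax.value_and_grad(disc_loss_fn)(discriminator_opt.target)

  # Each gradient goes to the optimizer whose parameters it was computed for.
  new_generator_opt = generator_opt.apply_gradient(gen_grad)
  new_discriminator_opt = discriminator_opt.apply_gradient(disc_grad)
  return new_generator_opt, new_discriminator_opt, gen_loss, disc_loss
```

This avoids the mismatch in the snippet above: `gen_grad` is produced by differentiating with respect to `generator_opt.target`, so `generator_opt.apply_gradient(gen_grad)` updates exactly the parameters the gradient was computed for.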

@avital (Contributor) commented May 12, 2020

Given no response in the past 26 days, I'll close this now. If you get a fully working version of pix2pix, we'd be happy to link to it from https://github.com/google/flax/blob/master/examples/README.md, per our new policy of preferring to link to external repos rather than maintaining examples as part of Flax core.

@avital closed this May 12, 2020