
Regularization for BiGAN, KLpq, GAN and WGAN inferences #817

Merged
merged 1 commit into from Jan 8, 2018

Conversation

@siddharth-agrawal (Contributor) commented Jan 6, 2018

Adds regularization in the loss functions for BiGAN, KLpq, GAN and WGAN inferences. This makes further progress towards fixing #529.

@siddharth-agrawal (Contributor, Author):

@dustinvtran I'm not confident about the change I made in klpq.py, specifically that I subtract the regularization from the loss instead of adding it to it. I did this because the gradients are taken with respect to -loss.

@dustinvtran (Member):

What you did is correct for both variational and model parameters in klpq.
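The sign convention being confirmed here can be illustrated with a toy calculation (plain Python with made-up values, no TensorFlow): because the klpq optimizer minimizes -loss, a penalty subtracted from loss enters the minimized objective with a positive sign, which is what a regularizer should do.

```python
# Toy illustration of the sign discussion above (hypothetical values).
# The klpq optimizer minimizes -loss, so subtracting the regularization
# penalty from `loss` means it is *added* to the minimized objective.

loss = 2.5         # objective the inference maximizes
reg_penalty = 0.1  # sum of regularization losses

loss_with_reg = loss - reg_penalty  # subtract, as done in the PR
minimized = -loss_with_reg          # what the optimizer actually minimizes

# The penalty appears with a positive sign in the minimized objective,
# i.e. minimized == -loss + reg_penalty.
assert abs(minimized - (-loss + reg_penalty)) < 1e-12
```

Adding the penalty to loss instead would have rewarded large weights once the sign flip is applied, which is why the subtraction is correct here.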

@@ -90,10 +93,14 @@ def build_loss_and_gradients(self, var_list):
                   list(range(1, gradients.shape.ndims))))
          penalty = self.penalty * tf.reduce_mean(tf.square(slopes - 1.0))

    +     reg_terms_d = tf.losses.get_regularization_losses(scope="Disc")
    +     reg_terms_all = tf.losses.get_regularization_losses()
@dustinvtran (Member) commented on the diff:
Any reason this one and GANInference use all regularization losses, but BiGAN's uses scope="Gen"?

@siddharth-agrawal (Contributor, Author):

I went by how the variable lists are created in the respective files. For GAN and WGAN, scope="Gen" is not used when creating var_list, so I assumed it shouldn't be used here either.
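The scope distinction under discussion can be sketched without TensorFlow. Below, a plain-Python stand-in mimics how TF1's tf.losses.get_regularization_losses(scope=...) filters the collected penalties by variable-scope prefix; the variable names, penalty values, and the final set difference are all hypothetical, not the actual Edward code.

```python
# Toy stand-in for TF1's regularization-loss collection, illustrating
# the scope="Disc" vs. no-scope distinction. Names and values are
# hypothetical; tf.losses.get_regularization_losses is the real API
# being imitated.

REG_LOSSES = [
    ("Disc/dense/kernel", 0.01),  # discriminator weight penalty
    ("Gen/dense/kernel", 0.02),   # generator weight penalty
]

def get_regularization_losses(scope=None):
    """Return penalties whose variable name starts with `scope`
    (all penalties if scope is None)."""
    return [value for name, value in REG_LOSSES
            if scope is None or name.startswith(scope)]

reg_terms_d = get_regularization_losses(scope="Disc")  # discriminator only
reg_terms_all = get_regularization_losses()            # every penalty
# Everything not tied to the discriminator scope:
reg_terms = [r for r in reg_terms_all if r not in reg_terms_d]
```

Under this sketch, reg_terms_d collects only the "Disc"-scoped penalties for the discriminator loss, while the remainder goes to the generator side, which is why no explicit scope="Gen" is needed when the generator variables are not created under that scope.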

@dustinvtran (Member):

That makes sense. Merging now. All looks great.

@dustinvtran dustinvtran merged commit 071a4ba into blei-lab:master Jan 8, 2018
@siddharth-agrawal siddharth-agrawal deleted the variational_regularization_1 branch January 8, 2018 20:38