
What is the difference between dataset_aug_prob and aug_prob? #42

Closed
Dok11 opened this issue Jan 8, 2021 · 6 comments

Comments

@Dok11
Contributor

Dok11 commented Jan 8, 2021

No description provided.

@lucidrains
Owner

lucidrains commented Jan 8, 2021

So the dataset one augments the images non-differentiably at the start, while the other augments everything going into the discriminator, differentiably, generated or not
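
For concreteness, here is a minimal PyTorch sketch of that distinction. The names (`dataset_aug`, `diff_augment`) are placeholders of mine, not this repo's code; only the two probabilities are meant to map onto `dataset_aug_prob` and `aug_prob`:

```python
import torch
import torchvision.transforms as T

# dataset_aug_prob (assumed semantics): ordinary, non-differentiable
# transforms applied once, when images are loaded from disk.
dataset_aug = T.RandomApply(
    [T.RandomHorizontalFlip(p=1.0)],
    p=0.0,  # a probability of 0 means no dataset-level augmentation
)

def diff_augment(images, prob):
    # aug_prob (assumed semantics): a differentiable augmentation applied
    # inside the forward pass, so gradients flow back through it.
    # A random horizontal flip via torch.flip is one differentiable example.
    if torch.rand(()).item() < prob:
        images = torch.flip(images, dims=[3])  # flip along width (NCHW)
    return images
```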

@Dok11
Contributor Author

Dok11 commented Jan 8, 2021

Am I right that dataset_aug_prob applies to both the generator and the discriminator, while aug_prob applies to the discriminator only?
And why is dataset_aug_prob equal to 0 by default? Is that the recommended value?

@woctezuma

woctezuma commented Jan 8, 2021

The recommendation from the following paper is to use differentiable augmentations.

S. Zhao, Z. Liu, J. Lin, J.-Y. Zhu, and S. Han. Differentiable augmentation for data-efficient GAN training. CoRR, abs/2006.10738, 2020.

From the abstract:

The performance of generative adversarial networks (GANs) heavily deteriorates given a limited amount of training data. This is mainly because the discriminator is memorizing the exact training set. To combat it, we propose Differentiable Augmentation (DiffAugment), a simple method that improves the data efficiency of GANs by imposing various types of differentiable augmentations on both real and fake samples. Previous attempts to directly augment the training data manipulate the distribution of real images, yielding little benefit; DiffAugment enables us to adopt the differentiable augmentation for the generated samples, effectively stabilizes training, and leads to better convergence.

https://github.com/mit-han-lab/data-efficient-gans
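
The training pattern from that paper, sketched here assuming a standard non-saturating GAN loss (`D`, `G`, `z`, and `augment` are placeholders, not this repo's API): the same differentiable augmentation is applied to both the real batch and the generated batch before the discriminator sees either.

```python
import torch
import torch.nn.functional as F

def discriminator_loss(D, G, real, z, augment):
    # DiffAugment pattern: augment() is differentiable and applied to BOTH
    # real and fake samples; only then do they enter the discriminator.
    fake = G(z)
    logits_real = D(augment(real))
    logits_fake = D(augment(fake.detach()))  # no generator gradient on D's step
    return F.softplus(-logits_real).mean() + F.softplus(logits_fake).mean()

def generator_loss(D, G, z, augment):
    # Because augment() is differentiable, the generator still receives
    # gradients through the augmented fake samples.
    fake = G(z)
    return F.softplus(-D(augment(fake))).mean()
```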

@Dok11
Contributor Author

Dok11 commented Jan 9, 2021

If it's so helpful, why is the default value zero? https://github.com/lucidrains/lightweight-gan/blob/main/lightweight_gan/cli.py#L100

@lucidrains
Owner

lucidrains commented Jan 9, 2021

@Dok11 what @woctezuma is saying is that the new paper says the dataset augmentations are not helpful

We should be using the differentiable one (the one you have been working on) as much as possible
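
If I remember the README correctly, you enable it from the CLI with something like `lightweight_gan --data ./images --aug-prob 0.25`; treat the exact flag name as an assumption and check the current README.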

@Dok11
Contributor Author

Dok11 commented Jan 9, 2021

Thank you, now that's clear! =)

Dok11 closed this as completed Jan 9, 2021