
About the consistency loss #2

Closed
yakexee opened this issue Aug 18, 2021 · 2 comments

yakexee commented Aug 18, 2021

Hi, thanks for sharing your code. For the function F.kl_div(), the first parameter is the input and the second is the target. I am confused about why the target is not p_mixture on L401-L403.
Thanks.

wildphoton (Owner) commented Aug 23, 2021

Hi @yakexee, I think F.kl_div() defines its input and target the same way NLL loss does: since NLL(input, target) corresponds to cross_entropy(target, input), I believe F.kl_div(input, target) = KL(target || input). You can check this with a simple example by comparing the result against a KL divergence function you implement yourself. Since we are computing KL(p_aug || p_mixture), we call F.kl_div(p_mixture, p_aug). FYI, this loss is adapted from AugMix's implementation. I hope it helps.
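
Here is a minimal sketch (not from this repository) of the check suggested above. The distributions are made up, and it assumes the first argument is already in log-space, the way AugMix's JSD loss passes it:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Made-up example distributions over 10 classes for a batch of 4.
p_aug = torch.softmax(torch.randn(4, 10), dim=1)
# The first argument of F.kl_div is expected in log-space (as in AugMix's JSD loss).
p_mixture = torch.softmax(torch.randn(4, 10), dim=1).clamp(1e-7, 1).log()

# F.kl_div(input, target) treats `input` as the log-probabilities of the prediction
# and `target` as the reference distribution, i.e. it computes KL(target || exp(input)).
loss_builtin = F.kl_div(p_mixture, p_aug, reduction='batchmean')

# Manual KL(p_aug || p_mixture): sum_k p_aug * (log p_aug - log p_mixture), averaged over the batch.
loss_manual = (p_aug * (p_aug.log() - p_mixture)).sum(dim=1).mean()

print(torch.allclose(loss_builtin, loss_manual))  # True
```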

yakexee (Author) commented Aug 24, 2021

Thanks for the kind reply. That helps!

yakexee closed this as completed Aug 24, 2021