Jensen–Shannon divergence #20

Closed
LamyaMohaned opened this issue Jun 14, 2021 · 2 comments

LamyaMohaned commented Jun 14, 2021

Hello,

I'm trying to understand Jensen–Shannon divergence. I don't yet understand the math behind it, but someone asked me to investigate it and AugMix because of this paragraph:

Alternatively, we can view each set as an empirical distribution and measure the distance between
them using Kullback-Leibler (KL) or Jensen-Shannon (JS) divergence. The challenge for learning
with KL or JS divergence is that no useful gradient is provided when the two empirical distributions
have disjoint supports or have a non-empty intersection contained in a set of measure zero.

from here: https://arxiv.org/pdf/1907.10764.pdf
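
For concreteness, the failure mode in the quoted paragraph can be sketched with NumPy/SciPy (a minimal illustration with made-up values, not taken from the paper): when two empirical distributions have disjoint supports, KL(p‖q) is infinite, and the JS divergence sits at its maximum of log 2, so small perturbations of either distribution provide no useful gradient.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

# Two "empirical" distributions over 4 bins with disjoint supports.
p = np.array([0.5, 0.5, 0.0, 0.0])
q = np.array([0.0, 0.0, 0.5, 0.5])

# KL(p || q) blows up because q assigns zero mass where p is positive.
with np.errstate(divide="ignore", invalid="ignore"):
    kl = np.sum(np.where(p > 0, p * np.log(p / q), 0.0))
print(kl)  # inf

# SciPy returns the JS *distance* (square root of the divergence);
# with disjoint supports the divergence saturates at log(2) ~= 0.693,
# no matter how the two supports are arranged.
js = jensenshannon(p, q, base=np.e) ** 2
print(js, np.log(2))  # ~0.6931 ~0.6931
```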

Is this problem present in AugMix?

hendrycks (Contributor) commented Jun 14, 2021

This is not a problem with AugMix, since the distributions being compared share the same support, and the probabilities are greater than zero for all elements of that support.
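
For reference, the Jensen-Shannon consistency term in AugMix is computed between softmax outputs for a clean image and its augmented views. A minimal PyTorch sketch (the function and tensor names here are illustrative, assuming logits of shape [batch, num_classes]): because softmax assigns strictly positive probability to every class, the three distributions always share full support, so the divergence and its gradient remain finite.

```python
import torch
import torch.nn.functional as F

def jsd_consistency(logits_clean, logits_aug1, logits_aug2):
    """Jensen-Shannon consistency between three softmax distributions."""
    p_clean = F.softmax(logits_clean, dim=1)
    p_aug1 = F.softmax(logits_aug1, dim=1)
    p_aug2 = F.softmax(logits_aug2, dim=1)

    # Mixture distribution M = (P_clean + P_aug1 + P_aug2) / 3.
    # The clamp only guards log(0) against floating-point rounding; softmax
    # itself never produces exact zeros, so the supports always overlap.
    log_m = torch.clamp((p_clean + p_aug1 + p_aug2) / 3.0, 1e-7, 1.0).log()

    # JS divergence = average KL of each distribution to the mixture.
    return (F.kl_div(log_m, p_clean, reduction="batchmean")
            + F.kl_div(log_m, p_aug1, reduction="batchmean")
            + F.kl_div(log_m, p_aug2, reduction="batchmean")) / 3.0
```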

LamyaMohaned (Author) commented

Thank you!

normster closed this as completed Sep 6, 2021