
Add the Jensen-Shannon Divergence #26

Merged
merged 4 commits into dit:master from Autoplectic/jsd on Oct 2, 2013

Conversation


Autoplectic commented Oct 1, 2013

Depends on #25

def JSD(dists, weights=None):
    """
    The Jensen-Shannon Divergence: H( sum(w_i*P_i) ) - sum(w_i*H(P_i)).

    Parameters
    ----------
    dists : [Distribution]
        The distributions, P_i, to take the Jensen-Shannon Divergence of.

    weights : [float], optional
        The weights, w_i, to give the distributions. If None, the
        distributions are weighted uniformly.

    Returns
    -------
    jsd : float
        The Jensen-Shannon Divergence.

    Raises
    ------
    CMPyException
        Raised if the mixture distribution cannot be formed.
    """
    if weights is None:
        # Use 1.0 rather than 1 so the uniform weights are not truncated
        # to zero by Python 2 integer division.
        weights = [1.0 / len(dists)] * len(dists)

    # Entropy of the mixture, minus the weighted average of the entropies.
    # mixture_distribution is introduced in #25.
    mixture = mixture_distribution(dists, weights)
    return mixture.entropy() - sum(w * d.entropy()
                                   for w, d in zip(weights, dists))
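As a sanity check, here is a minimal usage sketch (not part of the PR; the Distribution constructor signature and the .entropy() method are assumed from the code above, and the probabilities are illustrative):

    # Hypothetical usage; assumes Distribution objects with an .entropy()
    # method, as JSD above requires.
    P = Distribution(['H', 'T'], [0.5, 0.5])   # fair coin
    Q = Distribution(['H', 'T'], [0.9, 0.1])   # biased coin

    # Default uniform weights. For two distributions the JSD is symmetric,
    # nonnegative, and bounded above by 1 bit; identical inputs give 0.
    print(JSD([P, Q]))

    # Explicit, non-uniform weights.
    print(JSD([P, Q], weights=[0.3, 0.7]))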

Autoplectic added some commits Oct 1, 2013

Add the Jensen-Shannon Divergence
add the JSD, probably the best measure of distribution distinguishability.
Autoplectic commented Oct 1, 2013

I think this is good to go now.

chebee7i added a commit that referenced this pull request Oct 2, 2013

Merge pull request #26 from Autoplectic/jsd
Add the Jensen-Shannon Divergence

@chebee7i chebee7i merged commit 6d3b121 into dit:master Oct 2, 2013

1 check passed: The Travis CI build passed.