Mutual Information loss H #25

Closed
connorlee77 opened this issue Mar 18, 2020 · 2 comments

Comments

@connorlee77

Can you be a bit more transparent about how you calculate the mutual information loss, particularly H? How did you calculate the joint histogram?

@ekhahniii

Hello, I too am curious about the formulation for calculating the joint histogram. Maybe you could comment a bit on the motivation for this line:

```python
p_joint = th.mm(p_f, p_m.transpose(0, 1)).div(self._normalizer_2d)
```

@ChristophJud @RobinSandkuehler

@connorlee77
Author

@ekhahniii the histograms are calculated via kernel density estimation. If you write out the multivariate version with a Gaussian kernel, you'll notice you can decompose it into the product of two univariate kernels (ignoring any scaling terms). The matrix multiply operation combines this product as well as the summation over pixels.
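For reference, here's a minimal PyTorch sketch of that construction (illustrative only: `mi_loss`, `gaussian_kernel`, `num_bins`, and `sigma` are my own names and defaults, not airlab's, and the library's actual Parzen-window and normalization details differ):

```python
import torch

def gaussian_kernel(values, bin_centers, sigma):
    # Univariate Gaussian kernel response of each sample against each bin center.
    # values: (N,) flattened intensities; bin_centers: (B,). Returns a (B, N) matrix.
    diff = bin_centers.unsqueeze(1) - values.unsqueeze(0)  # (B, N)
    return torch.exp(-0.5 * (diff / sigma) ** 2)

def mi_loss(fixed, moving, num_bins=32, sigma=0.05, eps=1e-10):
    # Illustrative sketch of a KDE-based MI loss; not the airlab implementation.
    f = fixed.reshape(-1)
    m = moving.reshape(-1)
    bins = torch.linspace(0.0, 1.0, num_bins)  # assumes intensities in [0, 1]
    p_f = gaussian_kernel(f, bins, sigma)  # (B, N)
    p_m = gaussian_kernel(m, bins, sigma)  # (B, N)
    # The product of the two univariate kernels, summed over all N pixels,
    # is one matrix multiply: p_joint[i, j] = sum_n K(f_n - b_i) * K(m_n - b_j)
    p_joint = torch.mm(p_f, p_m.t())
    p_joint = p_joint / p_joint.sum()  # normalize to a probability mass
    p_fixed = p_joint.sum(dim=1)   # marginal over moving-image bins
    p_moving = p_joint.sum(dim=0)  # marginal over fixed-image bins
    # MI = sum_ij p(i, j) * log( p(i, j) / (p(i) * p(j)) )
    mi = (p_joint * (torch.log(p_joint + eps)
                     - torch.log(p_fixed.unsqueeze(1) + eps)
                     - torch.log(p_moving.unsqueeze(0) + eps))).sum()
    return -mi  # negative MI, so minimizing the loss maximizes MI
```

The key point is the matrix multiply: because the 2-D Gaussian kernel factors as `K(f_n - b_i) * K(m_n - b_j)`, entry `[i, j]` of the joint histogram is the dot product of row `i` of `p_f` with row `j` of `p_m`, which `th.mm(p_f, p_m.transpose(0, 1))` computes for all bin pairs at once.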
