Add the Intrinsic Information #15

Closed
Autoplectic opened this Issue Sep 29, 2013 · 1 comment

Autoplectic commented Sep 29, 2013

A.k.a. the intrinsic conditional mutual information.

This requires some work, as it involves a minimization over Markov chains.
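For what it's worth, here is a minimal sketch of that minimization, assuming the joint distribution is given as a plain numpy array rather than a dit Distribution; the function names and the softmax parametrization of the channel p(zbar|z) are illustrative, not dit's API:

```python
# A sketch (not dit's API) of the intrinsic conditional mutual information
#   I(X;Y ↓ Z) = min over channels p(zbar|z) of I(X;Y | Zbar),
# where X, Y - Z - Zbar form a Markov chain and p(x,y,z) is a 3-d numpy array.
import numpy as np
from scipy.optimize import minimize


def conditional_mutual_information(pxyz):
    """I(X;Y|Z) in bits for a joint distribution given as an array p[x, y, z]."""
    pz = pxyz.sum(axis=(0, 1))      # p(z)
    pxz = pxyz.sum(axis=1)          # p(x, z)
    pyz = pxyz.sum(axis=0)          # p(y, z)
    with np.errstate(divide='ignore', invalid='ignore'):
        ratio = pxyz * pz[None, None, :] / (pxz[:, None, :] * pyz[None, :, :])
        terms = pxyz * np.log2(ratio)
    return np.nansum(terms)         # 0 log 0 terms become nan and are dropped


def intrinsic_mutual_information(pxyz, zbar_size=None, restarts=10, seed=0):
    """Minimize I(X;Y|Zbar) over channels p(zbar|z); |Zbar| = |Z| by default."""
    nz = pxyz.shape[2]
    nzbar = zbar_size or nz
    rng = np.random.default_rng(seed)

    def objective(flat):
        # Softmax each row so that p(zbar|z) is a valid conditional distribution.
        logits = flat.reshape(nz, nzbar)
        channel = np.exp(logits - logits.max(axis=1, keepdims=True))
        channel /= channel.sum(axis=1, keepdims=True)
        # p(x, y, zbar) = sum_z p(x, y, z) p(zbar|z)  -- the Markov chain step.
        pxyzbar = np.einsum('xyz,zw->xyw', pxyz, channel)
        return conditional_mutual_information(pxyzbar)

    # The objective is not convex in the channel, so take the best of a few restarts.
    return min(minimize(objective, rng.normal(size=nz * nzbar),
                        method='Nelder-Mead').fun
               for _ in range(restarts))
```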

Autoplectic self-assigned this Jun 11, 2016

Autoplectic commented Jun 11, 2016

So this generalizes in a few obvious ways: one version for each of the different multivariate mutual informations.

B(X1 : X2 : X3 : ... ↓ Zs) = min B(X1 : X2 : X3 : ... | Zs_bar) over all Zs_bar such that Xs - Zs - Zs_bar form a Markov chain. Ditto for the total correlation and the CAEKL mutual information.

The co-information is not so obvious: do we minimize I(...), producing the most negative value, or do we minimize |I(...)|, presumably finding the most independent thing? I would assume the latter, but it isn't completely obvious.

I'll get these coded soon.
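As a rough sketch of the generalization above, the same channel minimization can wrap any conditional measure, with a flag covering both readings of the co-information question; `intrinsicize`, `measure`, and `use_abs` are hypothetical names, not dit's interface:

```python
# A hedged sketch: wrap an arbitrary conditional measure (dual total correlation,
# total correlation, CAEKL, co-information, ...) in the minimization over channels
# p(zbar|z). The measure is a caller-supplied function of the joint array whose
# last axis is the conditioning variable; nothing here is dit's actual API.
import numpy as np
from scipy.optimize import minimize


def intrinsicize(measure, pxs_z, restarts=10, seed=0, use_abs=False):
    """min over channels p(zbar|z) of measure(p[x1, ..., xn, zbar]).

    With use_abs=True the objective is |measure(...)|, the 'most independent'
    reading suggested for the co-information.
    """
    nz = pxs_z.shape[-1]
    rng = np.random.default_rng(seed)

    def objective(flat):
        logits = flat.reshape(nz, nz)
        channel = np.exp(logits - logits.max(axis=1, keepdims=True))
        channel /= channel.sum(axis=1, keepdims=True)       # rows are p(zbar|z)
        # Replace Z by Zbar: p(xs, zbar) = sum_z p(xs, z) p(zbar|z).
        pxs_zbar = np.tensordot(pxs_z, channel, axes=([-1], [0]))
        value = measure(pxs_zbar)
        return abs(value) if use_abs else value

    return min(minimize(objective, rng.normal(size=nz * nz),
                        method='Nelder-Mead').fun
               for _ in range(restarts))
```

Passing the conditional mutual information from the earlier sketch recovers the intrinsic mutual information, and passing a conditional co-information with use_abs=True implements the |I(...)| reading.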
