Add the Exact Common Information #11

Closed
Autoplectic opened this Issue Sep 29, 2013 · 2 comments

Autoplectic commented Sep 29, 2013

To the best of my knowledge this is not in the literature anywhere. It is equivalent to the generative complexity in computational mechanics:

$\min \{\, H[Z] : B[\{X_i\} \mid Z] = 0 \,\}$

This is similar to the Wyner common information, but the objective is the entropy of the auxiliary random variable rather than its mutual information with the variables.
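As a concrete check of the definition, here is a minimal sketch (my own construction, not from this thread): for $X = (A, B)$, $Y = (A, C)$ with $A, B, C$ independent fair bits, the candidate $Z = A$ renders $X$ and $Y$ conditionally independent while costing $H[Z] = 1$ bit, which is the exact common information for this distribution.

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a pmf given as an array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint pmf p(x, y, z): x = (a, b), y = (a, c), z = a, with
# a, b, c independent fair bits. Encode x as 2*a + b, y as 2*a + c.
p = np.zeros((4, 4, 2))
for a, b, c in product([0, 1], repeat=3):
    p[2 * a + b, 2 * a + c, a] = 1 / 8

# Cost of the candidate common variable: H[Z].
pz = p.sum(axis=(0, 1))
Hz = entropy(pz)

# Constraint check: I[X:Y|Z] = sum_z p(z) * KL( p(x,y|z) || p(x|z) p(y|z) ).
Ixy_z = 0.0
for z in range(2):
    pz_v = p[:, :, z].sum()
    pxy = p[:, :, z] / pz_v          # p(x, y | z)
    px = pxy.sum(axis=1)             # p(x | z)
    py = pxy.sum(axis=0)             # p(y | z)
    for i, j in product(range(4), range(4)):
        if pxy[i, j] > 0:
            Ixy_z += pz_v * pxy[i, j] * np.log2(pxy[i, j] / (px[i] * py[j]))

print(Hz)     # → 1.0 : the candidate Z costs one bit
print(Ixy_z)  # → 0.0 : X and Y are conditionally independent given Z
```

Any Z satisfying the constraint must carry at least the shared bit here, so no candidate can do better than $H[Z] = 1$.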

Autoplectic commented Aug 5, 2014

Some hints on how to compute this (at least in the two-variable case) can be found here:
http://arxiv.org/abs/1402.0062

@Autoplectic changed the title from "Add the Minimal Markov Chain Information" to "Add the Exact Common Information" on Feb 23, 2015

Autoplectic commented Feb 23, 2015

Note that this is, in general, a non-convex optimization, and so is especially difficult.
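To illustrate why non-convexity matters, here is a small random-restart sketch (my own construction, not from the thread or the linked paper). It relaxes the constrained problem to minimizing the penalized objective $H[Z] + \beta \, I[X:Y|Z]$ over the channel $p(z|x,y)$, for two perfectly correlated fair bits; different starting points can settle into different local optima, so one keeps the best value found.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# p(x, y): two perfectly correlated fair bits (hypothetical example);
# the exact common information here is 1 bit.
pxy = np.array([[0.5, 0.0],
                [0.0, 0.5]])

EPS = 1e-12

def objective(q, beta=10.0):
    """Penalized relaxation H[Z] + beta * I[X:Y|Z] for binary Z.

    q[i] = Pr(Z = 1 | x, y) for the four (x, y) cells, flattened row-major.
    """
    qz1 = q.reshape(2, 2)
    # Joint p(x, y, z) induced by the channel.
    p = np.stack([pxy * (1 - qz1), pxy * qz1], axis=-1)
    pz = p.sum(axis=(0, 1))
    Hz = -np.sum(pz * np.log2(pz + EPS))
    cmi = 0.0
    for z in (0, 1):
        pz_v = p[:, :, z].sum()
        if pz_v < EPS:
            continue
        pxy_z = p[:, :, z] / pz_v
        px = pxy_z.sum(axis=1, keepdims=True)
        py = pxy_z.sum(axis=0, keepdims=True)
        ratio = pxy_z / (px * py + EPS)
        cmi += pz_v * np.sum(pxy_z * np.log2(ratio + EPS))
    return Hz + beta * cmi

# Random restarts: a non-convex landscape can trap individual runs.
starts = [rng.uniform(0, 1, 4) for _ in range(20)]
starts.append(np.array([0.0, 0.0, 1.0, 1.0]))  # Z = X, the known optimum
results = [minimize(objective, s, bounds=[(0, 1)] * 4).fun for s in starts]
print(sorted(set(np.round(results, 3))))  # distinct local-optimum values
```

The penalized form only approximates the hard constraint $B = 0$; the point is that, unlike a convex problem, no single local search is guaranteed to reach the global minimum here.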
