Conditional Entropy #9

Closed
dglmoore opened this issue Aug 1, 2016 · 0 comments

dglmoore commented Aug 1, 2016

Inform added conditional entropy in the v0.0.5 release. The next release of PyInform should include a wrapper for conditional entropy.

Proposed API

def conditional_entropy(xs, ys, bx=0, by=0, base=2.0, local=False):
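For reference, the conditional entropy of ys given xs is H(Y|X) = H(X,Y) − H(X); judging from the expected values below, the first argument is the conditioning variable. Presumably, following the conventions of the other PyInform wrappers, bx and by would be the alphabet sizes of xs and ys (inferred from the data when 0), base the logarithm base, and local=True would return the pointwise values −log p(y_i | x_i) rather than the average.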

Example Usage

from pyinform.conditionalentropy import conditional_entropy

xs = [0,0,1,1,1,1,0,0,0]
ys = [1,0,0,1,0,0,1,0,0]

conditional_entropy(xs, ys) # == 0.899985
conditional_entropy(ys, xs) # == 0.972765

conditional_entropy(xs, ys, local=True)
# == [1.322, 0.737, 0.415, 2.000, 0.415, 0.415, 1.322, 0.737, 0.737]
conditional_entropy(ys, xs, local=True)
# == [0.585, 1.000, 1.000, 1.585, 1.000, 1.000, 0.585, 1.000, 1.000]
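As a sanity check, the expected values above can be reproduced with a short pure-Python sketch (independent of the eventual wrapper) that estimates the distributions empirically and uses base-2 logarithms:

from collections import Counter
from math import log2

def entropy(events):
    # Empirical Shannon entropy (base 2) of a sequence of events
    counts = Counter(events)
    n = len(events)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def cond_entropy(xs, ys):
    # H(Y|X) = H(X,Y) - H(X), with the first argument as the condition
    return entropy(list(zip(xs, ys))) - entropy(list(xs))

def local_cond_entropy(xs, ys):
    # Pointwise values -log2 p(y_i | x_i)
    joint, marginal = Counter(zip(xs, ys)), Counter(xs)
    return [-log2(joint[x, y] / marginal[x]) for x, y in zip(xs, ys)]

xs = [0, 0, 1, 1, 1, 1, 0, 0, 0]
ys = [1, 0, 0, 1, 0, 0, 1, 0, 0]

print(cond_entropy(xs, ys))        # ~0.899985
print(cond_entropy(ys, xs))        # ~0.972765
print(local_cond_entropy(xs, ys))  # ~[1.322, 0.737, 0.415, 2.000, ...]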