In probability theory and information theory, the mutual information of two random variables is a quantity that measures the mutual dependence of the two random variables. This script computes mutual information (MI) over discrete random variables.

rmaestre/Mutual-Information


Roberto Maestre - rmaestre@gmail.com
Bojan Mihaljevic - boki.mihaljevic@gmail.com

https://controls.engin.umich.edu/wiki/index.php/Correlation_and_Mutual_Information

Mutual information (also referred to as transinformation) is a quantitative measure of how much one random variable (Y) tells us about another random variable (X). Here, information is understood as a reduction in the uncertainty of a variable: the greater the mutual information between X and Y, the less uncertainty remains in X once Y is known, or in Y once X is known. In practice, running a process requires selecting several parameters, and the relationships between variables are integral to correctly determining working values for the system. For example, adjusting the temperature in a reactor often causes the pressure to change as well. Mutual information is most commonly measured in logarithms of base 2 (bits), but it is also found in base e (nats) and base 10 (bans).
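
For discrete samples, MI can be computed from marginal and joint entropies via the identity I(X; Y) = H(X) + H(Y) - H(X, Y). The following is a minimal, self-contained Python sketch of that identity (it is not the repository's it_tool.py itself); the sample output below is consistent with base-10 logarithms (bans), so the sketch defaults to base 10.

from collections import Counter
from math import log

def entropy(*rows, base=10):
    # Shannon entropy of one variable, or the joint entropy of several.
    # Each row is a sequence of observations of one discrete variable.
    outcomes = list(zip(*rows))
    n = len(outcomes)
    return -sum((c / n) * log(c / n, base) for c in Counter(outcomes).values())

def mutual_information(x, y, base=10):
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(x, base=base) + entropy(y, base=base) - entropy(x, y, base=base)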


data (each row is one discrete variable, X_0 through X_4):
[ (0, 0, 1, 1, 0, 1, 1, 2, 2, 2),
  (3, 4, 5, 5, 3, 2, 2, 6, 6, 1),
  (7, 2, 1, 3, 2, 8, 9, 1, 2, 0),
  (7, 7, 7, 7, 7, 7, 7, 7, 7, 7),
  (0, 1, 2, 3, 4, 5, 6, 7, 1, 1) ]


./it_tool.py

Entropy(X_1): 0.759176
Elapsed time: 0.000941

Entropy(X_3): 0.000000
Elapsed time: 0.000046

Entropy(X_4): 0.856864
Elapsed time: 0.000247

Entropy(X_0, X_1): 0.759176
Elapsed time: 0.000639

Entropy(X_3, X_3): 0.000000
Elapsed time: 0.000082

MI(X_0, X_1): 0.472903
Elapsed time: 0.001174

MI(X_1, X_2): 0.555834
Elapsed time: 0.002696
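
Applying the sketch above to the data rows reproduces these figures (up to floating-point rounding):

x0 = (0, 0, 1, 1, 0, 1, 1, 2, 2, 2)
x1 = (3, 4, 5, 5, 3, 2, 2, 6, 6, 1)
x2 = (7, 2, 1, 3, 2, 8, 9, 1, 2, 0)

print(entropy(x1))                 # 0.759176
print(mutual_information(x0, x1))  # 0.472903
print(mutual_information(x1, x2))  # 0.555834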

