Putting GaelVaroquaux's mutual_info gist in a project until it has a better home.

Mutual Information

README

Estimating differential entropy and mutual information for continuous random variables.

Perhaps see:

⚠️ A reminder about Differential Entropy

  • It is not invariant under monotonic changes of variables applied to the individual random variables, and is therefore most useful with dimensionless variables. The analogous invariance for discrete entropy holds under bijective (relabelling) transformations of the individual random variables.
  • It can be negative.
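Both caveats can be checked numerically. The sketch below (an illustration, not part of this package) Monte Carlo estimates the differential entropy of Y = X² for X ~ Uniform(0, 1) via -E[log f(Y)]: the result is negative and differs from H(X) = 0, even though X ↦ X² is monotonic on (0, 1).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200_000)

# X ~ Uniform(0, 1) has density f(x) = 1, so H(X) = 0.
# Y = X**2 has density f(y) = 1 / (2 * sqrt(y)) on (0, 1).
y = x**2
# Monte Carlo estimate of H(Y) = -E[log f(Y)].
h_y = -np.mean(np.log(1.0 / (2.0 * np.sqrt(y))))
print(h_y)  # close to the analytic value log(2) - 1 ≈ -0.307
```

So a monotonic change of variables moved the entropy from 0 to about -0.31: not invariant, and negative.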

See also the limiting density of discrete points for why the original definition of differential entropy is not even dimensionally correct.

Mutual Information

...

Install

python setup.py install

or

pip install mutual-info

Development

See Makefile for example ops.

See https://pypi.org/project/mutual-info

Do not pin packages for now. Let's surf latest and find out when things break.

Develop install

python setup.py develop

Tests

make test

TODO

  • apply a rank transform (or similar) to the data so that estimates are invariant to monotonic transforms of the data
  • implement equations 3 and 9 from the 2008 NIPS paper
  • tests
  • clear documentation and reminders about mutual information and the pitfalls of continuous r.v.s
  • compare to sklearn's _mutual_info.py as per #2
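The rank transform mentioned in the TODO list could look like the sketch below (a hypothetical helper, not part of this package, assuming scipy >= 1.4 for `rankdata(..., axis=...)`). Since a strictly monotonic transform of a column preserves the ordering of its samples, the ranks, and hence any estimate computed from them, are unchanged.

```python
import numpy as np
from scipy.stats import rankdata

def rank_transform(x):
    """Map each column of x to uniform ranks in (0, 1).

    Invariant to strictly monotonic transforms of each column,
    since such transforms preserve the ordering of the samples.
    """
    x = np.asarray(x, dtype=float)
    ranks = rankdata(x, axis=0)  # rank each column independently
    return ranks / (x.shape[0] + 1)

x = np.random.default_rng(0).normal(size=(100, 2))
# exp is strictly increasing, so the ranks are identical:
np.testing.assert_allclose(rank_transform(x), rank_transform(np.exp(x)))
```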

Origins

Originally adapted by G Varoquaux in a gist, from code created by R Brette, itself drawing on several papers (see references in the code). These computations rely on nearest-neighbor (radial density) statistics.
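To illustrate the nearest-neighbor idea, here is a minimal sketch of a Kozachenko-Leonenko-style entropy estimator (an illustration of the technique, not this package's exact implementation): the distance from each point to its k-th nearest neighbor acts as a local radial density estimate.

```python
import math

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko differential entropy estimate (in nats).

    Uses the distance from each sample to its k-th nearest neighbor
    as a local (radial) density estimate.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors: the nearest neighbor of a point is itself.
    r = tree.query(x, k=k + 1)[0][:, -1]
    # Volume of the unit d-ball.
    vol = np.pi ** (d / 2) / math.gamma(d / 2 + 1)
    return digamma(n) - digamma(k) + np.log(vol) + d * np.mean(np.log(r))

rng = np.random.default_rng(0)
x = rng.normal(size=(5000, 1))
print(knn_entropy(x))  # analytic value for N(0, 1): 0.5 * log(2*pi*e) ≈ 1.419
```

Note that no bandwidth has to be chosen; k plays that role, trading variance (small k) against bias (large k).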
