Estimating differential entropy and mutual information. See also the limiting density of discrete points for why the original definition of differential entropy is not even dimensionally correct.
Originally adapted by G Varoquaux in a gist from code created by R Brette, itself based on several papers (see the references in the code). These computations rely on nearest-neighbor (radial density) statistics; a minimal sketch of the core estimator follows the reference list below.
- Kozachenko, L. F. & Leonenko, N. N. (1987). Sample estimate of the entropy of a random vector. Probl. Inf. Transm. 23, 95-101. See in particular eq. (20).
- Evans, D. (2008). A computationally efficient estimator for mutual information. Proc. R. Soc. A 464(2093), 1203-1215.
- Kraskov, A., Stögbauer, H. & Grassberger, P. (2004). Estimating mutual information. Phys. Rev. E 69(6 Pt 2), 066138.
- Pérez-Cruz, F. (2008). Estimation of information theoretic measures for continuous random variables. Advances in Neural Information Processing Systems 21 (NIPS), Vancouver, Canada.
- Lombardi, D. & Pant, S. (2016). A non-parametric k-nearest neighbor entropy estimator.
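All of these build on the same idea: the distance from a sample to its k-th nearest neighbor acts as an inverse density estimate. Below is a minimal sketch of the Kozachenko-Leonenko estimator (eq. (20) of the 1987 paper). It is for illustration only, not the package's own implementation; `knn_entropy` is an illustrative name.

```python
# Minimal sketch of the Kozachenko-Leonenko k-NN entropy estimator
# (illustration only; this is not the package's implementation).
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    """Differential entropy (in nats) of samples x, shape (n_samples, n_dims)."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor; k+1 because
    # the query returns each point itself at distance 0.
    eps = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # Log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1).
    log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))
```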
TODO: implement equations (3) and (9) from the 2008 NIPS paper (Pérez-Cruz); a naive entropy-sum baseline is sketched below.
TODO: mention dimensionality problems (k-NN estimates degrade as the dimension grows).
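Independent of those specific equations, the simplest route from entropy to mutual information is the identity I(X; Y) = H(X) + H(Y) - H(X, Y). A hedged sketch using the `knn_entropy` function above (`knn_mutual_info` is an illustrative name; in practice the Kraskov et al. (2004) estimator is preferred because the biases of its k-NN terms partially cancel):

```python
import numpy as np

def knn_mutual_info(x, y, k=3):
    """Naive k-NN mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y).

    Illustration only: the three entropy estimates carry biases that this
    plain sum does not cancel. x and y have shape (n_samples, n_dims);
    knn_entropy is the sketch defined above.
    """
    xy = np.column_stack([x, y])
    return knn_entropy(x, k) + knn_entropy(y, k) - knn_entropy(xy, k)
```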
...
python setup.py install
or
pip install mutual-info
See the Makefile for example operations.
See https://pypi.org/project/mutual-info
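A hypothetical usage sketch follows. The function names mirror the original Varoquaux gist (`entropy`, `mutual_information`); the import path and signatures are assumptions to verify against the installed package.

```python
# Hypothetical usage sketch -- verify names and import path against the
# installed package; these follow the original Varoquaux gist.
import numpy as np
from mutual_info.mutual_info import entropy, mutual_information  # assumed path

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 1))
y = x + 0.1 * rng.normal(size=(1000, 1))

print(entropy(x, k=3))                   # differential entropy of x
print(mutual_information((x, y), k=3))   # mutual information between x and y
```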
Do not pin dependencies for now. Let's surf the latest versions and find out when things break.
python setup.py develop
make test
- incorporate fixes from @thismartian (see the thismartian branch)
- entropy tests/properties (see the sketch after this list):
  - monotonically decreasing in p?
- mutual information tests/properties:
  - triangle inequality?
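As a starting point for these test TODOs, here is a hedged sketch of one property test (pytest-style, assumed; `knn_entropy` is the illustrative estimator sketched earlier, and the tolerance is a guess): compare the estimate on Gaussian samples against the closed-form entropy 0.5 * log(2*pi*e*sigma^2).

```python
# Sketch of a property test (pytest-style): a k-NN entropy estimate on
# Gaussian samples should approach the analytic value.
import numpy as np

def test_entropy_matches_gaussian_closed_form():
    rng = np.random.default_rng(42)
    sigma = 2.0
    x = rng.normal(scale=sigma, size=(50_000, 1))
    analytic = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    estimate = knn_entropy(x, k=3)  # illustrative estimator sketched above
    assert abs(estimate - analytic) < 0.05  # tolerance is a guess
```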