Source code for the EnDive estimator of divergence functionals, developed in K.R. Moon, K. Sricharan, K. Greenewald, and A.O. Hero III, "Ensemble estimation of information divergence," Entropy, vol. 20, no. 8, p. 560, 2018.
Divergence functionals are integral functionals of two probability distributions. They play an important role in machine learning, information theory, statistics, and signal processing. In particular, divergence functionals can be related to the optimal (Bayes) probability of error of a classification problem, and they are often used to measure the dissimilarity between probability distributions.
While the paper referenced above covers general divergence functionals, this code is written for divergence functionals of the form

G(f_1, f_2) = ∫ g(f_1(x) / f_2(x)) f_2(x) dx,

where g is a smooth function and f_1 and f_2 are smooth probability densities.
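As an illustration of this form (a worked example added here, not text from the paper), two familiar special cases follow from simple choices of g:

```latex
% KL divergence: take g(t) = t \log t
\mathrm{KL}(f_1 \,\|\, f_2)
  = \int f_1(x)\,\log\frac{f_1(x)}{f_2(x)}\,dx
  = \int \frac{f_1(x)}{f_2(x)}\,\log\frac{f_1(x)}{f_2(x)}\, f_2(x)\,dx

% Renyi-alpha divergence integral: take g(t) = t^{\alpha}
\int f_1^{\alpha}(x)\, f_2^{1-\alpha}(x)\,dx
  = \int \left(\frac{f_1(x)}{f_2(x)}\right)^{\alpha} f_2(x)\,dx
```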
EnDive computes an ensemble of kernel density estimators (KDE) of the densities f_1 and f_2 using multiple bandwidth values. A plug-in estimate of the divergence functional is computed from each pair of KDEs, and these estimates are then combined in a weighted average, where the weights are chosen to reduce the bias of the final estimator.
The bandwidths can be provided by the user. Otherwise, the default is to compute a set of bandwidths from the data.
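Below is a minimal Python sketch of this ensemble idea (not the EnDive code itself): the Gaussian KDE, the example bandwidth set, and the uniform placeholder weights are illustrative assumptions, and EnDive's bias-cancelling weight optimization is omitted.

```python
import numpy as np

def kde_gauss(query, data, h):
    """Isotropic Gaussian KDE with bandwidth h, evaluated at `query` points."""
    n, d = data.shape
    diff = query[:, None, :] - data[None, :, :]          # (m, n, d)
    sq_dist = np.sum(diff ** 2, axis=2)                   # (m, n)
    norm = (2.0 * np.pi * h ** 2) ** (d / 2.0)
    return np.exp(-sq_dist / (2.0 * h ** 2)).sum(axis=1) / (n * norm)

def ensemble_divergence(X1, X2, g, bandwidths, weights=None):
    """Weighted average of plug-in divergence estimates, one per bandwidth.

    Approximates the integral of g(f1(x)/f2(x)) f2(x) dx by averaging
    g(f1/f2) over the sample X2 (drawn from f2), with f1 and f2 replaced
    by their KDEs.
    """
    if weights is None:
        # Placeholder: uniform weights. EnDive instead optimizes the weights
        # to cancel lower-order bias terms (see the paper).
        weights = np.full(len(bandwidths), 1.0 / len(bandwidths))
    estimates = []
    for h in bandwidths:
        f1_hat = kde_gauss(X2, X1, h)          # KDE of f1 evaluated at X2
        f2_hat = kde_gauss(X2, X2, h)          # KDE of f2 evaluated at X2
        ratio = np.maximum(f1_hat, 1e-12) / np.maximum(f2_hat, 1e-12)
        estimates.append(np.mean(g(ratio)))    # plug-in estimate for this h
    return float(np.dot(weights, estimates))

# Example: estimate KL(f1 || f2) for two 1-D Gaussians (true value is 0.125).
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(500, 1))
X2 = rng.normal(0.5, 1.0, size=(500, 1))
kl_g = lambda t: t * np.log(t)                 # g(t) = t log t gives KL
print(ensemble_divergence(X1, X2, kl_g, bandwidths=[0.2, 0.3, 0.45, 0.7]))
```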
If you find this work useful, please cite:
@article{moon2018endive,
title={Ensemble estimation of information divergence},
author={Moon, Kevin R and Sricharan, Kumar and Greenewald, Kristjan and Hero, Alfred O},
journal={Entropy},
year={2018},
volume={20},
number={8},
pages={560}
}
Other related papers that may be of interest:
[1] K.R. Moon, K. Sricharan, K. Greenewald, and A.O. Hero III, "Improving convergence of divergence functional ensemble estimators," IEEE International Symposium on Information Theory (ISIT), pp. 1133-1137, July 2016.
[2] K.R. Moon, V. Delouille, and A.O. Hero III, "Meta learning of bounds on the Bayes classifier error," IEEE Signal Processing and Signal Processing Education Workshop (SP/SPE), pp. 13-18, Aug. 2015.