Infotheory: C++/Python information theoretic analysis tools

Introduction

Infotheory, written in C++, is a software package for performing information theoretic analyses, especially on high-dimensional data. When inferring the data distribution from samples, it uses a sparse representation of the bins, which avoids the catastrophic explosion of bin counts as dimensionality increases. Moreover, the package enables better distribution estimation by employing averaged shifted histograms [1].
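
To give a sense of how averaged shifted histograms reduce sensitivity to bin placement, here is a minimal, self-contained sketch of the idea from [1] in Python. It is illustrative only; the function name and parameters are ours, and this is not the package's internal implementation.

import numpy as np

# Illustrative sketch of an averaged shifted histogram (ASH) [1].
# NOT the package's internal code; names and defaults are arbitrary.
def ash_density(samples, n_bins=10, n_shifts=4):
    """Average n_shifts histograms whose bin origins are offset by
    bin_width / n_shifts, reducing sensitivity to bin placement."""
    samples = np.asarray(samples, dtype=float)
    lo, hi = samples.min(), samples.max()
    width = (hi - lo) / n_bins
    grid = np.linspace(lo, hi, 200)        # points at which to evaluate the density
    density = np.zeros_like(grid)
    for s in range(n_shifts):
        offset = s * width / n_shifts
        edges = np.arange(lo - width + offset, hi + width + offset, width)
        counts, edges = np.histogram(samples, bins=edges, density=True)
        idx = np.clip(np.digitize(grid, edges) - 1, 0, len(counts) - 1)
        density += counts[idx]
    return grid, density / n_shifts

x, p = ash_density(np.random.randn(1000))  # smoother than any single histogram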

The following information theoretic quantities can currently be estimated using this tool; follow this repo for more to come.

  1. Entropy [2]
  2. Mutual Information [3]
  3. Partial Information Decomposition measures [4] (see the identity after this list)
    • Unique Information
    • Redundant Information
    • Synergistic Information
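
For orientation, in the two-source decomposition of Williams & Beer [4] these quantities relate to mutual information as follows (the notation U, R, S is used only for this note and is not part of the package's API):

I(X_1, X_2; Y) = U_1 + U_2 + R + S
I(X_1; Y) = U_1 + R
I(X_2; Y) = U_2 + R

where U_i is the information about the target Y that is unique to source X_i, R is the information redundantly available in both sources, and S is the information available only synergistically, from both sources taken together.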

The package can be used from Python or C++. While the C++ headers should work on all platforms, the Python package has currently only been tested on MacOS and Linux.

Installation

pip install infotheory

On MacOS, upgrade to MacOS Mojave and update Xcode. You might also need to set these environment variables from your terminal:

export CXXFLAGS="-mmacosx-version-min=10.9"
export LDFLAGS="-mmacosx-version-min=10.9"

For C++, simply download InfoTools.h and VectorMatrix.h and include those header files in your code.

Usage

Using this package in your own code involves three steps: create the analysis object, add your data, and invoke the estimators you need.

See demos and website for sample programs on how to use this package.
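
The sketch below walks through those three steps in Python for a simple two-variable mutual information estimate. The function names follow the package's demos as we read them (InfoTools, set_equal_interval_binning, add_data_point, mutual_info); treat the exact signatures as assumptions and confirm them against the demos and website.

import numpy as np
import infotheory

# Step 1: create the object. dims = total dimensionality of a data point;
# the second argument sets how many shifted binnings are averaged (see [1]).
dims, nreps = 2, 0
it = infotheory.InfoTools(dims, nreps)

# Equal-interval binning: number of bins, minimum and maximum for each dimension.
it.set_equal_interval_binning([10, 10], [0.0, 0.0], [1.0, 1.0])

# Step 2: add the data, one point of length `dims` at a time.
x = np.random.rand(1000)
y = (x + np.random.rand(1000)) / 2.0       # correlated with x, stays in [0, 1]
for xi, yi in zip(x, y):
    it.add_data_point([xi, yi])

# Step 3: invoke the estimator. The list assigns each dimension to a variable ID:
# dimension 0 is variable 0 and dimension 1 is variable 1.
print("Mutual information =", it.mutual_info([0, 1]))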

See the Colab demo here.

Contact

Created by Madhavun Candadai and Eduardo J. Izquierdo. Contact Madhavun at madvncv[at]gmail.com

References

  1. Scott, D. W. (1985). Averaged shifted histograms: effective nonparametric density estimators in several dimensions. The Annals of Statistics, 1024-1040.
  2. http://www.scholarpedia.org/article/Entropy#Shannon_entropy
  3. http://www.scholarpedia.org/article/Mutual_information
  4. Williams, P. L., & Beer, R. D. (2010). Nonnegative decomposition of multivariate information. arXiv preprint arXiv:1004.2515.