dit is a Python package for information theory.


Introduction

Information theory is a powerful extension to probability and statistics, quantifying dependencies among arbitrary random variables in a way that is consistent and comparable across systems and scales. Information theory was originally developed to quantify how quickly and reliably information could be transmitted across an arbitrary channel. The demands of modern, data-driven science have been co-opting and extending these quantities and methods into multivariate settings where the interpretations and best practices are not yet established. For example, there are at least four reasonable multivariate generalizations of the mutual information, none of which inherits all the interpretations of the standard bivariate case, and which is best to use is context-dependent. dit implements a vast range of multivariate information measures in an effort to allow information practitioners to study how these various measures behave and interact in a variety of contexts. We hope that having all these measures and techniques implemented in one place will allow the development of robust techniques for the automated quantification of dependencies within a system and concrete interpretation of what those dependencies mean.

Citing

If you use dit in your research, please cite it as:

@article{dit,
  Author = {James, R. G. and Ellison, C. J. and Crutchfield, J. P.},
  Title = {{dit}: a {P}ython package for discrete information theory},
  Journal = {The Journal of Open Source Software},
  Volume = {3},
  Number = {25},
  Pages = {738},
  Year = {2018},
  Doi = {https://doi.org/10.21105/joss.00738}
}

Basic Information

Documentation

http://docs.dit.io

Downloads

https://pypi.org/project/dit/

Dependencies

The required packages are listed in requirements.txt.

Optional Dependencies

  • colorama: colored column heads in PID indicating failure modes
  • cython: faster sampling from distributions
  • hypothesis: random sampling of distributions
  • matplotlib, python-ternary: plotting of various information-theoretic expansions
  • numdifftools: numerical evaluation of gradients and hessians during optimization
  • pint: add units to informational values
  • scikit-learn: faster nearest-neighbor lookups during entropy/mutual information estimation from samples

Install

The easiest way to install is:

pip install dit

Alternatively, you can clone this repository, move into the newly created dit directory, and then install the package:

git clone https://github.com/dit/dit.git
cd dit
pip install .

Note

The Cython extensions are currently not supported on Windows. Please install using the --nocython option.

Testing

$ git clone https://github.com/dit/dit.git
$ cd dit
$ pip install -r requirements_testing.txt
$ py.test

Code and bug tracker

https://github.com/dit/dit

License

BSD 3-Clause, see LICENSE.txt for details.

Implemented Measures

dit implements the following information measures. Most of them are implemented in multivariate and conditional generality wherever such generalizations either exist in the literature or are relatively obvious; for example, though it does not appear in the literature, the multivariate conditional exact common information is implemented here. Brief, illustrative usage sketches follow several of the lists below.

Entropies

  • Shannon Entropy
  • Renyi Entropy
  • Tsallis Entropy
  • Necessary Conditional Entropy
  • Residual Entropy / Independent Information / Variation of Information
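
For instance, here is a small sketch computing the Shannon, Renyi, and Tsallis entropies of a uniform distribution over two bits. It assumes the Renyi and Tsallis order is passed as the second argument, and the outputs shown are the analytically expected values.

>>> import dit
>>> d = dit.Distribution(['00', '01', '10', '11'], [0.25] * 4)
>>> dit.shannon.entropy(d)
2.0
>>> dit.other.renyi_entropy(d, 2)
2.0
>>> dit.other.tsallis_entropy(d, 2)
0.75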

Mutual Informations

  • Co-Information
  • Interaction Information
  • Total Correlation / Multi-Information
  • Dual Total Correlation / Binding Information
  • CAEKL Multivariate Mutual Information
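
To see how these generalizations differ, here is a sketch evaluating several of them on the three-variable xor distribution used in the Quickstart below. The function names are those in dit.multivariate, and the outputs shown are the analytically expected values: xor has one bit of total correlation, two bits of dual total correlation, and negative co-information.

>>> from dit.example_dists import Xor
>>> from dit.multivariate import (coinformation, total_correlation,
...                               dual_total_correlation, caekl_mutual_information)
>>> d = Xor()
>>> coinformation(d)
-1.0
>>> total_correlation(d)
1.0
>>> dual_total_correlation(d)
2.0
>>> caekl_mutual_information(d)
0.5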

Divergences

  • Variational Distance
  • Kullback-Leibler Divergence / Relative Entropy
  • Cross Entropy
  • Jensen-Shannon Divergence
  • Earth Mover's Distance
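
A minimal sketch of a few divergences, comparing a biased coin to a fair one. The function names are those in dit.divergences, the rounding is only to keep the printed values tidy, and the numbers shown are the analytically expected ones.

>>> import dit
>>> from dit.divergences import (cross_entropy, jensen_shannon_divergence,
...                              kullback_leibler_divergence)
>>> p = dit.Distribution(['H', 'T'], [0.75, 0.25])
>>> q = dit.Distribution(['H', 'T'], [0.5, 0.5])
>>> cross_entropy(p, q)
1.0
>>> round(kullback_leibler_divergence(p, q), 4)
0.1887
>>> round(jensen_shannon_divergence([p, q]), 4)
0.0488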

Other Measures

  • Channel Capacity
  • Complexity Profile
  • Connected Informations
  • Cumulative Residual Entropy
  • Extropy
  • Hypercontractivity Coefficient
  • Information Bottleneck
  • Information Diagrams
  • Information Trimming
  • Lautum Information
  • LMPR Complexity
  • Marginal Utility of Information
  • Maximum Correlation
  • Maximum Entropy Distributions
  • Perplexity
  • Rate-Distortion Theory
  • TSE Complexity
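
As one small example from this group, the perplexity is the effective number of outcomes of a distribution. The sketch below assumes dit.other.perplexity and shows the expected values: a uniform distribution over four outcomes has perplexity 4, while the thick coin from the Quickstart has a perplexity of roughly 2.87.

>>> import dit
>>> from dit.other import perplexity
>>> uniform = dit.Distribution(['00', '01', '10', '11'], [0.25] * 4)
>>> perplexity(uniform)
4.0
>>> coin = dit.Distribution(['H', 'T', 'E'], [0.4, 0.4, 0.2])
>>> round(perplexity(coin), 4)
2.8717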

Common Informations

  • Gacs-Korner Common Information
  • Wyner Common Information
  • Exact Common Information
  • Functional Common Information
  • MSS Common Information
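
For a feel of the common informations, the Gacs-Korner common information captures the largest random variable that can be extracted from each variable individually. The sketch below assumes dit.multivariate.gk_common_information and shows the analytically expected values: a perfectly redundant bit carries one bit of common information, while xor carries none.

>>> import dit
>>> from dit.multivariate import gk_common_information
>>> from dit.example_dists import Xor
>>> redundant = dit.Distribution(['00', '11'], [0.5, 0.5])
>>> gk_common_information(redundant)
1.0
>>> gk_common_information(Xor())
0.0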

Partial Information Decomposition

  • I_{min}
  • I_{\wedge}
  • I_{\downarrow}
  • I_{proj}
  • I_{BROJA}
  • I_{ccs}
  • I_{\pm}
  • I_{dep}
  • I_{RAV}
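
As a hedged sketch of the partial information decomposition interface, the example below applies the Williams-Beer measure I_{min} (exposed as dit.pid.PID_WB) to the xor distribution, which is assumed here to treat the last variable as the output by default; indexing a decomposition by an antichain of sources is also an assumed access pattern. The values shown are the expected ones: for xor, all of the information is synergistic.

>>> from dit.example_dists import Xor
>>> from dit.pid import PID_WB
>>> pid = PID_WB(Xor())
>>> pid[((0,), (1,))]  # information redundantly available from either input
0.0
>>> pid[((0, 1),)]     # information only available synergistically from both inputs
1.0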

Secret Key Agreement Bounds

  • Intrinsic Mutual Information
  • Reduced Intrinsic Mutual Information
  • Minimal Intrinsic Mutual Information
  • Necessary Intrinsic Mutual Information
  • Secrecy Capacity

Quickstart

The basic usage of dit consists of creating distributions, modifying them if need be, and then computing properties of those distributions. First, we import:

>>> import dit

Suppose we have a really thick coin, one so thick that there is a reasonable chance of it landing on its edge. Here is how we might represent the coin in dit.

>>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
>>> print(d)
Class:          Distribution
Alphabet:       ('E', 'H', 'T') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 1
RV Names:       None

x   p(x)
E   0.2
H   0.4
T   0.4

Calculate the probability of H and also of the combination H or T.

>>> d['H']
0.4
>>> d.event_probability(['H','T'])
0.8

Calculate the Shannon entropy and extropy of the joint distribution.

>>> dit.shannon.entropy(d)
1.5219280948873621
>>> dit.other.extropy(d)
1.1419011889093373

Create a distribution where Z = xor(X, Y).

>>> import dit.example_dists
>>> d = dit.example_dists.Xor()
>>> d.set_rv_names(['X', 'Y', 'Z'])
>>> print(d)
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       ('X', 'Y', 'Z')

x     p(x)
000   0.25
011   0.25
101   0.25
110   0.25

Calculate the Shannon mutual informations I[X:Z], I[Y:Z], and I[X,Y:Z].

>>> dit.shannon.mutual_information(d, ['X'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['Y'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
1.0

Calculate the marginal distribution P(X,Z). Then print its probabilities as fractions, showing the mask.

>>> d2 = d.marginal(['X', 'Z'])
>>> print(d2.to_string(show_mask=True, exact=True))
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 2 (mask: 3)
RV Names:       ('X', 'Z')

x     p(x)
0*0   1/4
0*1   1/4
1*0   1/4
1*1   1/4

Convert the distribution probabilities to log (base 3.5) probabilities, and access its probability mass function.

>>> d2.set_base(3.5)
>>> d2.pmf
array([-1.10658951, -1.10658951, -1.10658951, -1.10658951])

Draw 5 random samples from this distribution.

>>> dit.math.prng.seed(1)
>>> d2.rand(5)
['01', '10', '00', '01', '00']

Contributions & Help

If you'd like a feature added to dit, please file an issue. Or, better yet, open a pull request. Ideally, all code should be tested and documented, but please don't let this be a barrier to contributing. We'll work with you to ensure that all pull requests are in a mergeable state.

If you'd like to get in contact about anything, you can reach us through our Slack channel.