
dit 1.0.1

dit is a Python package for information theory.

Documentation:
http://docs.dit.io
Downloads:
https://pypi.org/project/dit/
Dependencies:
  • Python 2.7, 3.3, 3.4, 3.5, or 3.6
  • boltons
  • contextlib2
  • debtcollector
  • networkx
  • numpy
  • prettytable
  • scipy
  • six
Optional Dependencies:
  • colorama
  • cython
  • numdifftools
  • scikit-learn
Note:
The Cython extensions are currently not supported on Windows. Please install using the --nocython option.
Install:

The easiest way to install is:

pip install dit

Alternatively, you can clone this repository, move into the newly created dit directory, and then install the package:

git clone https://github.com/dit/dit.git
cd dit
pip install .
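
To verify the installation, a quick sanity check (this assumes dit exposes a __version__ attribute, a common packaging convention rather than something stated above):

python -c "import dit; print(dit.__version__)"
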
Mailing list:
None
Code and bug tracker:
https://github.com/dit/dit
License:
BSD 2-Clause, see LICENSE.txt for details.

Implemented Measures

dit implements the following information measures. Most are implemented in multivariate and conditional generality, where such generalizations either exist in the literature or are relatively obvious; for example, the multivariate conditional exact common information is implemented here even though it does not appear in the literature. A short usage sketch follows the list.

  • Entropies:
    • Shannon Entropy
    • Renyi Entropy
    • Tsallis Entropy
    • Necessary Conditional Entropy
    • Residual Entropy / Independent Information / Variation of Information
  • Mutual Informations:
    • Co-Information
    • Interaction Information
    • Total Correlation / Multi-Information
    • Dual Total Correlation / Binding Information
    • CAEKL Multivariate Mutual Information
  • Divergences:
    • Variational Distance
    • Kullback-Leibler Divergence
    • Cross Entropy
    • Jensen-Shannon Divergence
  • Common Informations:
    • Gacs-Korner Common Information
    • Wyner Common Information
    • Exact Common Information
    • Functional Common Information
    • MSS Common Information
  • Secret Key Agreement bounds:
    • Intrinsic Mutual Information
    • Reduced Intrinsic Mutual Information
    • Minimal Intrinsic Mutual Information
    • Necessary Intrinsic Mutual Information
    • Secrecy Capacity
  • Partial Information Decompositions:
    • \(I_{min}\)
    • \(I_{\wedge}\)
    • \(I_{\downarrow}\)
    • \(I_{proj}\)
    • \(I_{BROJA}\)
    • \(I_{ccs}\)
    • \(I_{\pm}\)
    • \(I_{dep}\)
  • Other measures:
    • Channel Capacity
    • Complexity Profile
    • Connected Informations
    • Cumulative Residual Entropy
    • Extropy
    • Information Diagrams
    • Information Trimming
    • Lautum Information
    • LMPR Complexity
    • Marginal Utility of Information
    • Maximum Correlation
    • Hypercontractivity Coefficient
    • Maximum Entropy Distributions
    • Perplexity
    • TSE Complexity
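
As a quick sketch of how some of these measures are invoked (the function names and their location in dit.multivariate are assumptions based on the list above, not quoted from this page; check the documentation for exact signatures):

>>> import dit.example_dists
>>> from dit.multivariate import total_correlation, dual_total_correlation
>>> xor = dit.example_dists.Xor()  # joint distribution with Z = xor(X, Y)
>>> total_correlation(xor)         # multi-information: H(X)+H(Y)+H(Z) - H(X,Y,Z) = 3 - 2
1.0
>>> dual_total_correlation(xor)    # binding information: H(X,Y,Z) - sum of H(X_i | rest) = 2 - 0
2.0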

Quickstart

Basic usage of dit consists of creating distributions, modifying them if need be, and then computing properties of those distributions. First, we import:

>>> import dit

Suppose we have a really thick coin, one so thick that there is a reasonable chance of it landing on its edge. Here is how we might represent the coin in dit.

>>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
>>> print(d)
Class:          Distribution
Alphabet:       ('E', 'H', 'T') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 1
RV Names:       None

x   p(x)
E   0.2
H   0.4
T   0.4

Calculate the probability of H and also of the combination H or T.

>>> d['H']
0.4
>>> d.event_probability(['H','T'])
0.8

Calculate the Shannon entropy and extropy of the joint distribution.

>>> dit.shannon.entropy(d)
1.5219280948873621
>>> dit.other.extropy(d)
1.1419011889093373

Create a distribution where Z = xor(X, Y).

>>> import dit.example_dists
>>> d = dit.example_dists.Xor()
>>> d.set_rv_names(['X', 'Y', 'Z'])
>>> print(d)
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       ('X', 'Y', 'Z')

x     p(x)
000   0.25
011   0.25
101   0.25
110   0.25

Calculate the Shannon mutual informations I[X:Z], I[Y:Z], and I[X,Y:Z].

>>> dit.shannon.mutual_information(d, ['X'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['Y'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
1.0
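
For the same xor distribution, the three-way co-information from the measure list above can be computed in one call (the function name and its default of using all random variables are assumptions; see the dit.multivariate documentation):

>>> from dit.multivariate import coinformation
>>> coinformation(d)  # I[X:Y:Z] is negative here, reflecting the purely synergistic xor relationship
-1.0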

Calculate the marginal distribution P(X,Z). Then print its probabilities as fractions, showing the mask.

>>> d2 = d.marginal(['X', 'Z'])
>>> print(d2.to_string(show_mask=True, exact=True))
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 2 (mask: 3)
RV Names:       ('X', 'Z')

x     p(x)
0*0   1/4
0*1   1/4
1*0   1/4
1*1   1/4

Convert the distribution probabilities to log (base 3.5) probabilities, and access its probability mass function.

>>> d2.set_base(3.5)
>>> d2.pmf
array([-1.10658951, -1.10658951, -1.10658951, -1.10658951])
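
As a quick standard-library cross-check of those values (nothing dit-specific, just a change of logarithm base):

>>> import math
>>> round(math.log(0.25, 3.5), 8)  # log base 3.5 of each probability 1/4
-1.10658951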

Draw 5 random samples from this distribution.

>>> dit.math.prng.seed(1)
>>> d2.rand(5)
['01', '10', '00', '01', '00']

Enjoy!

 
Files:

File                                                  Type          Py Version  Uploaded on  Size
dit-1.0.1-cp27-cp27m-macosx_10_12_x86_64.whl (md5)    Python Wheel  cp27        2018-01-10   461KB
dit-1.0.1-cp36-cp36m-macosx_10_12_x86_64.whl (md5)    Python Wheel  cp36        2018-01-10   445KB
dit-1.0.1-py2.py3-none-any.whl (md5)                  Python Wheel  py2.py3     2018-01-10   365KB
dit-1.0.1.tar.gz (md5)                                Source                    2018-01-10   239KB