# dit 1.0.0.dev27

dit is a Python package for information theory.

Documentation:
http://docs.dit.io
PyPI:
https://pypi.org/project/dit/
Dependencies:
• Python 2.7, 3.3, 3.4, 3.5, or 3.6
• boltons
• contextlib2
• debtcollector
• networkx
• numpy
• prettytable
• scipy
• six
Optional Dependencies:
• colorama
• cython
• numdifftools
• scikit-learn
Note:
The Cython extensions are currently not supported on Windows. Please install using the `--nocython` option.
Install:

The easiest way to install is:

```
pip install dit
```

Alternatively, you can clone this repository, move into the newly created dit directory, and then install the package:

```
git clone https://github.com/dit/dit.git
cd dit
pip install .
```
Mailing list:
None
Code and bug tracker:
https://github.com/dit/dit
License:
BSD 2-Clause, see LICENSE.txt for details.

## Quickstart

The basic usage of dit corresponds to creating distributions, modifying them if need be, and then computing properties of those distributions. First, we import:

```
>>> import dit
```

Suppose we have a really thick coin, one so thick that there is a reasonable chance of it landing on its edge. Here is how we might represent the coin in dit.

```
>>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
>>> print(d)
Class:          Distribution
Alphabet:       ('E', 'H', 'T') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 1
RV Names:       None

x   p(x)
E   0.2
H   0.4
T   0.4
```

Calculate the probability of H and also of the combination H or T.

```
>>> d['H']
0.4
>>> d.event_probability(['H','T'])
0.8
```
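Under the hood, the probability of an event is just the sum of its outcomes' probabilities. A minimal pure-Python check (a sketch that does not use dit):

```python
# The event {H, T} has probability p(H) + p(T); plain-Python check.
pmf = {'H': 0.4, 'T': 0.4, 'E': 0.2}
event_prob = sum(pmf[o] for o in ['H', 'T'])
print(event_prob)  # 0.8
```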

Calculate the Shannon entropy and extropy of the joint distribution.

```
>>> dit.shannon.entropy(d)
1.5219280948873621
>>> dit.other.extropy(d)
1.1419011889093373
```
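Both values can be checked by hand with numpy alone (a sketch independent of dit): the Shannon entropy is H = -Σ p log₂ p, and the extropy is its complementary dual, J = -Σ (1-p) log₂ (1-p).

```python
import numpy as np

p = np.array([0.4, 0.4, 0.2])

# Shannon entropy: H = -sum(p * log2(p))
H = -np.sum(p * np.log2(p))
# Extropy: J = -sum((1 - p) * log2(1 - p))
J = -np.sum((1 - p) * np.log2(1 - p))

print(H)  # ~1.5219280948873621
print(J)  # ~1.1419011889093373
```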

Create a distribution where Z = xor(X, Y).

```
>>> import dit.example_dists
>>> d = dit.example_dists.Xor()
>>> d.set_rv_names(['X', 'Y', 'Z'])
>>> print(d)
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       ('X', 'Y', 'Z')

x     p(x)
000   0.25
011   0.25
101   0.25
110   0.25
```

Calculate the Shannon mutual informations I[X:Z], I[Y:Z], and I[X,Y:Z].

```
>>> dit.shannon.mutual_information(d, ['X'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['Y'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
1.0
```
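The same numbers follow from the identity I(A;B) = H(A) + H(B) - H(A,B). A numpy sketch (not using dit) over the uniform xor joint above:

```python
import numpy as np
from collections import Counter

# The xor joint: outcomes over (X, Y, Z), each with probability 1/4.
outcomes = ['000', '011', '101', '110']

def H(indices):
    """Entropy of the marginal over the given coordinate positions."""
    counts = Counter(tuple(o[i] for i in indices) for o in outcomes)
    probs = np.array(list(counts.values())) / len(outcomes)
    return -np.sum(probs * np.log2(probs))

# I[X:Z] = H(X) + H(Z) - H(X,Z)
i_xz = H([0]) + H([2]) - H([0, 2])
# I[X,Y:Z] = H(X,Y) + H(Z) - H(X,Y,Z)
i_xyz = H([0, 1]) + H([2]) - H([0, 1, 2])

print(i_xz)   # 0.0
print(i_xyz)  # 1.0
```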

Calculate the marginal distribution P(X,Z). Then print its probabilities as fractions, showing the mask.

```
>>> d2 = d.marginal(['X', 'Z'])
>>> print(d2.to_string(show_mask=True, exact=True))
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 2 (mask: 3)
RV Names:       ('X', 'Z')

x     p(x)
0*0   1/4
0*1   1/4
1*0   1/4
1*1   1/4
```

Convert the distribution probabilities to log (base 3.5) probabilities, and access its probability mass function.

```
>>> d2.set_base(3.5)
>>> d2.pmf
array([-1.10658951, -1.10658951, -1.10658951, -1.10658951])
```
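The repeated value is just the log base 3.5 of 1/4, which can be confirmed with the standard library:

```python
import math

# log base 3.5 of each probability (every outcome has p = 1/4)
logp = math.log(0.25, 3.5)
print(logp)  # ~ -1.10658951
```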

Draw 5 random samples from this distribution.

```
>>> dit.math.prng.seed(1)
>>> d2.rand(5)
['01', '10', '00', '01', '00']
```
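Equivalent sampling can be sketched with numpy's own generator (dit uses its own PRNG, so these draws will not match the samples shown above):

```python
import numpy as np

rng = np.random.default_rng(1)
outcomes = ['00', '01', '10', '11']

# Draw 5 outcomes uniformly, matching the uniform distribution d2 above.
samples = list(rng.choice(outcomes, size=5))
print(samples)
```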

Enjoy!

Author:
Humans