Markov Chains made easy
Project description
A very simple and easy-to-use Markov chain utility for Python:
    #!/usr/bin/env python
    from pyMarkov import markov

    text = "This is a random bunch of text"
    markov_dict = markov.train([text], 2)  # 2 is the ply
    print(markov.generate(markov_dict, 10, 2))  # 2 is the ply, 10 is the length
    # => 'random bunch of text'
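To make the "ply" parameter concrete, here is a minimal sketch of how a train/generate pair like this typically works: `train` maps each `ply`-word prefix to the words observed after it, and `generate` walks that mapping. The function names mirror the example above, but this is an illustrative assumption, not pyMarkov's actual implementation.

```python
import random

def train(texts, ply):
    """Map each `ply`-word prefix to the list of words seen after it."""
    chain = {}
    for text in texts:
        words = text.split()
        for i in range(len(words) - ply):
            key = tuple(words[i:i + ply])
            chain.setdefault(key, []).append(words[i + ply])
    return chain

def generate(chain, length, ply):
    """Generate up to `length` words by walking the chain from a random prefix."""
    out = list(random.choice(list(chain)))
    while len(out) < length:
        followers = chain.get(tuple(out[-ply:]))
        if followers is None:  # dead end: this prefix was never continued
            break
        out.append(random.choice(followers))
    return ' '.join(out)

chain = train(["This is a random bunch of text"], 2)
print(generate(chain, 10, 2))
```

With a single short sentence as training data the chain is deterministic apart from the starting prefix, which is why the example output is simply a suffix of the input text.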
Download files
Download the file for your platform.
Source Distribution
PyMarkov-0.1.0.tar.gz (3.1 kB)