marc

(Badges: MIT license, Travis CI, PyPI, Downloads)

About

marc is a small but flexible library that implements Markov chains in pure Python.

Usage

from marc import MarkovChain

chain = [
    'Rock', 'Rock', 'Paper', 'Rock', 'Scissors',
    'Paper', 'Paper', 'Paper', 'Scissors', 'Rock',
    'Scissors', 'Scissors', 'Paper', 'Rock', 'Rock',
    'Rock', 'Rock', 'Paper', 'Rock', 'Rock'
]

mc = MarkovChain(chain)

mc.next_state('Rock')
# 'Rock'

mc.generate_states('Paper', n=5)
# ['Scissors', 'Paper', 'Rock', 'Paper', 'Scissors']

mc.next_state('Scissors')
# 'Paper'
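
Under the hood, a chain like this amounts to a transition table built from the training sequence. The snippet below is a rough, illustrative sketch of that idea, not marc's actual implementation (the helper names build_transitions and sample_next are made up): count how often each state follows another, then sample the next state in proportion to those counts.

import random
from collections import defaultdict

def build_transitions(sequence):
    # Count how often each state is followed by each other state
    counts = defaultdict(lambda: defaultdict(int))
    for current, following in zip(sequence, sequence[1:]):
        counts[current][following] += 1
    return counts

def sample_next(counts, state):
    # Pick a follower with probability proportional to its observed count
    followers = counts[state]
    return random.choices(list(followers), weights=list(followers.values()))[0]

transitions = build_transitions(chain)
sample_next(transitions, 'Rock')
# e.g. 'Rock' (output is random, weighted by the transitions observed above)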

Install

pip install marc