MarkovPy

A simple Markov chain implementation written in Python.

Installation

  • Install from PyPI: pip3 install markovpy
  • Or clone this repo (git clone https://github.com/Thor77/MarkovPy) and run python3 setup.py install

Usage

Initialize a new MarkovPy instance with an initialized store:

import markov
from markov.stores import Store  # use a concrete store such as Pickle or Redis (see below)

m = markov.MarkovPy(store=Store())

Now give it some data to learn from:

m.learn('hey how are you?')
m.learn('im fine.')
m.learn('great, see you')

And finally let it generate a reply:

m.reply('im')
# im fine.

Available stores

Pickle

markov.stores.Pickle

Keeps the chain in an in-memory dict and uses pickle to persist it between sessions.
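
A minimal sketch of wiring the Pickle store into the usage example above; the constructor argument (a path for the pickle file) is an assumption, so check the store's signature if it differs:

import markov
from markov.stores import Pickle

# Assumption: Pickle is given the path of the file used for persistence
store = Pickle('markov.pickle')
m = markov.MarkovPy(store=store)
m.learn('hey how are you?')
m.reply('hey')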

Redis (requires the redis package)

markov.stores.Redis

Stores the chain in a Redis database.
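
A similar sketch for the Redis store; the connection parameters shown here (host, port, db) are an assumption about the constructor, so consult the documentation for the exact signature:

import markov
from markov.stores import Redis

# Assumption: the store accepts standard Redis connection parameters
store = Redis(host='localhost', port=6379, db=0)
m = markov.MarkovPy(store=store)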
