A Markov chain-based text generation library and MegaHAL-style chatbot
COBE stands for Code of Business Ethics. Cobe is a conversation simulator, originally a database-backed port of MegaHAL but a bit more now.

There are a few relevant posts here:

http://teichman.org/blog/2011/09/cobe-2.0.html
http://teichman.org/blog/2011/05/singularity.html
http://teichman.org/blog/2011/02/cobe.html

You can read its release history here:

https://github.com/pteichman/cobe/wiki

Cobe was inspired by the success of Hailo:

http://blogs.perl.org/users/aevar_arnfjor_bjarmason/2010/01/hailo-a-perl-rewrite-of-megahal.html

Our goals are similar to Hailo's: an on-disk data store for lower memory usage, better support for Unicode, and general stability.

You can read about the original MegaHAL here:

http://megahal.alioth.debian.org/How.html

In short, it uses Markov modeling to generate text responses after learning from input text.

Cobe creates a directed graph of word n-grams (default n=3) from the text it learns. When generating a response, it performs random walks on this graph to create as many candidate replies as it can in half a second. As each candidate is created, it is run through a scoring algorithm that identifies the best of the group. When the half second is up, the best candidate is returned as the response.

Cobe installs a command line tool (called cobe) for interacting with a brain database, though it is also intended to be used as a Python API. See the documentation in the cobe.brain module for details.

To install from a tarball:

    $ python setup.py install

Or from the Python Package Index:

    $ easy_install pip
    $ pip install cobe

Usage:

    $ cobe init
    $ cobe learn <text file>
    $ cobe console
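The n-gram learning and random-walk generation described above can be sketched in a few lines of Python. This is a much-simplified illustration, not cobe's actual implementation or API: the class and method names here are invented for the example, the walk always starts from the beginning-of-text sentinel rather than being seeded from the input, and there is no candidate scoring or half-second time budget.

```python
# Illustrative sketch of trigram Markov text generation, in the spirit
# of cobe's model. Not cobe's real API; names are invented for the example.
import random
from collections import defaultdict


class TinyMarkov:
    """Learn word n-grams (default n=3) and reply by random walk."""

    def __init__(self, order=3):
        self.order = order
        # Map each (order-1)-word context to the words seen following it.
        self.follows = defaultdict(list)

    def learn(self, text):
        # Pad with None sentinels so a walk has a clear start and end.
        words = [None] * (self.order - 1) + text.split() + [None]
        for i in range(len(words) - self.order + 1):
            context = tuple(words[i:i + self.order - 1])
            self.follows[context].append(words[i + self.order - 1])

    def reply(self):
        # Random-walk the graph from the start sentinel until an end
        # sentinel is drawn.
        context = (None,) * (self.order - 1)
        out = []
        while True:
            word = random.choice(self.follows[context])
            if word is None:
                break
            out.append(word)
            context = context[1:] + (word,)
        return " ".join(out)


model = TinyMarkov()
model.learn("the quick brown fox jumps over the lazy dog")
print(model.reply())
```

Cobe itself does considerably more than this: it seeds its walks from keywords in the input text, extends them in both directions through the graph, and scores every candidate produced within the time budget before picking the best one.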