A Markov chain based text generation library and MegaHAL style chatbot

README

COBE stands for Code of Business Ethics. Cobe is a conversation
simulator, originally a database-backed port of MegaHAL but now a
bit more.

According to the Nobel Prize committee, "the COBE project can also
be regarded as the starting point for cosmology as a precision
science."

There are a few relevant posts here:
http://teichman.org/blog/2011/09/cobe-2.0.html
http://teichman.org/blog/2011/05/singularity.html
http://teichman.org/blog/2011/02/cobe.html

You can read its release history here:
https://github.com/pteichman/cobe/wiki

Cobe has been inspired by the success of Hailo:
http://blogs.perl.org/users/aevar_arnfjor_bjarmason/2010/01/hailo-a-perl-rewrite-of-megahal.html

Our goals are similar to Hailo's: an on-disk data store for lower
memory usage, better Unicode support, and general stability.

You can read about the original MegaHAL here:
http://megahal.alioth.debian.org/How.html

In short, it uses Markov modeling to generate text responses after
learning from input text.

Cobe creates a directed graph of word n-grams (default n=3) from the
text it learns. When generating a response, it performs random walks
on this graph to create as many candidate replies as it can in half a
second.
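The graph-building step can be sketched in a few lines of Python. This is a
simplified illustration of the idea, not cobe's actual storage (cobe keeps its
graph in a database); the name build_graph is hypothetical:

```python
from collections import defaultdict

def build_graph(text, n=3):
    """Build a directed graph of word n-grams (default n=3).

    Each (n-1)-word context maps to the list of words observed
    following it, so the edges can later be walked to produce text.
    """
    words = text.split()
    graph = defaultdict(list)
    for i in range(len(words) - n + 1):
        context = tuple(words[i:i + n - 1])
        graph[context].append(words[i + n - 1])
    return graph

graph = build_graph("the cat sat on the mat the cat ran")
print(graph[("the", "cat")])  # ['sat', 'ran']
```

Because the same context can be followed by several different words, a random
walk over this graph produces novel word sequences rather than verbatim input.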

As candidate replies are created, each is run through a scoring
algorithm. When the half second is up, the highest-scoring candidate
is returned as the response.
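The walk-and-score loop might look like the following sketch. The tiny
hard-coded graph, the generate_reply name, and the toy "prefer longer replies"
scorer are all stand-ins for illustration; cobe's real scorer and reply search
are more involved:

```python
import random
import time

def generate_reply(graph, context, budget=0.05, max_words=20):
    """Random-walk the word graph for `budget` seconds, keeping the
    best candidate seen.

    `graph` maps an (n-1)-word context tuple to the words that may
    follow it. The scorer here simply prefers longer replies.
    """
    best, best_score = None, -1
    deadline = time.monotonic() + budget
    while time.monotonic() < deadline:
        ctx, words = context, list(context)
        # Walk forward until we fall off the graph or hit the cap.
        while ctx in graph and len(words) < max_words:
            nxt = random.choice(graph[ctx])
            words.append(nxt)
            ctx = ctx[1:] + (nxt,)
        if len(words) > best_score:
            best, best_score = " ".join(words), len(words)
    return best

# Hypothetical tiny graph for illustration:
graph = {("the", "cat"): ["sat", "ran"], ("cat", "sat"): ["down"]}
print(generate_reply(graph, ("the", "cat")))
```

The time budget, rather than a fixed candidate count, is what keeps response
latency predictable regardless of how large the learned graph grows.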

Cobe installs a command line tool (called cobe) for interacting with a
brain database, though it is also intended to be used as a Python
API. See the documentation in the cobe.brain module for details.
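Assuming cobe is installed, a minimal API session looks roughly like this
sketch; see the cobe.brain module for the authoritative interface:

```python
from cobe.brain import Brain

# Opens (or creates) the brain database file in the current directory.
brain = Brain("cobe.brain")
brain.learn("Cobe is a conversation simulator.")
print(brain.reply("conversation"))
```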

To install from a tarball:

  $ python setup.py install

Or from the Python Package Index:

  $ easy_install pip
  # pip install cobe

Usage:

  $ cobe init
  $ cobe learn <text file>
  $ cobe console