TensorFlow seq2seq chatbot
Note: the repository is not maintained. Feel free to PM me if you'd like to take up the maintenance.
The current results are pretty lousy:

```
hello baby                              - hello
how old are you ?                       - twenty .
i am lonely                             - i am not nice
you ' re not going to be okay . so rude - i ' m sorry .
```
- the answers are hand-picked (it looks cooler that way)
- the chatbot cannot follow a conversation thread yet; any apparent continuity in the example above is just a coincidence (a hand-picked one)
Everyone is welcome to investigate the code and suggest the improvements.
- figure out how to diversify the chatbot's answers (currently the most probable answer is always picked, which makes replies dull)
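One common way to diversify decoder output is to sample from the softmax distribution with a temperature instead of always taking the argmax. This is not part of the repo's code; a minimal NumPy sketch of the idea:

```python
import numpy as np

def sample_with_temperature(logits, temperature=0.7):
    """Sample a token id from unnormalized logits.

    temperature < 1 sharpens the distribution (closer to argmax),
    temperature > 1 flattens it (more diverse, more mistakes).
    """
    logits = np.asarray(logits, dtype=np.float64) / temperature
    probs = np.exp(logits - logits.max())  # subtract max for numerical stability
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))

# greedy decoding would always return index 0 here; sampling varies
logits = [2.0, 1.0, 0.5]
print(sample_with_temperature(logits))
```

Applying this at each decoding step (instead of picking the max-probability token) trades a little accuracy for noticeably less repetitive answers.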
Courtesy of this article.
```
git clone git@github.com:nicolas-ivanov/tf_seq2seq_chatbot.git
cd tf_seq2seq_chatbot
bash setup.sh
```
Train a seq2seq model on a small (17 MB) corpus of movie subtitles:
(this command will run the training on a CPU... GPU instructions are coming)
Test the trained model on a set of common questions:
Chat with trained model in console:
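The entry-point commands for the three steps above are not shown here. Assuming the repo follows a conventional layout with separate train/test/chat scripts (the script names are assumptions, not confirmed by this README), they would look roughly like:

```shell
# 1. train on the movie-subtitle corpus (CPU by default)
python tf_seq2seq_chatbot/train.py

# 2. evaluate the trained model on a fixed set of test questions
python tf_seq2seq_chatbot/test.py

# 3. chat interactively in the console
python tf_seq2seq_chatbot/chat.py
```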
All configuration params are stored at
If you are lucky enough to already have a working GPU configuration for TensorFlow, running the same training command should use the GPU. Otherwise you may need to build TensorFlow from source and run the code as follows:
```
cd tensorflow                         # cd to the tensorflow source folder
cp -r ~/tf_seq2seq_chatbot ./         # copy the project's code to the tensorflow root
bazel build -c opt --config=cuda tf_seq2seq_chatbot:train  # build with the GPU-enabled option
./bazel-bin/tf_seq2seq_chatbot/train  # run the built binary
```
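For `bazel build` to find the `tf_seq2seq_chatbot:train` target, the copied project directory needs a `BUILD` file. A hypothetical sketch of what it might contain, assuming a `train.py` entry point and the old-style in-tree TensorFlow Python target (both names are assumptions):

```
# tf_seq2seq_chatbot/BUILD (hypothetical sketch)
py_binary(
    name = "train",
    srcs = ["train.py"],
    deps = ["//tensorflow:tensorflow_py"],
)
```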