This is the Reasoning and Learning Lab's entry in The Conversational Intelligence Challenge, the NIPS 2017 Live Competition (http://convai.io/).
NIPS Challenge Docker container

This repository contains the Dockerfile and setup code to run the chat bot inside a Docker container.

Live Deployment

The bot is deployed on Telegram as @conv_test. Make sure you have a username registered in Telegram, then start a conversation with \begin.

File description

  • bot.py : Main entry point of the chat bot; message selection logic can be implemented here.
  • models/ : Folder where model code is stored.
  • data/ : Folder where data files are stored.
  • config.py : Configuration script containing the locations of the data files as well as the bot tokens. Replace the bot token with your own to test.
  • models/wrapper.py : Wrapper classes which call the models. Each wrapper must implement a get_response function.
  • models/setup : Shell script to download the models.
  • data/setup : Shell script to download the data files and saved model files.
  • model_selection.py : Selection logic for choosing the best answer.
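The wrapper contract above can be sketched as follows. This is a minimal, hypothetical illustration of a models/wrapper.py class, assuming get_response takes the article text and the conversation history; the class name and argument names are illustrative, not the repository's actual API.

```python
# Hypothetical sketch of a model wrapper (see models/wrapper.py).
# Assumption: get_response(article, context) returns one candidate
# reply string, where context is the list of prior messages with
# the most recent message last.

class EchoWrapper:
    """Toy model that simply echoes the last user message."""

    def get_response(self, article, context):
        # With no history yet, open the conversation.
        if not context:
            return "Hello! Ask me something about the article."
        # Otherwise, echo the most recent user message.
        return "You said: " + context[-1]
```

A real wrapper would load a trained model in its constructor and generate the reply from the article and context instead of echoing.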

Running Docker

  • After installing Docker, build the image from this directory with: docker build -t convai .
  • Docker will create a container image with all the required dependencies.
  • The bot starts automatically whenever the container is run: docker run convai

Adding your own models

  • In models/setup, add your model's repository (it should be a public repository for now) to be cloned.
  • In data/setup, add the download location of your saved model data.
  • Update config.py with the endpoint of the data.
  • Create a wrapper for your model in models/wrapper.py.
  • Modify model_selection.py to call your model.
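The last step, wiring your model into the selection logic, could look like the sketch below. This is a hedged illustration, not the repository's actual model_selection.py: the function name select_best and the (response, score) candidate format are assumptions for the example.

```python
# Hypothetical sketch of the selection logic in model_selection.py.
# Assumption: each wrapper contributes a (response, score) pair and
# the highest-scoring response is returned to the user.

def select_best(candidates):
    """Pick the highest-scoring candidate response.

    candidates: list of (response, score) pairs, one per model
    wrapper. Returns the response with the maximum score, or an
    empty string if no model produced a candidate.
    """
    if not candidates:
        return ""
    return max(candidates, key=lambda pair: pair[1])[0]
```

Adding your model then amounts to appending its wrapper's (response, score) pair to the candidate list before selection.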

Bugs

Feel free to open an issue or submit a PR.

Authors

NIPS ConvAI Challenge, McGill RLLDialog Team
