ChatBot: Use TensorRT to Run Inference on a TensorFlow Model
- Convert an LSTM TensorFlow model into UFF format.
- Run a UFF model on Jetson with Python (includes a C++ to Python TensorRT wrapper).
We also include the chatbot training code and an x86 chatbot source (pure Python) for user reference.
Please flash your device with JetPack 3.2 first.
$ sudo apt-get install python-pip
$ sudo apt-get install swig
$ sudo pip install numpy
Please install CUDA 9 and TensorRT 3 first.
$ sudo apt-get install python-pip
$ sudo pip install tensorflow
$ sudo apt-get install swig
$ sudo pip install numpy
We also attach a ChatBot model for user reference.
- Don't blame him; he learned to talk from a small database (Cornell).
- This model was trained on ONE Tesla P100 GPU and took around 15 days to finish.
- You can have the training job done much faster with a DGX-1 server.
- The word-vector size is 4096. A word not in our vocabulary is replaced by '_'.
- Our ChatBot is sensitive to punctuation. Please don't forget the punctuation mark at the end of your sentence.
- Our model is a GAN.
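The two vocabulary rules above (out-of-vocabulary words become '_', and a terminating punctuation mark is expected) can be sketched as a small preprocessing step. This is an illustrative helper, not the repository's actual code, and the toy vocabulary is an assumption:

```python
# Toy vocabulary standing in for the real 4096-entry word list (assumption).
VOCAB = {"how", "are", "you", "today", "?", "."}

def preprocess(sentence, vocab=VOCAB):
    # Ensure the sentence ends with punctuation, since the model is
    # sensitive to the symbol at the end of the sentence.
    if sentence and sentence[-1] not in ".?!":
        sentence += " ."
    # Split punctuation off as separate tokens, then lowercase and split.
    words = sentence.lower().replace("?", " ?").replace(".", " .").split()
    # Replace any word that is not in the vocabulary with '_'.
    return [w if w in vocab else "_" for w in words]

print(preprocess("How are you today?"))  # ['how', 'are', 'you', 'today', '?']
print(preprocess("Bring some coffee"))   # ['_', '_', '_', '.']
```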
Host: convert TF model into UFF format
$ git clone https://github.com/AastaNV/ChatBot.git
$ cd $CHATBOT_ROOT
$ python src/tf_to_uff/tf_to_trt.py model/ID210_649999 model/ID210_649999.uff
$ scp model/ID210_649999.uff nvidia@[device IP]:$CHATBOT_ROOT/model/
Device: create TensorRT engine with the converted UFF file
$ git clone https://github.com/AastaNV/ChatBot.git
$ wget https://raw.githubusercontent.com/numpy/numpy/master/tools/swig/numpy.i -P $CHATBOT_ROOT/src/
$ cd $CHATBOT_ROOT
$ make
$ python chatbot.py model/ID210.pickle model/ID210_649999.uff
Example conversation:
How are you today?
How about to go out for dinner?
What do you want to eat?
i don't know .
well , i don't think so .
The coffee shop on the corner.
i was just wondering .
i don't know .
Okay. You are an artist, right?
oh , no . . .
Anyway, see you tonight.
i know .
Okay... bye bye.
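The session above comes from chatbot.py's read-and-reply loop. A minimal sketch of that loop is shown below; `generate_reply()` is a stub standing in for the TensorRT engine call (an assumption for illustration — the real reply is produced by the LSTM model through the C++/SWIG wrapper):

```python
def generate_reply(tokens):
    # Stub: the real implementation feeds token IDs to the TensorRT
    # engine and decodes the output back into words.
    return "i don't know ."

def chat_once(line):
    tokens = line.lower().split()  # trivial tokenization for this sketch
    return generate_reply(tokens)

# One turn of the loop (the real script wraps this in a `while True` loop
# reading from stdin):
print(chat_once("How are you today?"))  # i don't know .
```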
Export TensorFlow Model to UFF Format
$ cd $CHATBOT_ROOT/src/tf_to_uff
$ python tf_to_trt.py [myModel] [myModel].uff
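For reference, a conversion script like tf_to_trt.py typically relies on the `uff` Python package that ships with TensorRT 3 on the host. The sketch below shows the general shape of such a conversion; the function name and the default output-node name are our assumptions, so substitute the output-node names of your own graph:

```python
def convert_to_uff(frozen_graph_path, uff_path, output_nodes=("output",)):
    # Lazy import: `uff` is only available where TensorRT 3 is installed.
    import uff
    # Parse the frozen TensorFlow graph and serialize it as UFF.
    uff.from_tensorflow_frozen_model(
        frozen_graph_path,
        list(output_nodes),
        output_filename=uff_path,
    )
    return uff_path
```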
Run UFF Model on Jetson
$ wget https://raw.githubusercontent.com/numpy/numpy/master/tools/swig/numpy.i -P $CHATBOT_ROOT/src/
$ cd $CHATBOT_ROOT/
$ make
$ python chatbot.py model/[database].pickle model/[myModel].uff
Please run this step on the host.
We also share our training code for user reference.
Prepare the following files from the Cornell Movie-Dialogs Corpus:
$ cd $CHATBOT_ROOT/src/training
$ python parser_for_cornell.py
$ python main.py
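The Cornell Movie-Dialogs Corpus stores its records as fields separated by ` +++$+++ ` (for example, each row of movie_lines.txt holds a line ID, user ID, movie ID, character name, and the utterance text). A sketch of the kind of parsing parser_for_cornell.py performs is shown below; the helper name is ours, not the repository's:

```python
def parse_movie_line(raw):
    # movie_lines.txt fields: lineID, userID, movieID, character, text,
    # separated by ' +++$+++ '.
    line_id, user_id, movie_id, character, text = raw.split(" +++$+++ ")
    return line_id, text

sample = "L1045 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ They do not!"
print(parse_movie_line(sample))  # ('L1045', 'They do not!')
```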
Run ChatBot on X86-based Linux Machine
$ cd $CHATBOT_ROOT/src/tf_to_uff
$ python tf_to_trt.py [myModel] [myModel].uff
$ python chatbot.py [database].pickle [myModel].uff