ChatBot: Use TensorRT to Run Inference on a TensorFlow Model

This sample demonstrates how to use TensorRT to accelerate TensorFlow inference on Jetson.




You can learn the following with this sample:

  1. Convert an LSTM TensorFlow model into UFF format.
  2. Run a UFF model on Jetson with Python (includes a C++ to Python TensorRT wrapper).

We also include the chatbot training code and an x86 chatbot implementation (pure Python) for reference.

Environment

Device

  1. Jetson TX2
  2. JetPack 3.2

Please flash your device with JetPack 3.2 first, then install the dependencies:
$ sudo apt-get install python-pip
$ sudo apt-get install swig
$ sudo pip install numpy

Host

  1. Ubuntu 16.04
  2. CUDA Toolkit 9
  3. TensorRT 3

Please install CUDA 9 and TensorRT 3 first, then install the dependencies:
$ sudo apt-get install python-pip
$ sudo pip install tensorflow
$ sudo apt-get install swig
$ sudo pip install numpy


Quick Try

We also attach a pre-trained ChatBot model for reference.

Notes

  1. Don't blame him. He learned to talk from a small database (the Cornell Movie Dialogs Corpus).
  2. This model was trained on ONE Tesla P100 GPU and took around 15 days to finish.
  3. The training job can be finished much faster on a DGX-1 server.
  4. The word vector size is 4096. A word not in our vocabulary is replaced by '_'.
  5. Our ChatBot is sensitive to punctuation, so please don't forget the punctuation mark at the end of your sentence (see the sketch after this list).
  6. Our model is a GAN.
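The notes above imply a simple input convention: text is lowercased, a punctuation mark is expected at the end of the sentence, and any word outside the vocabulary becomes '_'. The sketch below only illustrates that idea; the real preprocessing lives in this sample's source, and the function name and vocabulary set here are assumptions.

def preprocess(sentence, vocabulary):
    """Lowercase, enforce a trailing punctuation mark, and map OOV words to '_'."""
    sentence = sentence.lower().strip()
    # The model expects a punctuation mark at the end of the sentence.
    if sentence and sentence[-1] not in ".?!":
        sentence += " ."
    # Words outside the vocabulary are replaced by '_'.
    return [w if w in vocabulary else "_" for w in sentence.split()]

preprocess("Hi, ChatBot!", {"hi,", "chatbot!"})   # -> ['hi,', 'chatbot!']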

Execution steps

Host: convert TF model into UFF format

$ git clone https://github.com/AastaNV/ChatBot.git
$ cd $CHATBOT_ROOT
$ python src/tf_to_uff/tf_to_trt.py model/ID210_649999 model/ID210_649999.uff
$ scp model/ID210_649999.uff nvidia@[device IP]:$CHATBOT_ROOT/model/
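Under the hood, tf_to_trt.py has to turn the checkpoint files (.meta/.index/.data) into a single frozen GraphDef before the UFF conversion. A rough TensorFlow 1.x sketch of that first half; the output node name below is an assumption and depends on the model:

import tensorflow as tf

checkpoint = "model/ID210_649999"        # checkpoint prefix (.meta/.index/.data files)
output_nodes = ["output_node"]           # assumption: use the model's real output op name

with tf.Session() as sess:
    # Restore the checkpoint, then freeze variables into constants.
    saver = tf.train.import_meta_graph(checkpoint + ".meta")
    saver.restore(sess, checkpoint)
    frozen_graph = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_nodes)

# Persist the frozen graph so the UFF converter can read it (see the export section below).
with open("model/frozen.pb", "wb") as f:
    f.write(frozen_graph.SerializeToString())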

Device: create TensorRT engine with the converted UFF file

$ git clone https://github.com/AastaNV/ChatBot.git
$ wget https://raw.githubusercontent.com/numpy/numpy/master/tools/swig/numpy.i -P $CHATBOT_ROOT/src/
$ cd $CHATBOT_ROOT
$ make
$ python chatbot.py model/ID210.pickle model/ID210_649999.uff 
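TensorRT 3 on Jetson has no official Python bindings, which is why this sample builds its own C++ to Python wrapper through SWIG (the `make` step). Purely for orientation, the sketch below shows the equivalent engine-building step with the TensorRT Python UFF parser, on TensorRT versions where that API is available; the tensor names and input shape are placeholders, not the ones this model actually uses.

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

with trt.Builder(TRT_LOGGER) as builder, \
     builder.create_network() as network, \
     trt.UffParser() as parser:
    # Placeholder tensor names and shape; use the model's real bindings.
    parser.register_input("input", (1, 4096, 1))
    parser.register_output("output")
    parser.parse("model/ID210_649999.uff", network)

    builder.max_batch_size = 1
    builder.max_workspace_size = 1 << 28      # 256 MiB of build workspace
    engine = builder.build_cuda_engine(network)

# Serializing the engine avoids rebuilding it on every run.
with open("model/ID210_649999.engine", "wb") as f:
    f.write(engine.serialize())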

Examples

User: Hi, ChatBot!
Bot:  hello .

User: How are you today?
Bot:  fine .

User: How about to go out for dinner?
Bot:  okay .

User: What do you want to eat?
Bot:  i don't know .

User: Steak?
Bot:  well , i don't think so .

User: Sandwich?
Bot:  what ?

User: The coffee shop on the corner.
Bot:  yes .

User: Great!
Bot:  i was just wondering .

User: Wonder what?
Bot:  i don't know .

User: Okay. You are an artist, right?
Bot:  oh , no . . .

User: Anyway, see you tonight.
Bot:  what ?

User: See you...
Bot:  i know .

User: Okay... bye bye.
Bot:  goodnight .



Export TensorFlow Model to UFF Format

Please run this step on the host.

Prepare the following files from TensorFlow:

  • [myModel].data-00000-of-00001
  • [myModel].index
  • [myModel].meta
$ cd $CHATBOT_ROOT/src/tf_to_uff
$ python tf_to_trt.py [myModel] [myModel].uff
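Once the checkpoint has been frozen into a GraphDef (see the freezing sketch in the execution steps above), the `uff` Python package that ships with TensorRT 3 on the host writes the .uff file. A minimal sketch, assuming a frozen graph at model/frozen.pb and an output node named "output_node":

import tensorflow as tf
import uff

# Load the frozen GraphDef produced by the freezing step.
graph_def = tf.GraphDef()
with open("model/frozen.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Convert to UFF; the output node name is model-specific.
uff.from_tensorflow(graph_def, ["output_node"], output_filename="[myModel].uff")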


Run UFF Model on Jetson

Please run this step on the device.

Prepare the following files in the model directory:

  • [database].pickle
  • [myModel].uff
$ wget https://raw.githubusercontent.com/numpy/numpy/master/tools/swig/numpy.i -P $CHATBOT_ROOT/src/
$ cd $CHATBOT_ROOT/
$ make
$ python chatbot.py model/[database].pickle model/[myModel].uff 
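chatbot.py drives inference through the SWIG-generated wrapper built by `make`. For reference, the generic TensorRT-with-pyCUDA inference pattern it corresponds to looks roughly like the sketch below; the buffer size is a placeholder, and the sample's own source is the authoritative version.

import numpy as np
import pycuda.autoinit            # creates a CUDA context
import pycuda.driver as cuda

OUTPUT_SIZE = 4096                # placeholder; depends on the model

def infer(engine, input_array):
    """Copy input to the GPU, run the engine, copy the result back."""
    context = engine.create_execution_context()
    h_input = np.ascontiguousarray(input_array, dtype=np.float32)
    h_output = np.empty(OUTPUT_SIZE, dtype=np.float32)

    d_input = cuda.mem_alloc(h_input.nbytes)
    d_output = cuda.mem_alloc(h_output.nbytes)

    cuda.memcpy_htod(d_input, h_input)                                   # host -> device
    context.execute(batch_size=1, bindings=[int(d_input), int(d_output)])
    cuda.memcpy_dtoh(h_output, d_output)                                 # device -> host
    return h_output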


Training

Please run this step on the host.

We also share our training code for reference.



Prepare the following files from the Cornell Movie Dialogs Corpus:

  • movie_conversations.txt
  • movie_lines.txt
$ cd $CHATBOT_ROOT/src/training
$ python parser_for_cornell.py
$ python main.py 
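parser_for_cornell.py turns those two files into (question, answer) training pairs. For orientation, both files use ' +++$+++ ' as the field separator: movie_lines.txt maps a line ID to its text, and movie_conversations.txt lists the line IDs that form each conversation. A rough reading sketch; the sample's actual preprocessing may differ:

import ast

SEP = " +++$+++ "

# movie_lines.txt fields: lineID, characterID, movieID, character name, text
lines = {}
with open("movie_lines.txt", errors="ignore") as f:
    for row in f:
        fields = row.rstrip("\n").split(SEP)
        if len(fields) == 5:
            lines[fields[0]] = fields[4]

# movie_conversations.txt: the last field is a list of line IDs, e.g. "['L194', 'L195']"
pairs = []
with open("movie_conversations.txt", errors="ignore") as f:
    for row in f:
        ids = ast.literal_eval(row.rstrip("\n").split(SEP)[-1])
        # consecutive lines of a conversation become (question, answer) pairs
        pairs.extend((lines.get(q, ""), lines.get(a, ""))
                     for q, a in zip(ids, ids[1:]))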


Run ChatBot on X86-based Linux Machine

Please run this step on the host.

We also provide source code to run TensorRT on an x86-based machine.
Prepare the following files:

  • [myModel].data-00000-of-00001
  • [myModel].index
  • [myModel].meta
  • [database].pickle
$ cd $CHATBOT_ROOT/src/tf_to_uff
$ python tf_to_trt.py [myModel] [myModel].uff
$ python chatbot.py [database].pickle [myModel].uff 

