# Use TensorFlow Serving to serve a TF 2.0 NLP model
- Clone the repo

```shell
git clone git@github.com:dujm/kaggle_quora.git
cd kaggle_quora
# Remove my git directory
rm -r .git/
```
- Install packages

```shell
pip install -r requirements.txt
```
- Download the Kaggle Quora dataset
  - Download it manually from the Quora Insincere Questions Classification competition page, or install the Kaggle API and run:

```shell
bash src/data/download-dataset.sh
unzip src/data/embeddings.zip
```
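The unzip step above can also be done from Python. A minimal sketch, using only the standard library; `extract_embeddings` is a hypothetical helper, not part of this repo:

```python
import zipfile
from pathlib import Path


def extract_embeddings(zip_path, dest_dir):
    """Extract an embeddings archive (e.g. src/data/embeddings.zip)
    into dest_dir and return the list of extracted member names."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest)
        return zf.namelist()
```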
- Build TensorFlow Serving locally

```shell
bash src/models/00build-tfserve.sh
```
- Train a test NLP model and save it in the TensorFlow SavedModel format

```shell
python src/models/01train-saved-model.py
```
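TensorFlow Serving expects exports under a numeric version directory, as in `src/models/sincere/1` in the tree below. A small stdlib-only sketch of picking the next version directory; `next_export_dir` is a hypothetical helper (the actual export path used by `01train-saved-model.py` may differ), and the resulting path would be passed to `tf.saved_model.save(model, str(export_dir))`:

```python
from pathlib import Path


def next_export_dir(model_base_dir):
    """Return the next numeric version directory under model_base_dir,
    following the model_name/<version>/ layout TensorFlow Serving scans
    (e.g. sincere/1, then sincere/2, ...)."""
    base = Path(model_base_dir)
    versions = [
        int(p.name) for p in base.iterdir() if p.is_dir() and p.name.isdigit()
    ]
    return base / str(max(versions, default=0) + 1)
```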
- Serve the model locally using TensorFlow Serving

```shell
bash src/models/02tf-serve-model.sh
```
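Once the server is up, it can be queried over TensorFlow Serving's REST API (`POST /v1/models/<name>:predict` with an `instances` payload). A hedged sketch: the model name `sincere` and port 8501 are assumptions from this repo's tree and TF Serving's default REST port, and the shape of each instance (raw question strings vs. padded token ids) depends on the saved model's signature:

```python
import json


def build_predict_request(questions, model_name="sincere",
                          host="localhost", port=8501):
    """Build the URL and JSON body for a TensorFlow Serving REST
    predict call. Send with e.g. requests.post(url, data=body)."""
    url = f"http://{host}:{port}/v1/models/{model_name}:predict"
    body = json.dumps({"instances": questions})
    return url, body
```

The same request can be made with `curl -d '<body>' <url>` from the shell.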
TBC
```
├── LICENSE
├── README.md
├── src
│   ├── data
│   │   ├── download-dataset.sh
│   │   ├── embeddings
│   │   │   ├── embeddings_index.npy
│   │   │   ├── glove.840B.300d
│   │   │   │   └── glove.840B.300d.txt
│   │   │   └── Other two embedding files (Not used here)
│   │   └── input
│   │       ├── test.csv
│   │       └── train.csv
│   └── models
│       ├── 00build-tfserve.sh
│       ├── 01train-saved-model.py
│       ├── 02tf-serve-model.sh
│       ├── sincere
│       │   └── 1
│       │       ├── assets
│       │       ├── saved_model.pb
│       │       └── variables
│       │           ├── variables.data-00000-of-00001
│       │           └── variables.index
│       └── utils.py
└── test_environment.py
```
Python code style is documented in these two links: python-developer-guide and python-style-guide.
Requires Python 3.

References
- Quora Insincere Questions Classification
- Using the SavedModel format
- tf.saved_model.save
- TensorFlow Serving
The MIT License (MIT)