This repository has been archived by the owner on Mar 31, 2021. It is now read-only.

Add docker file and the necessary documentation #6

Merged · 2 commits · Dec 27, 2020
54 changes: 54 additions & 0 deletions docker/Dockerfile
@@ -0,0 +1,54 @@
FROM ubuntu:16.04
USER root
LABEL maintainer="github.com/fai555"

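# Add the deadsnakes PPA so Python 3.6 is available on Ubuntu 16.04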
RUN apt-get update && \
apt-get install -y software-properties-common && \
add-apt-repository ppa:deadsnakes/ppa && \
apt-get update

RUN apt-get install -y build-essential python3.6 python3.6-dev python3-pip python3.6-venv
RUN python3.6 -m pip install pip --upgrade

# Pick up some TF dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential \
curl \
libfreetype6-dev \
libpng12-dev \
libzmq3-dev \
pkg-config \
rsync \
git \
software-properties-common \
unzip \
wget \
&& \
apt-get clean && \
rm -rf /var/lib/apt/lists/*


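# TensorFlow 1.13.2 plus the Python packages used by the object detection client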
RUN pip3 --no-cache-dir install \
tensorflow==1.13.2

RUN pip3 --no-cache-dir install \
Cython \
contextlib2 \
jupyter \
matplotlib \
pillow \
lxml \
wheel \
pandas

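# COCO API Python bindings (pycocotools) from the philferriere fork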
RUN pip3 install "git+https://github.com/philferriere/cocoapi.git#egg=pycocotools&subdirectory=PythonAPI"
WORKDIR /app
RUN git clone https://github.com/fai555/tensorflow-serving_sidecar.git
WORKDIR /app/tensorflow-serving_sidecar
RUN pip3 install -r requirements.txt
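# Fetch protoc 3.9.0 and compile the object detection protos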
RUN wget -O protobuf.zip https://github.com/google/protobuf/releases/download/v3.9.0/protoc-3.9.0-linux-x86_64.zip && \
unzip protobuf.zip && \
./bin/protoc object_detection/protos/*.proto --python_out=.

ENV PYTHONPATH "/app/models/research:/app/models/research/slim:${PYTHONPATH}"
ENTRYPOINT ["python3.6"]
38 changes: 38 additions & 0 deletions docs/docker_prediction_request_client.md
@@ -0,0 +1,38 @@
# Build Docker Images

You can build your own Docker image using the Dockerfile in the docker directory.
```
docker build -t tensorflow-serving-sidecar-client:latest .
```

# Public Docker Image

A public Docker image is available in Google Cloud Container Registry if you don't want to build the image yourself. You can simply pull the image and use it:
```
asia.gcr.io/im-mlpipeline/tensorflow-serving-sidecar-client:latest
```
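As a minimal sketch, pulling the public image looks like the command below; depending on the registry's access settings you may first need to authenticate Docker against gcr.io (for example with `gcloud auth configure-docker`).

```
docker pull asia.gcr.io/im-mlpipeline/tensorflow-serving-sidecar-client:latest
```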

# Sending Prediction Requests
You can use the following command to send prediction requests.

```
# Absolute path to the directory containing label_map.pbtxt and the input image files.
# It must be an absolute path: Docker volume mounting does not work with relative paths.
# All other paths below are relative to VOLUME_PATH.
export VOLUME_PATH=""
export SERVER_URL="http://34.73.137.228:8501/v1/models/faster_rcnn_resnet:predict"
# Relative path to the test image, e.g. config/image1.jpg
export IMAGE_PATH=""
# Relative path to the output JSON, e.g. config/out_image1.json
export OUTPUT_JSON=""
# Relative path to the label map, e.g. config/label_map.pbtxt
export LABEL_MAP=""
# Set to True to save the output image.
export SAVE_OUTPUT_IMAGE=True
# Use the publicly available Docker image or your own build
export DOCKER_IMAGE_NAME="asia.gcr.io/im-mlpipeline/tensorflow-serving-sidecar-client:latest"

docker run -it -v ${VOLUME_PATH}:/app/tensorflow-serving_sidecar/config -t ${DOCKER_IMAGE_NAME} client.py --server_url=${SERVER_URL} --image_path=${IMAGE_PATH} --output_json=${OUTPUT_JSON} --save_output_image=${SAVE_OUTPUT_IMAGE} --label_map=${LABEL_MAP}
```
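As a concrete sketch, the variables could be filled in as follows before running the `docker run` command above. The host path `/home/user/config` is hypothetical; the relative paths reuse the examples from the comments.

```
# Hypothetical values for illustration only -- adjust to your own paths and server.
export VOLUME_PATH="/home/user/config"   # must be an absolute path
export SERVER_URL="http://34.73.137.228:8501/v1/models/faster_rcnn_resnet:predict"
export IMAGE_PATH="config/image1.jpg"
export OUTPUT_JSON="config/out_image1.json"
export LABEL_MAP="config/label_map.pbtxt"
export SAVE_OUTPUT_IMAGE=True
export DOCKER_IMAGE_NAME="asia.gcr.io/im-mlpipeline/tensorflow-serving-sidecar-client:latest"
```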