Unable to build Docker image #343
The build script seems to build the latest trtis_client from source instead of using versioned binaries. Would the following change help?
```diff
diff --git a/TensorFlow/LanguageModeling/BERT/Dockerfile b/TensorFlow/LanguageModeling/BERT/Dockerfile
index f9044f5..73f1cd9 100644
--- a/TensorFlow/LanguageModeling/BERT/Dockerfile
+++ b/TensorFlow/LanguageModeling/BERT/Dockerfile
@@ -1,10 +1,8 @@
 ARG FROM_IMAGE_NAME=nvcr.io/nvidia/tensorflow:19.08-py3
-FROM tensorrtserver_client as trt
-
 FROM ${FROM_IMAGE_NAME}
 
-RUN apt-get update && apt-get install -y pbzip2 pv bzip2 libcurl3
+RUN apt-get update && apt-get install -y pbzip2 pv bzip2 libcurl4 curl
 
 RUN pip install toposort networkx pytest nltk tqdm html2text progressbar
@@ -16,8 +14,10 @@ RUN git clone https://github.com/titipata/pubmed_parser
 RUN pip3 install /workspace/pubmed_parser
 
-#Copy the perf_client over
-COPY --from=trt /workspace/install/ /workspace/install/
+# Download TRTIS Client
+ARG TRTIS_CLIENTS_URL=https://github.com/NVIDIA/tensorrt-inference-server/releases/download/v1.5.0/v1.5.0_ubuntu1804.clients.tar.gz
+RUN mkdir -p /workspace/install \
+    && curl -L ${TRTIS_CLIENTS_URL} | tar xvz -C /workspace/install
 
 #Install the python wheel with pip
 RUN pip install /workspace/install/python/tensorrtserver*.whl
diff --git a/TensorFlow/LanguageModeling/BERT/scripts/docker/build.sh b/TensorFlow/LanguageModeling/BERT/scripts/docker/build.sh
index d14b1b3..4430b47 100755
--- a/TensorFlow/LanguageModeling/BERT/scripts/docker/build.sh
+++ b/TensorFlow/LanguageModeling/BERT/scripts/docker/build.sh
@@ -2,8 +2,4 @@
 docker pull nvcr.io/nvidia/tensorrtserver:19.08-py3
 
-#Will have to update submodule from root
-git submodule update --init --recursive
-cd tensorrt-inference-server && docker build -t tensorrtserver_client -f Dockerfile.client . && cd -
-
 docker build . --rm -t bert
```
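Since the patch pins the client release by version, one way to keep the pin maintainable is to derive the tarball URL from a single version variable. A minimal sketch, assuming the release URL layout shown in the diff (the `VERSION` and `OS_TAG` variable names are illustrative, not part of the original patch):

```shell
#!/bin/sh
# Sketch: build the versioned TRTIS client tarball URL from one VERSION
# variable so bumping the pin touches a single line. The URL scheme below
# mirrors the v1.5.0 release URL used in the Dockerfile diff.
VERSION=1.5.0
OS_TAG=ubuntu1804
TRTIS_CLIENTS_URL="https://github.com/NVIDIA/tensorrt-inference-server/releases/download/v${VERSION}/v${VERSION}_${OS_TAG}.clients.tar.gz"
echo "$TRTIS_CLIENTS_URL"
```

The resulting value can then be passed into the build with `docker build --build-arg TRTIS_CLIENTS_URL="$TRTIS_CLIENTS_URL" .`, since the Dockerfile declares it as an `ARG`.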
I was finally able to build the Docker image. For future reference, here's the
Worked! Thanks!
I'm following this guide. At step 2 I'm required to execute the build, and the build process stops with the following error. I'm executing this from a standard Ubuntu release: