Unable to build Docker image #343

Closed
mneri opened this issue Dec 12, 2019 · 3 comments


@mneri

mneri commented Dec 12, 2019

I'm following this guide. At step 2 I'm required to execute

$ bash scripts/docker/build.sh

The build process stops with the following error.

/home/mneri/DeepLearningExamples/TensorFlow/LanguageModeling/BERT
Sending build context to Docker daemon  8.044MB
Step 1/19 : ARG FROM_IMAGE_NAME=nvcr.io/nvidia/tensorflow:19.08-py3
Step 2/19 : FROM tensorrtserver_client as trt
 ---> 84175a32350e
Step 3/19 : FROM ${FROM_IMAGE_NAME}
 ---> be978d32a5c3
Step 4/19 : RUN apt-get update && apt-get install -y pbzip2 pv bzip2 libcurl3
 ---> Using cache
 ---> f7f50d50b5e1
Step 5/19 : RUN pip install toposort networkx pytest nltk tqdm html2text progressbar
 ---> Using cache
 ---> 30901b40e4f1
Step 6/19 : WORKDIR /workspace
 ---> Using cache
 ---> 6cb5e32e2af4
Step 7/19 : RUN git clone https://github.com/openai/gradient-checkpointing.git
 ---> Using cache
 ---> 3dc52fd16594
Step 8/19 : RUN git clone https://github.com/attardi/wikiextractor.git
 ---> Using cache
 ---> 379e5ceca17a
Step 9/19 : RUN git clone https://github.com/soskek/bookcorpus.git
 ---> Using cache
 ---> 6eb3b8889d2b
Step 10/19 : RUN git clone https://github.com/titipata/pubmed_parser
 ---> Using cache
 ---> 0561e50dcaff
Step 11/19 : RUN pip3 install /workspace/pubmed_parser
 ---> Using cache
 ---> dc5deb716eb6
Step 12/19 : COPY --from=trt /workspace/install/ /workspace/install/
COPY failed: stat /var/lib/docker/overlay2/bc0fd600b2c907c9e02b76d1ca25ba0b96a7d8ce04fd3474698838175d8614a0/merged/workspace/install: no such file or directory

I'm executing this from a standard Ubuntu release:

$ lsb_release -a
Distributor ID: Ubuntu
Description:    Ubuntu 18.04.3 LTS
Release:        18.04
Codename:       bionic
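
Reading the log above: step 2 resolved `FROM tensorrtserver_client as trt` to image `84175a32350e`, so a local `tensorrtserver_client` image does exist, but the `COPY --from=trt` at step 12 fails because that image contains no `/workspace/install` directory, which suggests the client image build was incomplete or stale. A quick hedged check from the host (assumes the `docker` CLI is on `PATH`; if not, the check just reports the image as unavailable):

```shell
# Check whether the local tensorrtserver_client image (the "trt" stage)
# actually contains the /workspace/install directory that COPY --from=trt needs.
if docker run --rm tensorrtserver_client ls /workspace/install >/dev/null 2>&1; then
  result="present"
else
  result="missing or image unavailable"
fi
echo "trt stage /workspace/install: $result"
```

If it reports missing, rebuilding the `tensorrtserver_client` image (or removing the stale one so `build.sh` rebuilds it) is one way forward; avoiding the local client build entirely is another, as below.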
@gigony

gigony commented Dec 13, 2019

The build script seems to build and use the latest trtis_client from source instead of downloading versioned binaries.

Would the following change help?

diff --git a/TensorFlow/LanguageModeling/BERT/Dockerfile b/TensorFlow/LanguageModeling/BERT/Dockerfile
index f9044f5..73f1cd9 100644
--- a/TensorFlow/LanguageModeling/BERT/Dockerfile
+++ b/TensorFlow/LanguageModeling/BERT/Dockerfile
@@ -1,10 +1,8 @@
 ARG FROM_IMAGE_NAME=nvcr.io/nvidia/tensorflow:19.08-py3
 
-FROM tensorrtserver_client as trt
-
 FROM ${FROM_IMAGE_NAME}
 
-RUN apt-get update && apt-get install -y pbzip2 pv bzip2 libcurl3
+RUN apt-get update && apt-get install -y pbzip2 pv bzip2 libcurl4 curl
 
 RUN pip install toposort networkx pytest nltk tqdm html2text progressbar
 
@@ -16,8 +14,10 @@ RUN git clone https://github.com/titipata/pubmed_parser
 
 RUN pip3 install /workspace/pubmed_parser
 
-#Copy the perf_client over
-COPY --from=trt /workspace/install/ /workspace/install/
+# Download TRTIS Client
+ARG TRTIS_CLIENTS_URL=https://github.com/NVIDIA/tensorrt-inference-server/releases/download/v1.5.0/v1.5.0_ubuntu1804.clients.tar.gz
+RUN mkdir -p /workspace/install \
+    && curl -L ${TRTIS_CLIENTS_URL} | tar xvz -C /workspace/install
 
 #Install the python wheel with pip
 RUN pip install /workspace/install/python/tensorrtserver*.whl
diff --git a/TensorFlow/LanguageModeling/BERT/scripts/docker/build.sh b/TensorFlow/LanguageModeling/BERT/scripts/docker/build.sh
index d14b1b3..4430b47 100755
--- a/TensorFlow/LanguageModeling/BERT/scripts/docker/build.sh
+++ b/TensorFlow/LanguageModeling/BERT/scripts/docker/build.sh
@@ -2,8 +2,4 @@
 
 docker pull nvcr.io/nvidia/tensorrtserver:19.08-py3
 
-#Will have to update submodule from root
-git submodule update --init --recursive
-cd tensorrt-inference-server && docker build -t tensorrtserver_client -f Dockerfile.client . && cd -
-
 docker build . --rm -t bert
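
The diff drops the locally built `tensorrtserver_client` stage in favor of versioned release binaries. Before baking the URL into the Dockerfile, the tarball layout can be checked from the host to confirm it ships the `python/tensorrtserver*.whl` that the later `pip install` step expects (a sketch; needs network access to GitHub and falls back to a message when offline):

```shell
# List the release tarball's python/ entries without extracting, to verify
# the wheel path that the Dockerfile's pip install step relies on.
TRTIS_CLIENTS_URL=https://github.com/NVIDIA/tensorrt-inference-server/releases/download/v1.5.0/v1.5.0_ubuntu1804.clients.tar.gz
curl -sL "$TRTIS_CLIENTS_URL" | tar tzf - | grep 'python/' \
  || echo "could not list tarball (offline or asset moved)"
```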

@mneri

mneri commented Dec 16, 2019

I was finally able to build the Docker image.

For future reference, here are the Dockerfile and scripts/docker/build.sh.

Dockerfile:

ARG FROM_IMAGE_NAME=nvcr.io/nvidia/tensorflow:19.08-py3

#FROM tensorrtserver_client as trt

FROM ${FROM_IMAGE_NAME}

#RUN apt-get update && apt-get install -y pbzip2 pv bzip2 libcurl3
RUN apt-get update && apt-get install -y pbzip2 pv bzip2 libcurl4 curl

RUN pip install toposort networkx pytest nltk tqdm html2text progressbar

WORKDIR /workspace
RUN git clone https://github.com/openai/gradient-checkpointing.git
RUN git clone https://github.com/attardi/wikiextractor.git
RUN git clone https://github.com/soskek/bookcorpus.git
RUN git clone https://github.com/titipata/pubmed_parser

RUN pip3 install /workspace/pubmed_parser

#Copy the perf_client over
#COPY --from=trt /workspace/install/ /workspace/install/
ARG TRTIS_CLIENTS_URL=https://github.com/NVIDIA/tensorrt-inference-server/releases/download/v1.5.0/v1.5.0_ubuntu1804.clients.tar.gz
RUN mkdir -p /workspace/install \
    && curl -L ${TRTIS_CLIENTS_URL} | tar xvz -C /workspace/install


#Install the python wheel with pip
RUN pip install /workspace/install/python/tensorrtserver*.whl

WORKDIR /workspace/bert
COPY . .

ENV PYTHONPATH /workspace/bert
ENV BERT_PREP_WORKING_DIR /workspace/bert/data
ENV PATH /workspace/install/bin:${PATH}
ENV LD_LIBRARY_PATH /workspace/install/lib:${LD_LIBRARY_PATH}
scripts/docker/build.sh:

#!/bin/bash

docker pull nvcr.io/nvidia/tensorrtserver:19.08-py3

#Will have to update submodule from root
#git submodule update --init --recursive
#cd tensorrt-inference-server && docker build -t tensorrtserver_client -f Dockerfile.client . && cd -

docker build . --rm -t bert
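
With these files in place, a smoke test of the result might look like the sketch below (assumes docker and network access; the image tag `bert` comes from build.sh, and the module name `tensorrtserver.api` is my assumption based on the wheel name installed from /workspace/install/python):

```shell
# Build the image, then confirm the TRT Inference Server client wheel
# installed inside the image is importable; report either outcome.
if bash scripts/docker/build.sh \
   && docker run --rm bert python -c "import tensorrtserver.api"; then
  smoke="client import OK"
else
  smoke="build or import failed (needs docker, network, and the files above)"
fi
echo "$smoke"
```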

@arnepeine

Worked! Thanks!

@mneri mneri closed this as completed Dec 17, 2019