# Docker Resources

DJL provides Dockerfiles that you can use to set up containers with the appropriate environment for certain platforms.

We recommend setting up a Docker container with the provided Dockerfile when developing for the following platforms and engines.

## Windows

You can use the Dockerfile provided by us. Note that this image will only work with Windows Server 2019 by default. If you want it to work with other versions of Windows, pass the version as a build argument as follows:

```sh
docker build --build-arg version=<YOUR_VERSION> .
```
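
For instance, here is a minimal sketch of a full build command, assuming the Windows Dockerfile lives at `windows/Dockerfile` in this directory and that you want to target Windows Server 2022 (base image tag `ltsc2022`); the path, tag, and image name are assumptions, so adjust them to your setup:

```sh
# Sketch only: the Dockerfile path, the ltsc2022 version tag, and the
# djl-windows image name are assumptions, not part of the DJL docs.
docker build -f windows/Dockerfile \
    --build-arg version=ltsc2022 \
    -t djl-windows .
```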

## TensorRT

You can use the Dockerfile provided by us. This Dockerfile is a modification of the one provided by NVIDIA for TensorRT, changed to include JDK 11. By default, it sets up a container using Ubuntu 18.04 and CUDA 11.6.2. You can build the container with other versions as follows, but keep in mind the TensorRT software requirements outlined here:

```sh
docker build --build-arg OS_VERSION=<YOUR_VERSION> --build-arg CUDA_VERSION=<YOUR_VERSION> .
```
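
As an illustration, a hedged sketch that pins Ubuntu 20.04 and CUDA 11.8.0; these version values and the image name are assumptions chosen for the example, so verify them against the TensorRT software requirements before building:

```sh
# Sketch only: OS_VERSION/CUDA_VERSION values and the djl-tensorrt image
# name are assumptions, not a tested combination from the DJL docs.
docker build \
    --build-arg OS_VERSION=20.04 \
    --build-arg CUDA_VERSION=11.8.0 \
    -t djl-tensorrt .
```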

To run the container, we recommend using `nvidia-docker run ...` to ensure that the CUDA driver and runtime are compatible.
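
A minimal sketch of running the image built above (the `djl-tensorrt` tag is an assumption carried over from the earlier example); on newer Docker installations with the NVIDIA Container Toolkit, `docker run --gpus all` achieves the same thing:

```sh
# Sketch only: the image tag is an assumption; add volume mounts or other
# options as your workflow requires.
nvidia-docker run -it --rm djl-tensorrt bash
```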

We recommend that you follow the setup steps in the TensorRT guide if you need access to the full suite of tools TensorRT provides, such as trtexec, which can convert ONNX models into TensorRT engines. When following that guide, make sure to use the DJL-provided Dockerfile to enable JDK 11 in the container.
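
For example, a hedged sketch of converting an ONNX model into a serialized TensorRT engine with trtexec inside the container; the model and output file paths are placeholders:

```sh
# Sketch only: model.onnx and model.trt are placeholder paths.
trtexec --onnx=model.onnx --saveEngine=model.trt
```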