For GPU support in our tensorflow notebook image, we currently use the devel base image for CUDA.
Switching to the runtime image should give us much smaller images with the same functionality.
From the Docker Hub page:
runtime: extends the base image by adding all the shared libraries from the CUDA toolkit.
Use this image if you have a pre-built application using multiple CUDA libraries.
devel: extends the runtime image by adding the compiler toolchain, the debugging tools, the headers and the static libraries.
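Concretely, the change would be swapping the devel tag for the runtime tag in the notebook image's FROM line. A sketch (the exact tag in our Dockerfile may differ):

```dockerfile
# Before (current devel base, ~2.6 GB):
# FROM nvidia/cuda:9.0-cudnn7-devel-ubuntu16.04

# After (runtime base, ~1.1 GB, same shared CUDA libraries at run time):
FROM nvidia/cuda:9.0-cudnn7-runtime-ubuntu16.04
```

One caveat: runtime drops nvcc, the headers, and the static libraries, so if any build step compiles CUDA code inside the image (e.g. a pip package building native extensions against CUDA), that step would still need the devel image.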
https://hub.docker.com/r/nvidia/cuda/
Sizes of the base images:
9.0-cudnn7-devel-ubuntu16.04: 2.584 GB
9.0-cudnn7-runtime-ubuntu16.04: 1.148 GB
Savings: 1.436 GB
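The savings figure follows directly from the two sizes reported above:

```shell
# Difference between the devel and runtime image sizes listed above (GB).
awk 'BEGIN { printf "%.3f GB\n", 2.584 - 1.148 }'
# prints: 1.436 GB
```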