
Multiple GPUs, for multiple VNC desktop #7

Closed
LiangHuangBC opened this issue Mar 2, 2021 · 5 comments

LiangHuangBC commented Mar 2, 2021

We have one GPU server with 4 GPUs. There are 4 developers who want to use these GPUs, one each. But we can only open one VNC desktop at a time: when we open a second, the first one goes black. Command I tried:
docker run --name glx1 -d --gpus "device=1" --privileged -it -e SIZEW=1920 -e SIZEH=1080 -e SHARED=TRUE -e VNCPASS=vncpasswd -p 5902:5901 ehfd/nvidia-glx-desktop:latest
With this command, we can run programs on the second GPU, but it cannot work together with the first container running with --gpus "device=0". The first container goes black when the second container starts, even though they are not on the same GPU.

I also tried the command without --privileged:
docker run --name glx1 -d --gpus "device=1" --device=/dev/tty1:rw -it -e SIZEW=1920 -e SIZEH=1080 -e SHARED=TRUE -e VNCPASS=vncpasswd -p 5902:5901 ehfd/nvidia-glx-desktop:latest
With this command, x11vnc cannot start up; the error is "couldn't open display".

I also tried the command in the README and tried to open two VNC desktops on the same GPU 0, with no luck.

By the way, there is a small issue in bootstrap.sh: when Docker restarts the container, bootstrap gets stuck re-installing the driver. We fixed it by commenting that step out.
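
For illustration, instead of commenting the step out, a guard like the following would also skip the re-install on restart; the marker-file path and the install step are placeholders here, not the actual contents of bootstrap.sh:

# Hypothetical guard in bootstrap.sh: only run the driver install on the first start.
# /tmp/.driver-installed is a placeholder marker file, not real code from this repo.
if [ ! -f /tmp/.driver-installed ]; then
  # ... existing driver install step goes here ...
  touch /tmp/.driver-installed
fi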

ehfd commented Mar 6, 2021

@LiangHuangBC I suspect the problem is the privileged access, which lets both containers see each other. Could you post the contents of /etc/X11/Xorg.conf from both containers, and also the output of echo $NVIDIA_VISIBLE_DEVICES? Otherwise, you could test passing the devices HERE and disabling privileged mode. Whether that works or not would be a big help to the repo, because I have not tested this in pure Docker.
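
For reference, the information I am asking for can be collected inside each running container with standard commands like these (using your container name glx1 as the example):

# Run inside each container, e.g. docker exec -it glx1 bash
cat /etc/X11/Xorg.conf            # the Xorg configuration generated at startup
echo "$NVIDIA_VISIBLE_DEVICES"    # the GPU(s) exposed by the NVIDIA container runtime
nvidia-smi -L                     # the GPUs actually visible inside the container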

By the way, there is a small issue in bootstrap.sh: when Docker restarts the container, bootstrap gets stuck re-installing the driver. We fixed it by commenting that step out.

This container has only been tested in Kubernetes, where the filesystem is reset when a pod restarts. I can fix that when I have time.

Note that this repo will be rebuilt to use the Wayland workflow instead of Xorg once NVIDIA enables XWayland support in the 470.xx drivers.

ehfd self-assigned this Mar 6, 2021
ehfd commented Mar 7, 2021

I fixed the code to work in privileged mode. Please test it and tell me. It is not available without privileged mode.

bootstrap gets stuck re-installing the driver

Please give feedback on the fix for this too.

However, ehfd/docker-nvidia-egl-desktop is guaranteed to work with multiple desktops, unlike ehfd/docker-nvidia-glx-desktop (which relies on quite hacky configurations to run Xorg inside containers), and is the recommended container to use. But ehfd/docker-nvidia-egl-desktop does not support Vulkan. If you need Vulkan, use https://github.com/mviereck/x11docker, but that requires Docker on a single node with full root control and cannot be used on other container orchestration platforms. The ehfd/docker-nvidia-glx-desktop container offers limited functionality and is designed for multi-node clusters with container orchestration, or for situations where root access is not available, until XWayland support for the NVIDIA proprietary drivers becomes available.

ehfd closed this as completed Mar 24, 2021
ehfd removed their assignment Jul 11, 2021
ehfd commented Jul 13, 2021

I am in the process of fixing this issue.

ehfd reopened this Jul 13, 2021
ehfd commented Jul 13, 2021

Fixed in commit 1fbae34

Please consult the new instructions. Privileged mode is no longer supported in favor of unprivileged containers.
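
As a rough sketch, running two unprivileged desktops on separate GPUs should now look something like the commands below. The options are taken from the command posted above with --privileged removed; the container name glx0 and host port 5901 for the first desktop are chosen for illustration, and the README remains authoritative for any additional options the new instructions require.

# Desktop on GPU 0, reachable on host port 5901
docker run --name glx0 -d --gpus "device=0" -it \
  -e SIZEW=1920 -e SIZEH=1080 -e SHARED=TRUE -e VNCPASS=vncpasswd \
  -p 5901:5901 ehfd/nvidia-glx-desktop:latest

# Desktop on GPU 1, reachable on host port 5902
docker run --name glx1 -d --gpus "device=1" -it \
  -e SIZEW=1920 -e SIZEH=1080 -e SHARED=TRUE -e VNCPASS=vncpasswd \
  -p 5902:5901 ehfd/nvidia-glx-desktop:latest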

ehfd closed this as completed Jul 13, 2021
ehfd commented Aug 24, 2022

Added to the documentation.
