Bug: CUDA enabled docker container fails to launch #7822

Closed
mblunt opened this issue Jun 7, 2024 · 3 comments
Labels
bug-unconfirmed, critical severity (used to report critical severity bugs in llama.cpp, e.g. crashing, corruption, data loss)

Comments


mblunt commented Jun 7, 2024

What happened?

The Docker container is missing a shared object, causing it to fail on launch.

Launch command, taken from the docs (modify paths as needed):
docker run -p 8080:8080 -v /path/to/models:/models --gpus all ghcr.io/ggerganov/llama.cpp:server-cuda -m models/7B/ggml-model.gguf -c 512 --host 0.0.0.0 --port 8080 --n-gpu-layers 99

Name and Version

Version unknown; the container will not launch.

What operating system are you seeing the problem on?

Ubuntu Server, amd64 architecture.

Relevant log output

/server: error while loading shared libraries: libcuda.so.1: cannot open shared object file: No such file or directory
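One way to see exactly which shared libraries the server binary cannot resolve is to override the image entrypoint and run ldd against it (a diagnostic sketch; it assumes ldd is present in the image, which is typical for Ubuntu-based CUDA images):

docker run --rm --gpus all --entrypoint ldd ghcr.io/ggerganov/llama.cpp:server-cuda /server

Any library reported as "not found" (here, libcuda.so.1) is one the container runtime or image is failing to provide.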
mblunt added the bug-unconfirmed and critical severity labels on Jun 7, 2024
slaren (Collaborator) commented Jun 8, 2024

libcuda.so is part of the NVIDIA driver; this is likely caused by a configuration issue.
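To confirm whether the host's container runtime is exposing the driver at all, a quick sanity check outside of llama.cpp is to run nvidia-smi in a plain CUDA base image (a sketch; the image tag is only an example, and it assumes the NVIDIA Container Toolkit is installed on the host):

docker run --rm --gpus all nvidia/cuda:12.3.2-base-ubuntu22.04 nvidia-smi

If that also fails, re-registering the Docker runtime with the toolkit's helper and restarting Docker is the usual remediation:

sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker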


wilksu commented Jun 15, 2024

-e LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/lib/wsl/lib
Add this parameter to the docker run command; then it works.
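Applied to the launch command from the report, this looks roughly as follows (paths unchanged from the comment above; the /usr/lib/wsl/lib entry is only relevant under WSL2):

docker run -p 8080:8080 -v /path/to/models:/models --gpus all \
  -e LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/lib/wsl/lib \
  ghcr.io/ggerganov/llama.cpp:server-cuda \
  -m models/7B/ggml-model.gguf -c 512 --host 0.0.0.0 --port 8080 --n-gpu-layers 99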


toanju commented Jun 23, 2024

-e LD_LIBRARY_PATH=/usr/local/cuda/compat/ sets the correct location.
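The equivalent variant using the compat directory shipped in the CUDA images would be (same command as above, only the LD_LIBRARY_PATH value changes):

docker run -p 8080:8080 -v /path/to/models:/models --gpus all \
  -e LD_LIBRARY_PATH=/usr/local/cuda/compat/ \
  ghcr.io/ggerganov/llama.cpp:server-cuda \
  -m models/7B/ggml-model.gguf -c 512 --host 0.0.0.0 --port 8080 --n-gpu-layers 99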

mblunt closed this as completed on Jun 29, 2024