[Bug]: could not find executable path vllm-openai in any version of the vllm/vllm-openai:latest image
### 🐛 Describe the bug
Could not find the executable `vllm-openai` in any version of the `vllm/vllm-openai:latest` image. Running `docker run -it --entrypoint /bin/bash vllm/vllm-openai:latest find / -name vllm-openai` turns up nothing. How can I get this path, or can this be fixed? Executing the image directly fails with: `docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "vllm-openai": executable file not found in $PATH: unknown.`
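Checking `$PATH` directly inside the container (a quick sketch of the same check, rather than scanning the whole filesystem with `find /`) should confirm the same thing:

```bash
# Look for the binary on PATH inside the official image.
docker run --rm --entrypoint /bin/bash vllm/vllm-openai:latest \
  -c 'command -v vllm-openai || echo "vllm-openai is not on PATH"'
```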
### Before submitting a new issue...
- [x] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the [documentation page](https://docs.vllm.ai/en/latest/), which can answer lots of frequently asked questions.
Run the image with `docker run -it --rm -p 8666:8000 --ipc=host --gpus "device=0" --name vllm318b vllm/vllm318b:1` and it gives:

`docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "vllm-openai": executable file not found in $PATH: unknown`

Run the original image to see that the path is not found: `docker run -it --entrypoint /bin/bash vllm/vllm-openai:latest find / -name vllm-openai`
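One way to narrow this down (a sketch, assuming both tags are available locally; `vllm/vllm318b:1` is the derived tag from the command above) is to compare what each image is actually configured to run:

```bash
# Print the configured entrypoint and command of each image.
docker inspect --format '{{json .Config.Entrypoint}} {{json .Config.Cmd}}' vllm/vllm-openai:latest
docker inspect --format '{{json .Config.Entrypoint}} {{json .Config.Cmd}}' vllm/vllm318b:1
```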
If you look at the Dockerfiles in the vLLM repo, the `ENTRYPOINT` should be `["python3", "-m", "vllm.entrypoints.openai.api_server"]`, plus any extra arguments. The server is started as a Python module, which is why `find` does not locate a standalone `vllm-openai` binary.
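So if a derived image has lost that entrypoint, one workaround (a sketch reusing the ports and device selection from the reproduction above; the model name is a placeholder) is to pass it explicitly at run time:

```bash
# Override the missing entrypoint and start the OpenAI-compatible server as a module.
docker run -it --rm -p 8666:8000 --ipc=host --gpus "device=0" \
  --entrypoint python3 vllm/vllm318b:1 \
  -m vllm.entrypoints.openai.api_server --model <your-model>
```

Alternatively, the Dockerfile that builds the derived image can re-declare `ENTRYPOINT ["python3", "-m", "vllm.entrypoints.openai.api_server"]` so the tag behaves like the upstream `vllm/vllm-openai` image.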