
[Bug]: could not find executable path vllm-openai in any version of the vllm/vllm-openai:latest image #15007

Closed
Chennakesavulu5 opened this issue Mar 18, 2025 · 4 comments
Labels
bug Something isn't working

Comments

@Chennakesavulu5

Chennakesavulu5 commented Mar 18, 2025

Your current environment

The output of `python collect_env.py`
Could not find the executable path vllm-openai in any version of the vllm/vllm-openai:latest image:

docker run -it --entrypoint /bin/bash vllm/vllm-openai:latest find / -name vllm-openai

Please fix this, or explain how to get this path. I get the following error while executing it:

docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "vllm-openai": executable file not found in $PATH: unknown.


### 🐛 Describe the bug

Could not find the executable path vllm-openai in any version of the vllm/vllm-openai:latest image:

docker run -it --entrypoint /bin/bash vllm/vllm-openai:latest find / -name vllm-openai

How do I get this path? Please fix it. I get the following error while executing it:

docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "vllm-openai": executable file not found in $PATH: unknown.

### Before submitting a new issue...

- [x] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the [documentation page](https://docs.vllm.ai/en/latest/), which can answer lots of frequently asked questions.
@Chennakesavulu5 Chennakesavulu5 added the bug Something isn't working label Mar 18, 2025
@yankay

yankay commented Mar 18, 2025

Hi @Chennakesavulu5

Would you please provide the steps to reproduce the issue? :-)

@Chennakesavulu5
Author

Hi @yankay

Create a Dockerfile to host the vLLM image:

```dockerfile
FROM vllm/vllm-openai:latest

ENV API_KEY="11611-22722-33833-44944-55055"

RUN apt-get update && apt-get install -y \
    git \
    wget \
    vim \
    && rm -rf /var/lib/apt/lists/*

COPY ./Meta-Llama-3.1-8B-Instruct-Q8_0.gguf /Meta-Llama-3.1-8B-Instruct-Q8_0.gguf
EXPOSE 8000

ENTRYPOINT ["vllm-openai", "--model", "/Meta-Llama-3.1-8B-Instruct-Q8_0.gguf"]
```

Run the image with:

docker run -it --rm -p 8666:8000 --ipc=host --gpus "device=0" --name vllm318b vllm/vllm318b:1

This gives:

docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "vllm-openai": executable file not found in $PATH: unknown

Run the original image to see that the executable is not found on the PATH:

docker run -it --entrypoint /bin/bash vllm/vllm-openai:latest find / -name vllm-openai

Please help me with any suggestion or fix
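
A quick way to see what the upstream image actually configures as its entrypoint is `docker inspect` (a standard Docker CLI command; only the image tag from this thread is assumed):

```bash
# Print the ENTRYPOINT and CMD baked into the upstream image
docker inspect --format '{{json .Config.Entrypoint}} {{json .Config.Cmd}}' vllm/vllm-openai:latest
```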

@DarkLight1337
Member

If you look at the Dockerfiles in the vLLM repo, the ENTRYPOINT command should be ["python3", "-m", "vllm.entrypoints.openai.api_server"] plus extra arguments.
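
Applied to the Dockerfile above, the fix is just the ENTRYPOINT line (a sketch based on the comment above; the model path is the one from this thread):

```dockerfile
# The image ships no `vllm-openai` binary; launch the OpenAI-compatible
# API server module through python3 instead.
ENTRYPOINT ["python3", "-m", "vllm.entrypoints.openai.api_server", "--model", "/Meta-Llama-3.1-8B-Instruct-Q8_0.gguf"]
```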

@Chennakesavulu5
Author

@DarkLight1337, thanks my friend, it worked.
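
For completeness, a sketch of the rebuild-and-run cycle with the corrected ENTRYPOINT (the tag and run flags are the ones used earlier in this thread; adjust as needed):

```bash
# Rebuild the custom image, then run it with the flags from the thread
docker build -t vllm/vllm318b:1 .
docker run -it --rm -p 8666:8000 --ipc=host --gpus "device=0" --name vllm318b vllm/vllm318b:1
```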
