[Doc][KubeRay] Remove vllm-rayservice.md and use Ray Serve LLM instead #53844
Conversation
cc @andrewsykim does it make sense to you to remove the old one and use Ray Serve LLM instead?
Pull Request Overview
This PR updates the KubeRay documentation by removing the outdated vLLM-RayService example and aligning the examples index with the new Ray Serve LLM example.
- Remove the old vLLM-RayService guide.
- Update the examples index to reference the new Ray Serve LLM example.
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| doc/source/cluster/kubernetes/examples/vllm-rayservice.md | Removed the outdated vLLM-RayService example guide. |
| doc/source/cluster/kubernetes/examples.md | Removed the reference to the outdated guide to align with the updated Ray Serve LLM documentation. |
Why are these changes needed?
https://docs.ray.io/en/master/cluster/kubernetes/examples/rayserve-llm-example.html#kuberay-rayservice-llm-example
Users can now use Ray Serve LLM to run LLM serving workloads, so this PR removes the old vLLM / RayService example guide and points readers to the Ray Serve LLM example instead.
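For readers landing here from the removed guide, the Ray Serve LLM example linked above is the replacement workflow. As a rough orientation only, a minimal Ray Serve LLM deployment looks roughly like the sketch below; the model ID, accelerator settings, and autoscaling bounds are placeholder assumptions, so consult the linked docs for the authoritative RayService configuration.

```python
# Minimal Ray Serve LLM sketch (illustrative values only; see the linked
# Ray docs for the authoritative KubeRay RayService example).
from ray import serve
from ray.serve.llm import LLMConfig, build_openai_app

llm_config = LLMConfig(
    # model_id and model_source are placeholders; use any supported model.
    model_loading_config=dict(
        model_id="qwen-0.5b",
        model_source="Qwen/Qwen2.5-0.5B-Instruct",
    ),
    # Autoscaling bounds are illustrative.
    deployment_config=dict(
        autoscaling_config=dict(min_replicas=1, max_replicas=2),
    ),
    # Engine arguments are passed through to the underlying vLLM engine.
    engine_kwargs=dict(tensor_parallel_size=1),
)

# Build an OpenAI-compatible app and run it with Ray Serve.
app = build_openai_app({"llm_configs": [llm_config]})
serve.run(app, blocking=True)
```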
Related issue number
Checks
- I've signed off every commit (by using the `-s` flag, i.e., `git commit -s`) in this PR.
- I've run `scripts/format.sh` to lint the changes in this PR.
- If I add a method in Tune, I've added it in `doc/source/tune/api/` under the corresponding `.rst` file.