
[Doc][KubeRay] Remove vllm-rayservice.md and use Ray Serve LLM instead #53844


Merged: 1 commit into ray-project:master on Jun 16, 2025

Conversation

@kevin85421 (Member) commented Jun 16, 2025

Why are these changes needed?

https://docs.ray.io/en/master/cluster/kubernetes/examples/rayserve-llm-example.html#kuberay-rayservice-llm-example

Users can now use Ray Serve LLM to run LLM serving workloads, as shown in the KubeRay RayService example linked above. This PR removes the old vLLM / RayService example guide.
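For context, the replacement guide is built around the Ray Serve LLM API. Below is a minimal sketch of what serving a model with Ray Serve LLM looks like in recent Ray releases; the model ID, model source, and autoscaling settings are illustrative placeholders rather than values from this PR, so refer to the linked example for the exact configuration.

```python
# Minimal sketch of an OpenAI-compatible LLM deployment with Ray Serve LLM.
# The model and settings below are placeholders, not taken from this PR.
from ray import serve
from ray.serve.llm import LLMConfig, build_openai_app

llm_config = LLMConfig(
    model_loading_config=dict(
        model_id="qwen-0.5b",                       # name clients use in requests
        model_source="Qwen/Qwen2.5-0.5B-Instruct",  # Hugging Face model to load
    ),
    deployment_config=dict(
        autoscaling_config=dict(min_replicas=1, max_replicas=2),
    ),
    engine_kwargs=dict(tensor_parallel_size=1),     # forwarded to the vLLM engine
)

# Build an OpenAI-compatible Serve app and run it on the Ray cluster.
app = build_openai_app({"llm_configs": [llm_config]})
serve.run(app)
```

On KubeRay, the same application would typically be referenced from a RayService serveConfigV2 import path rather than run with serve.run directly; see the linked example for that setup.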

Related issue number

Checks

  • I've signed off every commit (by using the -s flag, i.e., git commit -s) in this PR.
  • I've run scripts/format.sh to lint the changes in this PR.
  • I've included any doc changes needed for https://docs.ray.io/en/master/.
    • I've added any new APIs to the API Reference. For example, if I added a
      method in Tune, I've added it in doc/source/tune/api/ under the
      corresponding .rst file.
  • I've made sure the tests are passing. Note that there might be a few flaky tests, see the recent failures at https://flakey-tests.ray.io/
  • Testing Strategy
    • Unit tests
    • Release tests
    • This PR is not tested :(

Signed-off-by: Kai-Hsun Chen <kaihsun@anyscale.com>
@kevin85421 (Member Author) commented:

cc @andrewsykim does it make sense to you to remove the old one and use Ray Serve LLM instead?

kevin85421 added the go label (add ONLY when ready to merge, run all tests) on Jun 16, 2025
kevin85421 marked this pull request as ready for review on June 16, 2025 06:20
Copilot AI review was requested due to automatic review settings on June 16, 2025 06:20
kevin85421 requested review from pcmoritz and a team as code owners on June 16, 2025 06:20
Copilot AI (Contributor) left a comment

Pull Request Overview

This PR updates the KubeRay documentation by removing the outdated vLLM-RayService example and aligning the examples index with the new Ray Serve LLM example.

  • Remove the old vLLM-RayService guide.
  • Update the examples index to reference the new Ray Serve LLM example.

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.

Files reviewed:
  • doc/source/cluster/kubernetes/examples/vllm-rayservice.md: Removed the outdated vLLM-RayService example guide.
  • doc/source/cluster/kubernetes/examples.md: Removed the reference to the outdated guide to align with the updated Ray Serve LLM documentation.

@kevin85421 (Member Author) commented:

cc @jjyao @edoakes would you mind merging this PR? Thanks!

edoakes merged commit 739f1ac into ray-project:master on Jun 16, 2025; 5 checks passed.
elliot-barn pushed a commit that referenced this pull request on Jun 18, 2025
rebel-scottlee pushed a commit to rebellions-sw/ray that referenced this pull request on Jun 21, 2025
minerharry pushed a commit to minerharry/ray that referenced this pull request on Jun 27, 2025
elliot-barn pushed a commit that referenced this pull request on Jul 2, 2025
Labels: go (add ONLY when ready to merge, run all tests)
Projects: None yet
4 participants