
fix(ray_utils): ignore re-init error #465

Merged: 1 commit merged into vllm-project:main on Jul 20, 2023
Conversation

@mspronesti (Contributor) commented on Jul 14, 2023

Hi,
this small PR fixes the error Ray raises when creating multiple vllm.LLM objects in the same process:

Exception: Perhaps you called ray.init twice by accident?

@WoosukKwon (Collaborator) commented:
Hi @mspronesti, thanks for submitting the PR! Could you elaborate more on the error or provide a reproducible script that causes the error?

@mspronesti (Contributor, Author) replied:

Hi @WoosukKwon,

An example script that reproduces it:

from vllm import LLM

llm1 = LLM(model="mosaicml/mpt-30b", tensor_parallel_size=4)
# Creating a second engine triggers a second ray.init() call, which raises the error below.
llm2 = LLM(model="mosaicml/mpt-30b", tensor_parallel_size=4)

which raises

RuntimeError: Maybe you called ray.init twice by accident? This error can be suppressed by passing in 'ignore_reinit_error=True' or by calling 'ray.shutdown()' prior to 'ray.init()'.

The same thing happens in a notebook when re-running the cell that creates a single vllm.LLM object (which is how I noticed it).
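
For context, this is roughly what the one-line fix looks like. A minimal sketch, assuming Ray is initialized from a helper in vllm's ray_utils module; the helper name and signature below are illustrative, not the exact upstream code:

from typing import Optional

import ray

def initialize_cluster(ray_address: Optional[str] = None) -> None:
    # ignore_reinit_error=True makes a repeated ray.init() call a no-op
    # instead of raising "Perhaps you called ray.init twice by accident?".
    ray.init(address=ray_address, ignore_reinit_error=True)

With this flag set, creating a second vllm.LLM (or re-running a notebook cell) reuses the already-running Ray runtime instead of failing.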

@zhuohan123 (Collaborator) left a comment:


LGTM! Thanks for your contribution!

@zhuohan123 merged commit 16c3e29 into vllm-project:main on Jul 20, 2023
2 checks passed
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
sjchoi1 pushed a commit to casys-kaist-internal/vllm that referenced this pull request May 7, 2024