Compatibility issue with vLLM 0.3.2 #704
Comments
I'm also having this issue. Are there any concerns with downgrading vLLM as a workaround?
vLLM is working to integrate Outlines (vllm-project/vllm#2819). Once that integration lands, hopefully this kind of error won't pop up and the two projects will stay compatible. To resolve this in the meantime, we can simply convert the FSM's `vocabulary` from a `dict_values` view into a plain list before it gets deep-copied; see the sketch below.
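A minimal sketch of that change, assuming the processor exposes its FSM vocabulary as an `fsm.vocabulary` attribute (attribute names taken from the PR description further down this thread):

```python
def make_fsm_picklable(fsm):
    """Convert the FSM's vocabulary view into a plain list.

    vLLM deep-copies SamplingParams (including any logits processors), which
    pickles their attributes; a dict_values view cannot be pickled, but a
    list of the same tokens can.
    """
    if not isinstance(fsm.vocabulary, list):
        fsm.vocabulary = list(fsm.vocabulary)
    return fsm
```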
Good first PR for anyone looking to start contributing!
For CFGFSM, we might choose a hacky solution for now that resolves this issue, since trying to deepcopy a CFG-backed FSM would fail on its unpicklable parser state anyway.
The hacky solution would be to construct a brand-new FSM from the original grammar and tokenizer instead of copying the existing one; a rough sketch follows below.
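A rough sketch of that reconstruct-instead-of-deepcopy idea; the constructor signature below is an assumption for illustration, not the actual `CFGFSM` API:

```python
class CFGFSM:
    """Illustrative stand-in for a grammar-guided FSM (not the real class)."""

    def __init__(self, cfg_string: str, tokenizer):
        self.cfg_string = cfg_string
        self.tokenizer = tokenizer
        # Parser / automaton construction would happen here; those internals
        # are typically not pickleable, which is what breaks copy.deepcopy.

    def copy(self) -> "CFGFSM":
        # Workaround: rebuild a fresh FSM from the original inputs rather
        # than deep-copying unpicklable parser state.
        return CFGFSM(self.cfg_string, self.tokenizer)
```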
I've encountered this issue as well and have attempted a fix here: #711. @p-usefulbird Can you test and see if that fixes it?
When integrating Outlines with vLLM I faced the following issues, which are fixed in this PR:

1. When calling `vllm.LLM.generate`, the internals of vLLM make a `copy.deepcopy` of the vLLM `SamplingParams`, which includes the logits processor from Outlines (`RegexLogitsProcessor`, say). This requires everything to be pickleable, and `RegexLogitsProcessor.fsm.vocabulary` is a `dict_values` object, which doesn't satisfy that. The fix is easy: just convert it to a list. This doesn't affect how the `vocabulary` variable is used in the code.
2. The `RegexLogitsProcessor` takes an `llm` argument, which the docstring states should be a `vllm.LLM` object, but it then attempts to extract the underlying tokenizer via `llm.tokenizer.tokenizer`. The tokenizer of `vllm.LLM` currently lives in the `vllm.LLM.llm_engine.tokenizer.tokenizer` attribute, but this is a big mess and isn't backwards compatible with previous vLLM versions. Instead, vLLM has a convenience method, `vllm.LLM.get_tokenizer`, which fetches the tokenizer. To remain backwards compatible, in case people have supplied `vllm.LLM.llm_engine` directly to `RegexLogitsProcessor`, it falls back to a `tokenizer` or `tokenizer.tokenizer` attribute; a sketch of this lookup follows below.

I also updated the vLLM example script, as that was outdated as well (it used the previous `_patched_apply_logits_processors`).

Closes #704
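A sketch of the tokenizer lookup described in point 2 above; `vllm.LLM.get_tokenizer` is the convenience method mentioned in the PR text, while the helper name and exact fallback logic here are illustrative:

```python
def resolve_tokenizer(llm):
    """Fetch the underlying tokenizer from whatever object was passed in.

    Prefers vLLM's `get_tokenizer` convenience method, then falls back to
    `tokenizer` / `tokenizer.tokenizer` attributes for callers that pass an
    engine (or a tokenizer group) directly.
    """
    if hasattr(llm, "get_tokenizer"):
        return llm.get_tokenizer()
    if hasattr(llm, "tokenizer"):
        inner = llm.tokenizer
        return getattr(inner, "tokenizer", inner)
    raise ValueError(
        "Expected a `vllm.LLM` instance or an object with a `tokenizer` attribute."
    )
```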
Describe the issue as clearly as possible:
A recent merge into vLLM broke the integration with Outlines. The `copy.deepcopy` introduced there fails on Outlines' logits processors because of the nested dictionary views they hold, throwing the following error:
TypeError: cannot pickle 'dict_values' object
vllm-project/vllm#2881
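For context, the underlying Python limitation can be reproduced without vLLM or Outlines at all (illustrative snippet, not part of the original report):

```python
import copy

vocabulary = {"a": 1, "b": 2}.values()  # a dict_values view, like fsm.vocabulary

try:
    copy.deepcopy(vocabulary)  # deepcopy pickles the view internally
except TypeError as exc:
    print(exc)  # prints: cannot pickle 'dict_values' object

copy.deepcopy(list(vocabulary))  # converting to a list deep-copies fine
```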
Steps/code to reproduce the bug:
Expected result:
Error message:
Outlines/Python version information:
Context for the issue:
No response