
Add support for vllm >= 0.19.0 #1211

Merged
NathanHB merged 7 commits into main from vllm-0.19-compat
Apr 13, 2026
Conversation

@lewtun
Member

@lewtun lewtun commented Apr 12, 2026

Fixes needed due to refactoring in vllm-project/vllm#35182

This PR also exposed an issue in the MCQ tests: for tied logits, different kernels produce different choices. To handle that, I've added an epsilon threshold and excluded tied logits from the comparisons.
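The tie-handling idea described above can be sketched as follows. This is a hypothetical illustration, not lighteval's actual test code: the function names, the epsilon value, and the comparison shape are all assumptions. The point is that when the top two logits for a sample are within epsilon of each other, different kernels may legitimately pick different choices, so such samples are skipped rather than counted as mismatches.

```python
# Illustrative sketch of epsilon-based tie exclusion in MCQ logit
# comparisons. Names and the epsilon value are hypothetical, not
# lighteval's actual API.

EPSILON = 1e-4  # illustrative tolerance for "tied" logits


def is_tied(logits: list[float], epsilon: float = EPSILON) -> bool:
    """Return True if the top two logits are within epsilon of each other."""
    top_two = sorted(logits, reverse=True)[:2]
    return len(top_two) == 2 and (top_two[0] - top_two[1]) < epsilon


def compare_choices(ref_logits, new_logits, epsilon=EPSILON):
    """Compare argmax choices per sample, skipping samples with tied logits.

    Returns (mismatches, samples_actually_compared).
    """
    mismatches = 0
    compared = 0
    for ref, new in zip(ref_logits, new_logits):
        if is_tied(ref, epsilon) or is_tied(new, epsilon):
            # Tied logits: different kernels may pick different choices,
            # so this sample is excluded from the comparison.
            continue
        compared += 1
        ref_choice = max(range(len(ref)), key=ref.__getitem__)
        new_choice = max(range(len(new)), key=new.__getitem__)
        if ref_choice != new_choice:
            mismatches += 1
    return mismatches, compared
```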

lewtun and others added 2 commits April 12, 2026 17:50
Co-authored-by: OpenAI Codex <codex@openai.com>
Co-authored-by: OpenAI Codex <codex@openai.com>
@bot-ci-comment

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Comment thread src/lighteval/tasks/tasks/hellaswag.py
Comment thread tests/unit/tasks/test_registry.py
Comment thread src/lighteval/models/vllm/vllm_model.py Outdated
Comment thread src/lighteval/models/vllm/vllm_model.py Outdated
Comment thread src/lighteval/models/vllm/vllm_model.py Outdated
Comment thread src/lighteval/models/vllm/vllm_model.py Outdated
lewtun and others added 2 commits April 12, 2026 18:30
Co-authored-by: OpenAI Codex <codex@openai.com>
Co-authored-by: OpenAI Codex <codex@openai.com>
Comment thread src/lighteval/models/vllm/vllm_model.py
Comment thread src/lighteval/models/vllm/vllm_model.py
@lewtun lewtun requested a review from NathanHB April 12, 2026 20:34
Co-authored-by: OpenAI Codex <codex@openai.com>
@lewtun
Member Author

lewtun commented Apr 13, 2026

The failing tests seem unrelated to this PR and are flaky (probably worth fixing in a separate PR).

from vllm.inputs import TokenInputs

# Convert token IDs to TokensPrompt format for vLLM v0.15+
prompts = [TokenInputs(prompt_token_ids=token_ids) for token_ids in inputs]
Member Author


TokenInputs was renamed to TokensInput, but the actual object we need is really TokensPrompt
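The fix the comment points at would look roughly like the sketch below: pre-tokenized prompts are wrapped in vLLM's `TokensPrompt` shape (a TypedDict keyed on `prompt_token_ids`) instead of the renamed `TokenInputs`. Since `TokensPrompt` is a TypedDict, plain dicts of the same shape are used here so the sketch runs without vLLM installed; with vLLM available the import would be `from vllm.inputs import TokensPrompt`. This is an illustration of the shape, not the PR's exact diff.

```python
# Sketch of converting token IDs to the TokensPrompt dict shape used by
# newer vLLM versions. Plain dicts stand in for vllm.inputs.TokensPrompt
# (a TypedDict), so this runs without vLLM installed.

def to_tokens_prompts(inputs: list[list[int]]) -> list[dict]:
    """Wrap pre-tokenized inputs in the TokensPrompt dict shape."""
    return [{"prompt_token_ids": token_ids} for token_ids in inputs]
```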

lewtun and others added 2 commits April 13, 2026 10:37
Co-authored-by: OpenAI Codex <codex@openai.com>
Co-authored-by: OpenAI Codex <codex@openai.com>
@NathanHB NathanHB merged commit 34889df into main Apr 13, 2026
5 checks passed
@lewtun lewtun deleted the vllm-0.19-compat branch April 13, 2026 12:05
dyurchenko98 pushed a commit to dyurchenko98/lighteval that referenced this pull request Apr 14, 2026
* Fix vLLM 0.11 compatibility and restore hellaswag_cf

Co-authored-by: OpenAI Codex <codex@openai.com>

* Support vLLM 0.19 prompt schema

Co-authored-by: OpenAI Codex <codex@openai.com>

* Address vLLM PR review feedback

Co-authored-by: OpenAI Codex <codex@openai.com>

* Remove temporary hellaswag_cf task

Co-authored-by: OpenAI Codex <codex@openai.com>

* Clarify vLLM compatibility branches

Co-authored-by: OpenAI Codex <codex@openai.com>

* Handle tied MCQ logits in slow sample comparisons

Co-authored-by: OpenAI Codex <codex@openai.com>

* Handle flat VLM token outputs in tie checks

Co-authored-by: OpenAI Codex <codex@openai.com>

---------

Co-authored-by: OpenAI Codex <codex@openai.com>
