Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
lewtun
commented
Apr 12, 2026
Member
Author
The failing tests seem unrelated to this PR and are flaky (probably worth fixing in a separate PR).
lewtun
commented
Apr 13, 2026
```python
from vllm.inputs import TokenInputs

# Convert token IDs to TokensPrompt format for vLLM v0.15+
prompts = [TokenInputs(prompt_token_ids=token_ids) for token_ids in inputs]
```
Member
Author
TokenInputs was renamed to TokensInput, but the object we actually need here is TokensPrompt.
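A minimal sketch of the corrected conversion using vLLM's TokensPrompt. To keep the example runnable without vLLM installed, TokensPrompt is stubbed locally as the TypedDict it is in vLLM; in real code you would import it from vllm.inputs. The inputs list here is a hypothetical batch of tokenized prompts:

```python
from typing import TypedDict


# Local stand-in for the real class; in actual code use:
#   from vllm.inputs import TokensPrompt
class TokensPrompt(TypedDict):
    prompt_token_ids: list[int]


# Hypothetical batch of already-tokenized prompts.
inputs = [[1, 2, 3], [4, 5]]

# Wrap each token-ID list in a TokensPrompt so vLLM treats it as
# a pre-tokenized prompt rather than raw text.
prompts = [TokensPrompt(prompt_token_ids=token_ids) for token_ids in inputs]
```

Since TypedDict instances are plain dicts at runtime, each entry is just `{"prompt_token_ids": [...]}`.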
NathanHB
approved these changes
Apr 13, 2026
dyurchenko98 pushed a commit to dyurchenko98/lighteval that referenced this pull request on Apr 14, 2026:
* Fix vLLM 0.11 compatibility and restore hellaswag_cf
* Support vLLM 0.19 prompt schema
* Address vLLM PR review feedback
* Remove temporary hellaswag_cf task
* Clarify vLLM compatibility branches
* Handle tied MCQ logits in slow sample comparisons
* Handle flat VLM token outputs in tie checks

Co-authored-by: OpenAI Codex <codex@openai.com>
Fixes needed due to refactoring in vllm-project/vllm#35182
This PR also exposed an issue in the MCQ tests: when logits are tied, different kernels can produce different choices. To handle that, I've added an epsilon threshold and excluded tied logits from the comparisons.
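The tie-exclusion idea can be sketched as follows. This is an illustrative sketch, not the PR's actual implementation: the function name, the epsilon value, and the sample data are all hypothetical.

```python
import math

EPS = 1e-5  # hypothetical tolerance; the PR's actual threshold may differ


def is_tied(logits: list[float], eps: float = EPS) -> bool:
    """Return True when the top two logits are within eps of each other,
    i.e. the argmax choice is kernel-dependent and should be skipped."""
    top_two = sorted(logits, reverse=True)[:2]
    return len(top_two) == 2 and math.isclose(top_two[0], top_two[1], abs_tol=eps)


# Hypothetical per-sample MCQ logits: only compare samples whose
# top choice is unambiguous.
samples = [[0.9, 0.1], [0.5, 0.5], [0.3, 0.3 + 1e-7, 0.1]]
comparable = [logits for logits in samples if not is_tied(logits)]
```

With the data above, the tied samples are dropped and only the unambiguous one survives in `comparable`.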