
[FT] Upgrade the VLLM dependency to 0.10.2+ #1002

@JIElite

Description


Issue encountered

Some of the newest LLM models are only supported by newer versions of vLLM. Currently, lighteval only supports vLLM >= 0.10.0 and < 0.10.2.

Solution/Feature

I think the main cause of this issue is that vLLM >= 0.10.2 uses a different interface for LLM inference (more specifically, llm.generate). vLLM >= 0.10.2 no longer supports the keyword argument prompt_token_ids used in case1 and case2.

We can see that there are multiple overloads of llm.generate in vLLM==0.10.1.1 on this page: Link, which also indicates that prompt_token_ids will be deprecated.

[Screenshot: vLLM 0.10.1.1 documentation for llm.generate, showing the prompt_token_ids deprecation note]

However, in vLLM==0.10.2 there is only a single interface in the official documentation: Link, which accepts inputs as PromptType. To the best of my knowledge, the inputs to the _generate function (ref.) are assumed to be already tokenized, so all we need to do is wrap the original inputs in TokensPrompt (ref.), which contains the attribute prompt_token_ids: list[int].
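For illustration, here is a minimal sketch of that wrapping, assuming vLLM >= 0.10.2 is installed; the model name and token ids are placeholders, not lighteval's actual values:

from vllm import LLM, SamplingParams
from vllm.inputs import TokensPrompt

llm = LLM(model="facebook/opt-125m")  # placeholder model, just for illustration
sampling_params = SamplingParams(max_tokens=32)

# Pre-tokenized prompts (list[list[int]]), e.g. produced by the model's tokenizer.
inputs = [[2, 100, 101, 102], [2, 200, 201]]

# Wrap each token list in TokensPrompt and pass it as the prompts argument,
# instead of the removed prompt_token_ids keyword argument.
prompts = [TokensPrompt(prompt_token_ids=input_ids) for input_ids in inputs]
outputs = llm.generate(prompts, sampling_params)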

The solution

  1. Insert one line of code before L418:
inputs = [TokensPrompt(prompt_token_ids=input_ids) for input_ids in inputs]
  2. Remove the keyword argument prompt_token_ids at L440 and L458.

  3. Update pyproject.toml to remove the <0.10.2 upper bound on the vllm dependency.

Done.
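Putting steps 1 and 2 together, a hedged sketch of what the patched call could look like is shown below; the helper name and variable names (generate_from_token_ids, llm, inputs, sampling_params) are illustrative and may differ from the actual code around L418, L440, and L458:

from vllm import LLM, SamplingParams
from vllm.inputs import TokensPrompt

def generate_from_token_ids(llm: LLM, inputs: list[list[int]], sampling_params: SamplingParams):
    # Before (vLLM < 0.10.2), token ids could be passed via a keyword argument:
    #   llm.generate(prompt_token_ids=inputs, sampling_params=sampling_params)
    # After (vLLM >= 0.10.2), wrap the token ids in TokensPrompt and pass them as prompts:
    inputs = [TokensPrompt(prompt_token_ids=input_ids) for input_ids in inputs]
    return llm.generate(inputs, sampling_params=sampling_params)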

Possible alternatives

A clear and concise description of any alternative solutions or features you've considered.
