[BUG] vLLMWrapper prompt_logprobs kwarg should not be a boolean #3011

@albertbou92

Description

Describe the bug

I might be missing something here because the SOTA code works fine, but in theory the vllm.SamplingParams prompt_logprobs parameter cannot be a boolean, as it is defined here in the vLLMWrapper class:
https://github.com/pytorch/rl/blob/main/torchrl/modules/llm/policies/vllm_wrapper.py#L213

Instead, prompt_logprobs should default to None and be set to 1 if generate is not provided.

Seen in the vllm docs: https://docs.vllm.ai/en/v0.7.0/api/inference_params.html
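For what it's worth, a minimal sketch of what I would expect, based on the SamplingParams documentation (prompt_logprobs is an Optional[int] counting log-probs per prompt token). The generate variable below is just a stand-in for the wrapper's generate argument, not the actual vLLMWrapper code:

```python
from vllm import SamplingParams

# In vllm.SamplingParams, prompt_logprobs is Optional[int]:
# None disables prompt log-probs; an integer k asks vLLM to return
# the top-k log-probs for each prompt token.
generate = False  # hypothetical stand-in for vLLMWrapper's `generate` argument

sampling_params = SamplingParams(
    # None when generating, 1 when we only want log-probs of the prompt
    prompt_logprobs=None if generate else 1,
)
print(sampling_params.prompt_logprobs)  # 1 in log-prob-only mode
```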

Checklist

  • I have checked that there is no similar issue in the repo (required)
  • I have read the documentation (required)
  • I have provided a minimal working example to reproduce the bug (required)

Labels

bug (Something isn't working)