Description
I have already reviewed the PR at #10463, but this issue still persists.
docker: `vllm/vllm-openai:latest`

parameters:
```
--model /data/models/mistralai/Mistral-Large-Instruct-2411 --disable-log-requests --trust-remote-code --enforce-eager --enable-auto-tool-choice --tool-call-parser mistral --tokenizer-mode mistral --config-format mistral
```
code:
```python
from openai import OpenAI

client = OpenAI(base_url="http://xxx/v1", api_key="sk-xxx")
response = client.chat.completions.create(
    model="/data/models/mistralai/Mistral-Large-Instruct-2411",
    messages=[{"role": "user", "content": "hello", "name": "bob"}],
)
print(response.choices[0].message)
```
error:
```
openai.BadRequestError: Error code: 400 - {'error': {'message': "1 validation error for ChatCompletionRequest\nmessages.0.user.name\n  Extra inputs are not permitted [type=extra_forbidden, input_value='bob', input_type=str]\n    For further information visit https://errors.pydantic.dev/2.10/v/extra_forbidden (request id: 2025020619014024915328971379754)", 'type': 'upstream_error', 'param': '400', 'code': 'bad_response_status_code'}}
```
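As the validation error shows, the request schema used in `--tokenizer-mode mistral` rejects the optional OpenAI `name` field on messages (`extra_forbidden`). Until that is fixed server-side, a client-side workaround is to drop unsupported fields before sending. This is a minimal sketch; the helper `strip_unsupported_fields` is our own name, not part of any library:

```python
# Workaround sketch: the Mistral-mode request validation forbids the
# OpenAI "name" field on messages, so strip such fields client-side
# before calling chat.completions.create.

def strip_unsupported_fields(messages, forbidden=("name",)):
    """Return a copy of the messages list without fields the
    server-side request schema rejects (e.g. "name")."""
    return [
        {k: v for k, v in message.items() if k not in forbidden}
        for message in messages
    ]

messages = [{"role": "user", "content": "hello", "name": "bob"}]
cleaned = strip_unsupported_fields(messages)
# cleaned is [{"role": "user", "content": "hello"}] and can be passed
# to client.chat.completions.create(...) without the 400 error
```

Note this loses the `name` information entirely; if the speaker identity matters, it would have to be folded into the message content instead.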