Setting max_tokens for openai's o3-mini model throws 400 error #1205

@barapa

Initial Checks

  • I confirm that I'm using the latest version of Pydantic AI

Description

Setting max_tokens when using OpenAI's o3-mini model causes the request to fail with a 400 error. The API response is:

{
"message": "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.",
"type": "invalid_request_error",
"param": "max_tokens",
"code": "unsupported_parameter"
}
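Until this is handled upstream, one workaround is to rename the parameter before the request is sent, as the error message suggests. A minimal sketch, assuming the affected models can be identified by prefix (the helper name and the prefix list are assumptions for illustration, not part of Pydantic AI or the OpenAI SDK):

```python
# Assumption: OpenAI's reasoning models (o1*, o3*) reject 'max_tokens'
# and expect 'max_completion_tokens' instead.
REASONING_MODEL_PREFIXES = ("o1", "o3")


def adapt_token_param(model: str, params: dict) -> dict:
    """Return a copy of params with 'max_tokens' renamed to
    'max_completion_tokens' when targeting a reasoning model."""
    if "max_tokens" in params and model.startswith(REASONING_MODEL_PREFIXES):
        params = dict(params)  # avoid mutating the caller's dict
        params["max_completion_tokens"] = params.pop("max_tokens")
    return params


# The rewritten parameters can then be passed to the chat completions call.
print(adapt_token_param("o3-mini", {"max_tokens": 100}))
# {'max_completion_tokens': 100}
print(adapt_token_param("gpt-4o", {"max_tokens": 100}))
# {'max_tokens': 100}
```

Non-reasoning models keep max_tokens unchanged, so the helper can be applied unconditionally to every request.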

Example Code

Python, Pydantic AI & LLM client version

Name: pydantic-ai
Version: 0.0.42
Location: /Users/br/development/projects/legaide-ai/.venv/lib/python3.12/site-packages
Requires: pydantic-ai-slim
Required-by: legaide-ai

Python 3.12.9
