
"Encrypted content is not supported with this model" for gpt-5-chat-latest #3326

@andrzej-pomirski-yohana

Description


I am trying to use the Responses API to run my model, hoping to squeeze some extra performance out of it.

My container code (using dependency-injector for Python) is:

from dependency_injector.containers import DeclarativeContainer
from dependency_injector.providers import Singleton
from pydantic_ai.models.openai import OpenAIResponsesModel
from pydantic_ai.providers.litellm import LiteLLMProvider

class ExampleContainer(DeclarativeContainer):
    llm_provider = Singleton(LiteLLMProvider, api_base=llm_base, api_key=llm_key)
    model = Singleton(
        OpenAIResponsesModel,
        model_name="openai/gpt-5-chat-latest",
        provider=llm_provider,
    )

This results in:

pydantic_ai.exceptions.ModelHTTPError: status_code: 400, model_name: openai/gpt-5-chat-latest, body: {'message': 'litellm.BadRequestError: OpenAIException - {\n  "error": {\n    "message": "Encrypted content is not supported with this model.",\n    "type": "invalid_request_error",\n    "param": "include",\n    "code": null\n  }\n}No fallback model group found for original model_group=openai/gpt-5-chat-latest. Fallbacks=[]. Received Model Group=openai/gpt-5-chat-latest\nAvailable Model Group Fallbacks=None\nError doing the fallback: litellm.BadRequestError: OpenAIException - {\n  "error": {\n    "message": "Encrypted content is not supported with this model.",\n    "type": "invalid_request_error",\n    "param": "include",\n    "code": null\n  }\n}No fallback model group found for original model_group=openai/gpt-5-chat-latest. Fallbacks=[] LiteLLM Retried: 1 times, LiteLLM Max Retries: 2', 'type': None, 'param': None, 'code': '400'}

However, if I pass the following profile to OpenAIResponsesModel, the model works:

        profile=replace(
            OpenAIModelProfile.from_profile(openai_model_profile("gpt-5-chat-latest")),
            openai_supports_encrypted_reasoning_content=False,
        ),

The same code (without the custom profile) also works for gpt-5, so I assume the issue is that gpt-5-chat-latest does not support the encrypted reasoning content field.

Python, Pydantic AI & LLM client version

pydantic-ai-slim 1.9.1, Python 3.12, OpenAI library 2.7.0, LiteLLM server 1.78.5.
