[BUG] structured_output with LiteLLM Proxy raises ValueError: Model does not support response_format #862

Description

@ranajoy-dutta

Checks

  • I have updated to the latest minor and patch version of Strands
  • I have checked the documentation and this is not expected behavior
  • I have searched the existing issues and there are no duplicates of my issue

Strands Version

1.8.0

Python Version

3.11.13

Operating System

macOS 15.6.1

Installation Method

pip

Steps to Reproduce

When using Agent.structured_output(...) with a LiteLLMModel that points to a LiteLLM proxy, the call fails with:
ValueError: Model does not support response_format
This happens even though the underlying model (Claude 3.7 Sonnet via the LiteLLM proxy) supports JSON / structured outputs.

Code to Reproduce

from pydantic import BaseModel, Field
from strands import Agent
from strands.models.litellm import LiteLLMModel

class BookAnalysis(BaseModel):
    """Analyze a book's key information."""
    title: str = Field(description="The book's title")
    author: str = Field(description="The book's author")
    genre: str = Field(description="Primary genre or category")
    summary: str = Field(description="Brief summary of the book")
    rating: int = Field(description="Rating from 1-10", ge=1, le=10)

model = LiteLLMModel(
    client_args={
        "api_key": "DUMMY_API_KEY",         # replaced for repro
        "api_base": "https://<your-litellm-proxy-url>",
        "use_litellm_proxy": True
    },
    model_id="claude-3-7-sonnet-20250219"
)

agent = Agent(model=model)

result = agent.structured_output(
    BookAnalysis,
    """
    Analyze this book: "The Hitchhiker's Guide to the Galaxy" by Douglas Adams.
    It's a science fiction comedy about Arthur Dent's adventures through space
    after Earth is destroyed. It's widely considered a classic of humorous sci-fi.
    """
)

Expected Behavior

  • Agent.structured_output(...) should pass the Pydantic model schema to LiteLLM as response_format (see the sketch after this list).
  • LiteLLM proxy should return JSON that conforms to the schema.
  • The response should be parsed back into the provided Pydantic model.
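
For comparison, this is roughly the direct LiteLLM call that works against the same proxy (per Additional Context below). It reuses BookAnalysis from the repro; the proxy URL and key are placeholders, and the exact kwargs for proxy routing are my assumption, not a confirmed minimal repro:

import litellm

# Sketch of the equivalent direct call, which works per Additional Context.
# api_base / api_key are placeholders; litellm.completion accepts a Pydantic
# model class as response_format.
response = litellm.completion(
    model="claude-3-7-sonnet-20250219",
    api_base="https://<your-litellm-proxy-url>",
    api_key="DUMMY_API_KEY",
    messages=[{"role": "user", "content": "Analyze this book: The Hitchhiker's Guide to the Galaxy ..."}],
    response_format=BookAnalysis,
)
print(response.choices[0].message.content)  # JSON conforming to the schema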

Actual Behavior

Traceback

  File "example.py", line 30, in <module>
    result = agent.structured_output(BookAnalysis, "Analyze this book ...")
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  ...
  File ".../strands/models/litellm.py", line 215, in structured_output
    raise ValueError("Model does not support response_format")
ValueError: Model does not support response_format

Additional Context

  • Works fine when calling litellm.completion(..., response_format=BookAnalysis) directly against the same proxy (see the sketch under Expected Behavior above).
  • Works fine with Strands' LiteLLMModel when not going through a proxy (use_litellm_proxy not set).
  • Fails only when wrapped inside Agent.structured_output(...) with a proxy-backed LiteLLMModel.
  • Possibly related to how LiteLLMModel checks response_format support; see the probe sketch after this list.
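
If that check delegates to LiteLLM's own model-capability lookup (an assumption on my part; I have only seen the raise at strands/models/litellm.py line 215, not the condition above it), the mismatch could be probed directly. litellm's supports_response_schema helper is keyed on the model string, and a bare proxy model alias may not resolve in its capability map while a provider-qualified name does:

from litellm.utils import supports_response_schema

# Hypothesis probe: the bare model string sent through the proxy may not
# resolve in litellm's capability map, so the check would return False ...
print(supports_response_schema(model="claude-3-7-sonnet-20250219"))

# ... while the provider-qualified lookup for the same model returns True.
print(supports_response_schema(
    model="claude-3-7-sonnet-20250219",
    custom_llm_provider="anthropic",
))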

Possible Solution

No response

Related Issues

No response
