Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
When using FallbackModel, the fallback is only triggered if the error occurs during the model request itself.
However, if the model returns an invalid or empty response and pydantic_ai raises UnexpectedModelBehavior during response handling inside the agent’s graph, the fallback is not used.
This makes FallbackModel less useful, since many real-world failures happen at the parsing/response-validation stage, not just at request time.
Example Code
from pydantic_ai.exceptions import UnexpectedModelBehavior
from pydantic_ai.models import Model
from pydantic_ai.models.bedrock import BedrockConverseModel, BedrockModelProfile
from pydantic_ai.models.fallback import FallbackModel
from pydantic_ai.providers.bedrock import BedrockProvider
from pydantic_ai.settings import ModelSettings


def get_fallback_model() -> Model:
    return BedrockConverseModel(
        model_name="meta.llama3-3-70b-instruct-v1:0",
        provider=BedrockProvider(region_name="us-east-2"),
        profile=BedrockModelProfile(bedrock_supports_tool_choice=False),
    )


def get_model(model_name: str, **settings) -> Model:
    return FallbackModel(
        BedrockConverseModel(
            model_name=model_name,
            provider=BedrockProvider(region_name="us-west-2"),
            profile=BedrockModelProfile(bedrock_supports_tool_choice=False),
            settings=ModelSettings(**settings),
        ),
        get_fallback_model(),
        fallback_on=(UnexpectedModelBehavior,),
    )
Running this sometimes raises:
pydantic_ai.exceptions.UnexpectedModelBehavior: Received empty model response
But instead of falling back to the second model, the exception is propagated.

Python, Pydantic AI & LLM client version
Python: 3.13.3
PydanticAI: Latest
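Until fallback also covers response-handling errors, one workaround is to move the fallback one level up and retry the whole agent call with the next model. The sketch below is library-agnostic and illustrative only: the exception class and the model callables are stand-ins, not pydantic_ai APIs.

```python
class UnexpectedModelBehavior(Exception):
    """Stand-in for pydantic_ai.exceptions.UnexpectedModelBehavior."""


def run_with_fallback(prompt, models):
    """Try each model in order, moving on when one fails at the
    request *or* the response-handling stage."""
    last_exc = None
    for model in models:
        try:
            return model(prompt)
        except UnexpectedModelBehavior as exc:
            # Response-stage failure: remember it and try the next model.
            last_exc = exc
    raise last_exc


# Illustrative model callables: the first fails the way the issue
# describes, the second succeeds.
def primary(prompt):
    raise UnexpectedModelBehavior("Received empty model response")


def fallback(prompt):
    return f"fallback answer to: {prompt}"
```

With pydantic_ai this would mean constructing one agent per model and looping over them, instead of relying on FallbackModel inside a single agent.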