Closed
Labels: bug (Something isn't working)
Description
Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
When I run an agent with user_prompt=None, providing only message_history, it never calls the LLM and simply returns the last message from the provided message_history.
The official example from https://ai.pydantic.dev/message-history/#summarize-old-messages is affected.
So, running this:
user_prompt = None
summary = await summarize_agent.run(user_prompt, message_history=message_history)
- never calls the LLM
- returns the last message from the history as output
- new_messages() are an empty list
but this is fine:
user_prompt = ""
summary = await summarize_agent.run(user_prompt, message_history=message_history)
- calls the LLM
- returns summarized messages
- new_messages() includes ModelRequest (with "") and ModelResponse (with summary)
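Until this is fixed, a caller-side workaround (my own suggestion, not an official API) is to coerce a None prompt to an empty string before calling run, since the empty-string path does reach the model:

```python
from typing import Optional

def normalized_prompt(user_prompt: Optional[str]) -> str:
    # Coerce None to "" so Agent.run still calls the model.
    # Note: `if not user_prompt` would conflate None and "" -- the bug
    # hinges on exactly that distinction, so check identity explicitly.
    return user_prompt if user_prompt is not None else ""

# usage sketch:
# summary = await summarize_agent.run(
#     normalized_prompt(user_prompt), message_history=message_history
# )
```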
Example Code
from pydantic_ai import Agent
from pydantic_ai.messages import (
    ModelMessage,
    ModelRequest,
    ModelResponse,
    TextPart,
    UserPromptPart,
)
summarize_agent = Agent(
"openai:gpt-5-mini",
instructions="""
Summarize this conversation to include all important facts about the user and
what their interactions were about.
""",
)
message_history: list[ModelMessage] = [
ModelRequest(parts=[UserPromptPart(content="Hi, my name is James")]),
ModelResponse(parts=[TextPart(content="Nice to meet you, James.")]),
ModelRequest(parts=[UserPromptPart(content="I like cars")]),
ModelResponse(parts=[TextPart(content="I like them too. Sport cars?")]),
ModelRequest(parts=[UserPromptPart(content="No, cars in general.")]),
ModelResponse(parts=[TextPart(content="Awesome. Which one do you like most?")]),
ModelRequest(parts=[UserPromptPart(content="Fiat 126p")]),
ModelResponse(parts=[TextPart(content="That's an old one, isn't it?")]),
ModelRequest(parts=[UserPromptPart(content="Yes, it is. My parents had one.")]),
ModelResponse(parts=[TextPart(content="Cool. Was it fast?")]),
]
user_prompt = None
summary = await summarize_agent.run(user_prompt, message_history=message_history)
print(summary.output) # returns the last message from history: "Cool. Was it fast?"
print(summary.new_messages()) # returns []

Python, Pydantic AI & LLM client version
pydantic-ai: 1.21.0
LLM: gpt-5-mini