Description
Describe the bug
When using PlanReActPlanner, the first round of question and answer works as expected, with the final answer being generated after the /FINAL_ANSWER/ tag. However, in the subsequent turn of the conversation, the planner frequently only outputs the /FINAL_ANSWER/ tag without the accompanying final response.
Upon inspecting the source code, it appears that the final answer from the first turn is removed from the conversation history before being passed to the LLM in the second turn. This likely causes the LLM to imitate the pattern of the previous turn, resulting in the output of only the /FINAL_ANSWER/ tag without the actual answer.
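To illustrate the suspected mechanism, here is a minimal, hypothetical sketch of what trimming a turn at the tag would do to the history. This is not the actual ADK source; the tag string and the `trim_for_history` helper are illustrative only:

```python
# Hypothetical sketch (not ADK code): if the planner writes the first turn back
# into history with everything after the final-answer tag dropped, the model
# only ever sees turns that *end* at the tag, and may imitate that pattern in
# the next turn, emitting the tag with no answer after it.
FINAL_ANSWER_TAG = "/FINAL_ANSWER/"  # tag spelling taken from this report

def trim_for_history(model_output: str) -> str:
    """Keep text up to and including the tag, mimicking the suspected bug."""
    idx = model_output.find(FINAL_ANSWER_TAG)
    if idx == -1:
        return model_output
    return model_output[: idx + len(FINAL_ANSWER_TAG)]

turn1 = "/PLANNING/ Look up the docs.\n/FINAL_ANSWER/ The answer is X."
history_entry = trim_for_history(turn1)
# The stored turn now ends at the bare tag; the actual answer is gone.
assert history_entry.endswith(FINAL_ANSWER_TAG)
```

If this matches what the planner does, the second-turn prompt contains a worked example of "stop right after `/FINAL_ANSWER/`", which would explain the truncated output.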
To Reproduce
```python
root_agent = LlmAgent(
    model=LiteLlm(
        model=f"hosted_vllm/{MODEL_NAME}",
        api_base=BASE_URL,
        # Pass authentication headers if needed
        extra_headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
        temperature=0.3,
        top_k=50,
        top_p=0.95,
        chat_template_kwargs={
            "enable_thinking": False
        },
    ),
    name="你的智慧好夥伴",  # "Your smart companion"
    description=ROLE,
    static_instruction=SYSTEM_PROMPT5,
    planner=PlanReActPlanner(),
    tools=[mcp_tools],  # mcp_tools only includes a tool that retrieves documents from Milvus
)
```

Then run `adk web --port 8888`.
Expected behavior
The full answer to the question should be generated after the /FINAL_ANSWER/ tag.
Screenshots
The first screenshot shows that the conversation ends with only /FINAL_ANSWER/.

The second screenshot shows that the final answer from the first turn is removed from the conversation history before being passed to the LLM in the second turn.

The third screenshot shows that, per the LLM log, the model did generate the final answer after the /FINAL_ANSWER/ tag in the first turn.

Desktop (please complete the following information):
- OS: Linux
- Python version (python -V): 3.14.0
- ADK version (pip show google-adk): 1.18.0
Model Information:
- Are you using LiteLLM: Yes
- Which model is being used: Qwen3-30B-A3B-Instruct-2507-FP8
Additional context
- This behavior happens every time, even when I try other models.