Description
After restarting the service, the existing session data (events/history) is correctly loaded from storage and the session object is created. However, the restored context is not carried over into the LLM's working memory.
As a result, even though the session exists and the past conversation events are present in the database, the agent cannot recall previously shared information when the user continues the conversation. The LLM behaves as if no prior context exists.
In short: session state persists, but the LLM's working context is not rebuilt from the restored session.
To Reproduce
- Run the service and start a conversation with the agent.
- Provide information to the agent that it should “remember.”
- Restart the service.
- Continue the conversation and ask the agent about previously shared information.
- The agent will not recall the context, even though the session and events were reloaded from storage (a minimal illustration of this behavior follows below).
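The failure mode can be condensed into a small, self-contained simulation. This is an illustration only; `RestartRepro`, `Event`, and `ask` are placeholder names and not the real ADK API. The persisted event list survives the "restart", but the list handed to the model is a fresh, empty one that is never rebuilt from it:

```java
import java.util.ArrayList;
import java.util.List;

public class RestartRepro {

    record Event(String author, String text) {}

    // Stand-in for the persisted session store: it survives the "restart".
    static final List<Event> persistedEvents = new ArrayList<>();

    // Stand-in for one agent turn: only llmContext is visible to the model.
    static String ask(List<Event> llmContext, String userMessage) {
        String reply = llmContext.stream().anyMatch(e -> e.text().contains("blue"))
                ? "Your favorite color is blue."
                : "I don't know your favorite color.";
        persistedEvents.add(new Event("user", userMessage));
        persistedEvents.add(new Event("model", reply));
        return reply;
    }

    public static void main(String[] args) {
        // Before the restart, the model-facing context is the same list as
        // the session events, so the question is answered correctly.
        List<Event> llmContext = persistedEvents;
        ask(llmContext, "My favorite color is blue.");
        System.out.println(ask(llmContext, "What is my favorite color?")); // remembers

        // After the restart, persistedEvents is still intact (the session was
        // reloaded), but the model-facing context starts empty and is never
        // rehydrated from it, so the same question now fails.
        List<Event> freshLlmContext = new ArrayList<>();
        System.out.println(ask(freshLlmContext, "What is my favorite color?")); // forgets
    }
}
```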
Expected behavior
After a service restart, the agent should load the previous session context not only into its session store, but also into the LLM's working memory.
The agent should correctly recall previously shared information without requiring the user to repeat it.
Screenshots
Read the lower messages first. The first two messages are from before the restart; the next two are from after it. After the restart I set the same session object, but the agent no longer remembers what is in the session. It seems that after a restart, all of the session's event objects need to be added back into the LLM context.
Desktop
- OS: Linux
- Java version: 17
- ADK version: 0.3.0
Additional context
This is especially important for long-running conversational agents where persistence and continuity across restarts are expected.
Currently, the session metadata and events are restored, but the conversational context is not re-hydrated into the model's working memory, causing a loss of continuity for the user.
A possible solution could be to automatically re-inject the session history into the LLM context after a service restart, before the first new user message is processed (see the sketch below).
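A minimal sketch of that re-injection step, assuming the application keeps its own model-facing history. `Session` and `Event` here are simplified stand-ins for the corresponding ADK types, and `rehydrate` is a hypothetical helper, not part of the ADK API:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for the persisted session data; not the real ADK types.
record Event(String author, String text) {}
record Session(String id, List<Event> events) {}

public class ContextRehydrator {

    /**
     * Rebuilds the model-facing conversation history from the persisted
     * session events. Intended to run once, right after the session is
     * reloaded from storage and before the first new user message.
     */
    public static List<String> rehydrate(Session restored) {
        List<String> llmContext = new ArrayList<>();
        for (Event event : restored.events()) {
            // Re-inject each stored turn so the model sees the prior
            // conversation again, e.g. "user: My favorite color is blue."
            llmContext.add(event.author() + ": " + event.text());
        }
        return llmContext;
    }
}
```

The key point is that this runs exactly once after the session is reloaded, so the model receives the full prior conversation before answering the first post-restart message.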