Describe the Bug
When an ADK app uses google.adk.models.lite_llm.LiteLlm as the root agent model, adk web loads the app successfully but the dev builder endpoint crashes when rendering the graph.
The failing endpoint is:
GET /dev/build_graph/{app_name}
The crash happens because LiteLlm contains an internal llm_client: LiteLLMClient field, and ADK's builder serialization path calls model_dump() on the model object. That includes the LiteLLMClient instance, which Pydantic cannot serialize.
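The failure can be reproduced outside ADK with a minimal Pydantic model that carries an arbitrary, non-serializable field; `FakeModel` and `OpaqueClient` below are illustrative stand-ins, not ADK types:

```python
from pydantic import BaseModel, ConfigDict, Field
from pydantic_core import PydanticSerializationError


class OpaqueClient:
    """Stand-in for a non-serializable client object like LiteLLMClient."""


class FakeModel(BaseModel):
    # Arbitrary types must be allowed for a plain-object field to exist at all.
    model_config = ConfigDict(arbitrary_types_allowed=True)

    model: str = "openrouter/openai/gpt-5.4-mini"
    llm_client: OpaqueClient = Field(default_factory=OpaqueClient)


m = FakeModel()

# mode="python" keeps the raw object in the output dict (Pydantic only warns
# here), matching the local repro shown later in this report.
dumped = m.model_dump(mode="python", exclude_none=True)
print(type(dumped["llm_client"]))

# Serializing to JSON is where the hard failure occurs.
try:
    m.model_dump_json()
except PydanticSerializationError as exc:
    print(type(exc).__name__)  # PydanticSerializationError
```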
Steps to Reproduce
- Install google-adk with LiteLLM support.
- Create a minimal agent like this:

```python
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

root_agent = Agent(
    name="root_agent",
    model=LiteLlm(model="openrouter/openai/gpt-5.4-mini"),
    instruction="You are a helpful assistant.",
)
```
- Run `adk web`.
- Open the dev UI or request: http://127.0.0.1:8000/dev/build_graph/src
Expected Behavior
/dev/build_graph/{app_name} should return the serialized graph JSON and the dev UI should render normally.
Observed Behavior
`/dev/build_graph/{app_name}` returns `500 Internal Server Error`.

Traceback:

```
pydantic_core._pydantic_core.PydanticSerializationError: Unable to serialize unknown type: <class 'google.adk.models.lite_llm.LiteLLMClient'>
```
In my local repro, `LiteLlm.model_dump(mode="python", exclude_none=True)` includes:

```python
{
    "model": "openrouter/openai/gpt-5.4-mini",
    "llm_client": <google.adk.models.lite_llm.LiteLLMClient object at ...>,
}
```
Root Cause
The builder endpoint in `google/adk/cli/adk_web_server.py` serializes agent fields like this:

```python
elif hasattr(value, "model_dump"):
    agent_dict[field_name] = value.model_dump(mode="python", exclude_none=True)
```

For `LiteLlm`, that dump includes `llm_client`, which is not serializable.
`LiteLlm` currently defines:

```python
llm_client: LiteLLMClient = Field(default_factory=LiteLLMClient)
```

so the model's shape itself is not safe for this serialization path.
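A defensive variant of that serializer could probe each field for JSON-serializability and degrade gracefully instead of crashing. This is only a sketch of one possible fallback, with stand-in names (`safe_dump`, `DemoModel`, `OpaqueClient`), not ADK code:

```python
import json
from typing import Any

from pydantic import BaseModel, ConfigDict, Field


class OpaqueClient:
    """Stand-in for a non-serializable field value such as LiteLLMClient."""


class DemoModel(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)

    model: str = "openrouter/openai/gpt-5.4-mini"
    llm_client: OpaqueClient = Field(default_factory=OpaqueClient)


def safe_dump(value: Any) -> Any:
    """Dump a Pydantic model field by field, replacing values that cannot
    be JSON-encoded with their repr() instead of raising."""
    if not isinstance(value, BaseModel):
        return value
    result = {}
    for name in type(value).model_fields:
        field_value = getattr(value, name)
        if isinstance(field_value, BaseModel):
            result[name] = safe_dump(field_value)  # recurse into nested models
            continue
        try:
            json.dumps(field_value)  # probe JSON-serializability
            result[name] = field_value
        except TypeError:
            result[name] = repr(field_value)  # degrade instead of crashing
    return result


print(safe_dump(DemoModel()))
```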
Environment Details
- ADK Library Version: 1.31.0
- Desktop OS: Windows 11
- Python Version: 3.12
Model Information
- Are you using LiteLLM: Yes
- Which model is being used: `openrouter/openai/gpt-5.4-mini`
Additional Context
The app itself imports and loads correctly. The failure is specific to the ADK Web builder serialization path.
As a local workaround, excluding `llm_client` from Pydantic serialization fixes the issue, e.g. by marking the field `exclude=True` in a subclass of `LiteLlm`.

That suggests the fix likely belongs in either:

- `LiteLlm` itself (exclude `llm_client` from serialization), or
- the `/dev/build_graph` serializer (special-case model objects / use a safer fallback path for non-serializable fields).
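The subclass workaround can be sketched with generic Pydantic stand-ins (`FakeLlm` mimics the relevant shape of `LiteLlm`; nothing here imports ADK):

```python
from pydantic import BaseModel, ConfigDict, Field


class FakeClient:
    """Stand-in for LiteLLMClient."""


class FakeLlm(BaseModel):
    """Mimics LiteLlm's shape: a model string plus a non-serializable client."""

    model_config = ConfigDict(arbitrary_types_allowed=True)

    model: str = "openrouter/openai/gpt-5.4-mini"
    llm_client: FakeClient = Field(default_factory=FakeClient)


class SafeLlm(FakeLlm):
    """Redeclares llm_client with exclude=True so model_dump() skips it."""

    llm_client: FakeClient = Field(default_factory=FakeClient, exclude=True)


print(SafeLlm().model_dump(mode="python", exclude_none=True))
# {'model': 'openrouter/openai/gpt-5.4-mini'}
print(SafeLlm().model_dump_json())  # JSON serialization no longer crashes
```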