Warning when i use gemini via LiteLLM #5589
Replies: 2 comments
Hi, thanks for the question! Using Gemini through LiteLLM works and is fine for dev/POC, but there are some tradeoffs to be aware of for production.

**Why the warning exists.** ADK's native Gemini integration talks directly to the Generative Language API, while LiteLLM normalizes responses to the OpenAI format first, which can silently drop information that has no equivalent in that format.

**For your use case (org-wide cost tracking via a LiteLLM gateway).** That's a valid constraint. Practical options:
```python
# If you can, use ADK's native Gemini model class instead of the LiteLLM
# wrapper. Import paths assume the google-adk package; verify them against
# your installed version:
# from google.adk.models.lite_llm import LiteLlm
# from google.adk.models import Gemini

# Replace:
LiteLlm(model='gemini-2.5-pro')
# With:
Gemini(model='gemini-2.5-pro')
```
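To make the normalization concern concrete, here is a toy sketch (invented field names, not LiteLLM's actual code) of how squeezing a provider response into a fixed OpenAI-style schema can silently discard fields the target schema has no slot for:

```python
# Toy illustration only: a hypothetical Gemini-style response being converted
# to an OpenAI-style chat-completion shape. Field names are made up for the
# example and do not reflect LiteLLM's real implementation.
def normalize_to_openai(gemini_response: dict) -> dict:
    candidate = gemini_response["candidates"][0]
    return {
        "choices": [
            {
                "message": {
                    "role": "assistant",
                    "content": candidate["text"],
                },
                "finish_reason": candidate.get("finish_reason"),
            }
        ],
        # Anything without an OpenAI slot (e.g. safety or grounding metadata)
        # is simply never copied, so downstream callers never see it.
    }

raw = {
    "candidates": [
        {"text": "Hello!", "finish_reason": "stop", "safety_ratings": ["..."]}
    ]
}
normalized = normalize_to_openai(raw)
# "safety_ratings" is present in raw but absent from normalized.
```

The point is not that any particular field is lost, but that lossy conversion happens before your code ever sees the response, so there is no error to catch.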
Bottom line: LiteLLM will work, but validate streaming, throughput, and structured output behavior carefully before production. If you hit issues, switching to the native Gemini integration is the first thing to try.
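One cheap pre-production check for the streaming caveat above: collect the streamed deltas and compare the joined text against a non-streaming call. The helper below assumes dict chunks in the OpenAI chat-completion stream shape (`chunk["choices"][0]["delta"]`, the format LiteLLM normalizes to); adjust the access pattern if your client yields objects instead.

```python
def collect_stream(chunks):
    """Join the text deltas of an OpenAI-format streaming response.

    Assumes each chunk is a dict shaped like an OpenAI chat-completion
    stream event; adapt the field access to whatever your client returns.
    """
    parts = []
    for chunk in chunks:
        text = chunk["choices"][0]["delta"].get("content")
        if text:
            parts.append(text)
    return "".join(parts)

# Fake chunks standing in for a real stream, just to show the shape:
fake_stream = [
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}}]},
    {"choices": [{"delta": {}}]},  # final chunks often carry an empty delta
]
```

In a real check, replace `fake_stream` with the chunks from your LiteLLM streaming call and assert the joined text matches the non-streaming response for the same prompt.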
To add to the above: if your org requires LiteLLM for cost tracking, the warning is safe to suppress. Just make sure to test streaming and tool calls thoroughly before production, as those are the areas most likely to behave differently through the LiteLLM abstraction layer.
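If you do decide to suppress it, and assuming the warning is emitted through Python's `warnings` module (if it arrives through `logging` instead, attach a logging filter), a message-pattern filter looks like this. The regex is a placeholder; match it to the exact warning text you see in your logs.

```python
import warnings

def suppress_litellm_warning():
    # Placeholder pattern: replace with the exact warning text from your logs.
    warnings.filterwarnings(
        "ignore",
        message=r".*LiteLLM.*",
        category=UserWarning,
    )
```

Scope the filter as narrowly as possible so you don't hide unrelated warnings from the same category.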
Hello,
I was doing a POC and noticed the warning below when using Gemini models via LiteLLM.
I wanted to understand more about this: what is the downside of using Gemini via LiteLLM? My org is standardizing cost tracking, so we need to access all LLMs via the LiteLLM gateway only.