Description
Describe the bug
I am encountering a `litellm.UnsupportedParamsError` when using the `perplexity/sonar` model via LiteLLM. The error message indicates that the `sonar` model does not support the `tools` parameter, and the underlying litellm library suggests setting `litellm.drop_params=True` to resolve this. However, the current implementation of `google.adk.models.lite_llm.LiteLlm` does not provide a direct way to pass this `drop_params` configuration to the underlying `litellm.completion` or `litellm.acompletion` calls on a per-instance basis.
Error Message
```
An error occurred: litellm.UnsupportedParamsError: perplexity does not support parameters: ['tools'], for model=sonar. To drop these, set `litellm.drop_params=True` or for proxy:

`litellm_settings:
 drop_params: true`

If you want to use these params dynamically send allowed_openai_params=['tools'] in your request.
```
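Until ADK exposes this, the workarounds I can see are all at the litellm level. A minimal sketch (the per-call `drop_params` option is my reading of the litellm docs, so treat it as an assumption):

```python
import litellm

# Option 1: the global workaround suggested by the error message.
# This affects every litellm call in the process, which is what I
# would like to avoid.
litellm.drop_params = True

# Option 2 (assumed from litellm's docs): pass drop_params per call.
# ADK's LiteLlm currently offers no way to forward this flag.
# response = litellm.completion(
#     model="perplexity/sonar",
#     messages=[{"role": "user", "content": "hi"}],
#     drop_params=True,
# )
```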
To Reproduce
Steps to reproduce the behavior:
- Install `litellm`
- Create an agent that uses the `perplexity/sonar` model via `LiteLlm` and registers at least one tool
- Run the agent; the error above is raised because the `tools` parameter is sent to a model that does not support it
Expected behavior
I expect to be able to configure `drop_params` when initializing `LiteLlm` from `google.adk.models.lite_llm`, similar to how other LiteLLM-specific parameters can be passed. This would allow me to gracefully handle models that do not support certain parameters (like `tools`) without having to set `litellm.drop_params = True` globally.
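To illustrate the behavior I'm asking for, here is a rough sketch. `LiteLlmSketch`, `fake_completion`, and the `drop_params` constructor argument are all hypothetical stand-ins, not the real ADK or litellm API; the stub lets the sketch run without network access or API keys:

```python
def fake_completion(model, messages, **kwargs):
    # Stand-in for litellm.completion: echoes back the model and the
    # extra kwargs so we can inspect what would be forwarded.
    return {"model": model, "kwargs": kwargs}

class LiteLlmSketch:
    """Hypothetical wrapper mirroring google.adk.models.lite_llm.LiteLlm,
    extended with a per-instance drop_params option (not the real API)."""

    def __init__(self, model, drop_params=False):
        self.model = model
        self.drop_params = drop_params

    def complete(self, messages, **kwargs):
        # Forward the flag per call instead of mutating global litellm state.
        if self.drop_params:
            kwargs["drop_params"] = True
        return fake_completion(self.model, messages, **kwargs)

llm = LiteLlmSketch("perplexity/sonar", drop_params=True)
resp = llm.complete([{"role": "user", "content": "hi"}], tools=[])
print(resp["kwargs"]["drop_params"])  # → True
```

With something like this, unsupported parameters such as `tools` would be dropped only for the instances that need it.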
Desktop (please complete the following information):
- OS: macOS
- Python version (`python -V`): 3.13.5
- ADK version (`pip show google-adk`): 1.5.0
- litellm version: 1.73.6
Model Information:
perplexity/sonar