
litellm backend params / model options improvements #151

@jakelorocco


Based on a Slack thread; all relevant info is duplicated here.

I think litellm is dropping too many model_options parameters, which makes it difficult to work with. I believe this comes from our use of get_supported_openai_params, which only returns the supported OpenAI params, so it will reject params that are supported by the provider/endpoint but that are not OpenAI params. The drop_params=True setting may also contribute.
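
As a rough illustration of the suspected behavior (a sketch only, not the backend's actual code), whitelisting model_options against litellm.get_supported_openai_params would silently drop provider-specific keys:

```python
# Sketch of the suspected filtering, assuming model_options are whitelisted
# against litellm's OpenAI param list; not the backend's actual implementation.
import litellm

model_options = {
    "temperature": 0.7,
    "api_key": "my-rits-key",                           # provider/endpoint-specific
    "extra_headers": {"RITS_API_KEY": "my-rits-key"},   # provider/endpoint-specific
}

supported = litellm.get_supported_openai_params(model="gpt-4o")
filtered = {k: v for k, v in model_options.items() if k in supported}

# filtered == {"temperature": 0.7}; api_key and extra_headers never reach the request
print(filtered)
```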

For example, when I try to target RITS with LiteLLM and set the appropriate model options, my request still fails because it drops these keys:

"api_key": rits_apikey, "extra_headers": {"RITS_API_KEY": rits_apikey}

I think there are also some improvements that could be made to the backend init:

  • Allow an api_key to be passed in; even though litellm handles it as a model option, this is unintuitive compared to how the other backends operate (see the sketch below).
  • I believe the default model_id should actually use ollama_chat.
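
A minimal sketch of what both changes could look like, assuming a LiteLLM backend class along these lines (the class name, parameter names, and default model are illustrative, not the project's current API):

```python
# Hypothetical sketch of the proposed init; names and defaults here are
# assumptions for illustration, not the existing backend code.
class LiteLLMBackend:
    def __init__(
        self,
        model_id: str = "ollama_chat/llama3",  # assumed default using ollama_chat
        api_key: str | None = None,            # explicit, like the other backends
        model_options: dict | None = None,
    ):
        self.model_id = model_id
        self.model_options = dict(model_options or {})
        if api_key is not None:
            # litellm.completion() accepts api_key as a per-request kwarg,
            # so it can simply be threaded into the options dict here.
            self.model_options.setdefault("api_key", api_key)
```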
