I was referred to this project by:
open-webui/open-webui#11874 (comment)
It might make my configuration a lot easier; however, it seems LiteLLM has the same issue as OpenWebUI:
```yaml
- model_name: openai/codex-mini-latest
  litellm_params:
    model: openai/codex-mini-latest
    api_key: "sk-"
```
I set this up following the guide at https://docs.litellm.ai/docs/tutorials/openweb_ui.
I guess LiteLLM currently cannot proxy/route chat/completions requests to the new Responses API, correct? Or am I just missing a configuration parameter?
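For context, here is a minimal sketch of the request I am making through the proxy. The `base_url` and `api_key` are placeholders for my local setup, not values from the docs. As far as I understand, this plain chat/completions call would need to be translated to the Responses API for codex-mini-latest:

```python
from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM proxy
# (base_url and api_key are placeholders for my local setup).
client = OpenAI(
    base_url="http://localhost:4000/v1",
    api_key="sk-1234",
)

# A plain chat/completions request; codex-mini-latest only supports the
# Responses API upstream, so the proxy would need to translate this call.
response = client.chat.completions.create(
    model="openai/codex-mini-latest",
    messages=[{"role": "user", "content": "Write a hello-world in Python."}],
)
print(response.choices[0].message.content)
```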
One other thing I noticed is that when I do:
```yaml
- model_name: openai/*
  litellm_params:
    model: openai/*
    api_key: "sk-"
```
then codex-mini-latest does not even show up in the /models list, so there might be a separate issue there.
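This is how I check the list (again a sketch; the proxy endpoint and virtual key are placeholders from my setup). With the wildcard config above, codex-mini-latest is missing from the returned IDs:

```python
from openai import OpenAI

# Same placeholder proxy endpoint and virtual key as above.
client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-1234")

# List the model IDs the proxy exposes; with the openai/* wildcard config,
# openai/codex-mini-latest does not appear here.
model_ids = [m.id for m in client.models.list().data]
print(model_ids)
print("openai/codex-mini-latest" in model_ids)
```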
Thanks for any help. If this is in fact missing and you would welcome the feature, I might try to send a PR (if there isn't already one).