Closed
Description
What happened?
PR #16766 was merged overnight, and models started using the Responses API instead of the Completions API. The bridge does not work automatically in many cases, and structured outputs also failed for these requests.
Can we have an env var that disables auto-bridging completions to responses? Our current workaround is setting LITELLM_LOCAL_MODEL_COST_MAP="True".
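A minimal sketch of applying the workaround above (the variable name comes from this report; why it suppresses the bridge is the reporter's observation, not documented behavior):

```shell
# Workaround reported above: set this before starting the LiteLLM proxy.
# (Its effect on the completions -> responses bridge is the observed
# behavior from this report, not documented behavior.)
export LITELLM_LOCAL_MODEL_COST_MAP="True"
```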
Relevant log output
litellm.ContextWindowExceededError: litellm.BadRequestError: ContextWindowExceededError: OpenAIException - {
"error": {
"message": "Invalid 'metadata.schema_dict_json': string too long. Expected a string with maximum length 512, but got a string with length 1203 instead.",
"type": "invalid_request_error",
"param": "metadata.schema_dict_json",
"code": "string_above_max_length"
}
}
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.80.0
Twitter / LinkedIn details
No response
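For context on the log above: the upstream API rejected the request because a metadata value exceeded the 512-character cap stated in the error (a 1203-character schema_dict_json). A hedged pre-flight check along these lines could surface the problem before the request is sent; the function name and constant here are illustrative, not part of LiteLLM:

```python
# Sketch of a pre-flight check mirroring the limit reported in the log
# above (string metadata values capped at 512 characters). The name
# `oversized_metadata_keys` is hypothetical, not a LiteLLM API.
MAX_METADATA_VALUE_LEN = 512

def oversized_metadata_keys(metadata: dict) -> list[str]:
    """Return keys whose string values exceed the 512-char limit."""
    return [k for k, v in metadata.items()
            if isinstance(v, str) and len(v) > MAX_METADATA_VALUE_LEN]

# Example mirroring the failing request: a 1203-character schema string.
bad = {"schema_dict_json": "x" * 1203, "ok": "short"}
print(oversized_metadata_keys(bad))  # -> ['schema_dict_json']
```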