Update dependency litellm to v1.60.8 #2028
Merged
This PR contains the following updates:
`1.59.7` -> `1.60.8`

Release Notes
BerriAI/litellm (litellm)
v1.60.8

What's Changed
- `/cache/ping` + add timeout value and elapsed time on azure + http calls by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8377
- `/bedrock/invoke` support for all Anthropic models by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8383

Full Changelog: BerriAI/litellm@v1.60.6...v1.60.8
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
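The "Docker Run LiteLLM Proxy" heading above lost its command in extraction. As a sketch only (assuming the published `ghcr.io/berriai/litellm` image tag for this release and the proxy's default port 4000), a self-hosted proxy can be started with:

```shell
# Run the LiteLLM proxy for this release (image tag and port are assumptions).
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.8
```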
Load Test LiteLLM Proxy Results
v1.60.6 (Compare Source)
What's Changed
- `choices=[]` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8339
- `choices=[]` on llm responses by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8342

New Contributors
Full Changelog: BerriAI/litellm@v1.60.5...v1.60.6
v1.60.5 (Compare Source)
What's Changed
- `BaseLLMHTTPHandler` class by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8290

New Contributors
Full Changelog: BerriAI/litellm@v1.60.4...v1.60.5
v1.60.4 (Compare Source)
What's Changed
- `bedrock/nova` models + add util `litellm.supports_tool_choice` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8264
- `role` based access to proxy by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8260

Full Changelog: BerriAI/litellm@v1.60.2...v1.60.4
v1.60.2 (Compare Source)
What's Changed
- `sso_user_id` to LiteLLM_UserTable by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8167
- `/vertex_ai/` was not detected as llm_api_route on pass through but `vertex-ai` was by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8186
- `mode` as list, fix valid keys error in pydantic, add more testing by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8224

New Contributors
Full Changelog: BerriAI/litellm@v1.60.0...v1.60.2
v1.60.0

What's Changed
Important Changes between v1.50.xx and v1.60.0
- `def async_log_stream_event` and `def log_stream_event` are no longer supported for `CustomLogger`s (https://docs.litellm.ai/docs/observability/custom_callback). If you want to log stream events, use `def async_log_success_event` and `def log_success_event` to log successful stream events.

Known Issues
🚨 Detected issue with Langfuse Logging when Langfuse credentials are stored in DB
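The breaking change above (removal of the stream-event hooks) amounts to moving stream logging into the success hooks. A minimal sketch, using a local stand-in base class rather than the real import (actual code would subclass litellm's `CustomLogger` from `litellm.integrations.custom_logger`):

```python
# Stand-in for litellm's CustomLogger base class (assumption: defined
# locally here so the sketch is self-contained).
class CustomLogger:
    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        pass

    async def async_log_success_event(self, kwargs, response_obj, start_time, end_time):
        pass


class MyLogger(CustomLogger):
    """Logs completed streams via the success hooks, since
    log_stream_event / async_log_stream_event are no longer called."""

    def __init__(self):
        self.events = []

    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        # kwargs carries the original call parameters; "stream" flags a
        # streaming request whose final response is being logged.
        if kwargs.get("stream"):
            self.events.append(("sync", response_obj))

    async def async_log_success_event(self, kwargs, response_obj, start_time, end_time):
        if kwargs.get("stream"):
            self.events.append(("async", response_obj))
```

The hook names and `(kwargs, response_obj, start_time, end_time)` signature follow the custom-callback docs linked above; the event-collecting body is illustrative.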
- `bedrock` models + show `end_user` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8118
- key `Team.team_alias === "Default Team"` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8122
- `LoggingCallbackManager` to append callbacks and ensure no duplicate callbacks are added by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8112
- `litellm.disable_no_log_param` param by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8134
- `litellm.turn_off_message_logging=True` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8156

New Contributors
Full Changelog: BerriAI/litellm@v1.59.10...v1.60.0
v1.59.10 (Compare Source)
What's Changed
- `model` param by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8105
- `bedrock/converse_like/<model>` route by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8102

Full Changelog: BerriAI/litellm@v1.59.9...v1.59.10
v1.59.9 (Compare Source)
What's Changed
- `metadata` param preview support + new `x-litellm-timeout` request header by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8047

New Contributors
Full Changelog: BerriAI/litellm@v1.59.8...v1.59.9
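The `x-litellm-timeout` request header introduced in v1.59.9 is passed per request to the proxy. A hedged sketch of building such a request (the proxy URL and API key are placeholders, and the header value being seconds is an assumption from the feature's naming):

```python
from urllib.request import Request

# Hypothetical proxy endpoint and key; adjust to your deployment.
PROXY_URL = "http://localhost:4000/v1/chat/completions"

req = Request(
    PROXY_URL,
    headers={
        "Authorization": "Bearer sk-1234",  # placeholder proxy key
        "x-litellm-timeout": "30",          # assumed: per-request timeout in seconds
    },
    method="POST",
)
```

Sending `req` with `urllib.request.urlopen` (body omitted here) would apply the timeout to that call only, rather than configuring a global timeout on the proxy.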
v1.59.8 (Compare Source)
What's Changed
- `LANGFUSE_FLUSH_INTERVAL` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8007

Full Changelog: BerriAI/litellm@v1.59.7...v1.59.8
Configuration
📅 Schedule: Branch creation - "every weekend" in timezone US/Eastern, Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.