
Releases: BerriAI/litellm

v1.72.6-stable

19 Jun 20:08
56aaaf7

What's Changed

Full Changelog: v1.72.6.post1-nightly...v1.72.6-stable

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.72.6-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
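Once the container is running, the proxy serves an OpenAI-compatible API on port 4000. As a minimal sketch of calling the /chat/completions endpoint exercised by the load tests below (the model alias `gpt-3.5-turbo` and the virtual key `sk-1234` are placeholders; substitute whatever your proxy config defines):

```python
import json
import urllib.request

# Placeholder values: adjust the model alias and key to match your proxy config.
PROXY_URL = "http://localhost:4000/chat/completions"
payload = {
    "model": "gpt-3.5-turbo",  # must match a model configured on the proxy
    "messages": [{"role": "user", "content": "Hello from LiteLLM!"}],
}

req = urllib.request.Request(
    PROXY_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-1234",  # placeholder virtual key
    },
)
# Uncomment to send the request against a running proxy:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The commented-out `urlopen` call would send the request to a live proxy; the same payload works with curl or an OpenAI SDK pointed at http://localhost:4000.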

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 269.28 | 6.11 | 0.0 | 1828 | 0 | 215.86 | 1630.93 |
| Aggregated | Passed ✅ | 250.0 | 269.28 | 6.11 | 0.0 | 1828 | 0 | 215.86 | 1630.93 |

v1.72.6.post1-nightly

18 Jun 04:41

Full Changelog: v1.72.6.dev1...v1.72.6.post1-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.72.6.post1-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 190.0 | 209.80 | 6.28 | 0.0 | 1878 | 0 | 167.48 | 1487.48 |
| Aggregated | Passed ✅ | 190.0 | 209.80 | 6.28 | 0.0 | 1878 | 0 | 167.48 | 1487.48 |

v1.72.6.devSCIM

18 Jun 17:44

What's Changed


Full Changelog: v1.72.6.dev1...v1.72.6.devSCIM

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.72.6.devSCIM
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 190.0 | 215.17 | 6.26 | 0.0 | 1873 | 0 | 171.29 | 1800.79 |
| Aggregated | Passed ✅ | 190.0 | 215.17 | 6.26 | 0.0 | 1873 | 0 | 171.29 | 1800.79 |

v1.72.6.SCIM2

18 Jun 21:19

Full Changelog: v1.72.6.devSCIM...v1.72.6.SCIM2

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.72.6.SCIM2
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 190.0 | 213.46 | 6.19 | 0.0 | 1852 | 0 | 171.36 | 1296.01 |
| Aggregated | Passed ✅ | 190.0 | 213.46 | 6.19 | 0.0 | 1852 | 0 | 171.36 | 1296.01 |

v1.72.6.rc

15 Jun 02:04

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.72.6.rc
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 190.0 | 206.20 | 6.29 | 0.0 | 1883 | 0 | 168.78 | 1487.59 |
| Aggregated | Passed ✅ | 190.0 | 206.20 | 6.29 | 0.0 | 1883 | 0 | 168.78 | 1487.59 |


v1.72.6.dev1

15 Jun 01:08

What's Changed


Full Changelog: v1.72.6-nightly...v1.72.6.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.72.6.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 240.0 | 263.48 | 6.12 | 0.0 | 1832 | 0 | 214.17 | 1761.22 |
| Aggregated | Passed ✅ | 240.0 | 263.48 | 6.12 | 0.0 | 1832 | 0 | 214.17 | 1761.22 |

v1.72.6-nightly

14 Jun 22:31

What's Changed

Full Changelog: 1.72.6.rc-draft...v1.72.6-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.72.6-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 265.04 | 6.15 | 0.0 | 1839 | 0 | 214.54 | 1687.95 |
| Aggregated | Passed ✅ | 250.0 | 265.04 | 6.15 | 0.0 | 1839 | 0 | 214.54 | 1687.95 |

[DRAFT] 1.72.6.rc

14 Jun 15:44
Pre-release

What's Changed


Full Changelog: https://github.com/BerriAI/litellm/compare/v1.72.2-stable......


v1.72.2.devMCP

13 Jun 23:10

What's Changed


Full Changelog: v1.72.5.dev1...v1.72.2.devMCP

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.72.2.devMCP
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 220.0 | 241.97 | 6.29 | 0.0 | 1883 | 0 | 199.49 | 1258.82 |
| Aggregated | Passed ✅ | 220.0 | 241.97 | 6.29 | 0.0 | 1883 | 0 | 199.49 | 1258.82 |

v1.72.5.dev1

11 Jun 17:59

What's Changed

- fix(internal_user_endpoints.py): support user with + in email on user info + handle empty string for arguments on gemini function calls by @krrishdholakia in #11601
- Fix: passes api_base, api_key, litellm_params_dict to custom_llm embedding methods by @ElefHead in #11450
- Add Admin-Initiated Password Reset Flow by @NANDINI-star in #11618


Full Changelog: v1.72.4-nightly...v1.72.5.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.72.5.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 271.77 | 6.15 | 0.0 | 1841 | 0 | 218.69 | 1399.05 |
| Aggregated | Passed ✅ | 250.0 | 271.77 | 6.15 | 0.0 | 1841 | 0 | 218.69 | 1399.05 |