
Releases: BerriAI/litellm

v1.72.2.devMCP

13 Jun 23:10

Full Changelog: v1.72.5.dev1...v1.72.2.devMCP

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.72.2.devMCP
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 220.0 | 241.97 | 6.29 | 0.0 | 1883 | 0 | 199.49 | 1258.82 |
| Aggregated | Passed ✅ | 220.0 | 241.97 | 6.29 | 0.0 | 1883 | 0 | 199.49 | 1258.82 |
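Once a release image is running (as in the `docker run` snippet above), the proxy exposes an OpenAI-compatible API; the load tests above exercise its /chat/completions route. A minimal request sketch — the master key (`sk-1234`) and model name (`gpt-4o`) are placeholders for whatever is configured on your proxy:

```shell
# Send a test request to the proxy's OpenAI-compatible endpoint.
# "sk-1234" and "gpt-4o" are placeholders: substitute your proxy's
# master key and a model you have configured on it.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

The same request works against any of the tags below; only the image tag in the `docker run` command changes.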

v1.72.5.dev1

11 Jun 17:59

What's Changed

- fix(internal_user_endpoints.py): support user with + in email on user info + handle empty string for arguments on gemini function calls by @krrishdholakia in #11601
- Fix: passes api_base, api_key, litellm_params_dict to custom_llm embedding methods by @ElefHead in #11450
- Add Admin-Initiated Password Reset Flow by @NANDINI-star in #11618

Full Changelog: v1.72.4-nightly...v1.72.5.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.72.5.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 271.77 | 6.15 | 0.0 | 1841 | 0 | 218.69 | 1399.05 |
| Aggregated | Passed ✅ | 250.0 | 271.77 | 6.15 | 0.0 | 1841 | 0 | 218.69 | 1399.05 |

v1.72.4-nightly

11 Jun 06:36
Commit: 3b7f1d5

Full Changelog: v1.72.3-nightly...v1.72.4-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.72.4-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 180.0 | 204.89 | 6.20 | 0.0 | 1852 | 0 | 168.14 | 1311.19 |
| Aggregated | Passed ✅ | 180.0 | 204.89 | 6.20 | 0.0 | 1852 | 0 | 168.14 | 1311.19 |

v1.72.2-stable

12 Jun 00:47

Full Changelog: v1.72.0.stable...v1.72.2-stable

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.72.2-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 220.71 | 6.25 | 0.0 | 1871 | 0 | 179.77 | 1345.88 |
| Aggregated | Passed ✅ | 200.0 | 220.71 | 6.25 | 0.0 | 1871 | 0 | 179.77 | 1345.88 |

v1.72.3-nightly

10 Jun 22:21

Full Changelog: v1.72.2.rc...v1.72.3-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.72.3-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 230.0 | 244.23 | 6.21 | 0.0 | 1859 | 0 | 195.40 | 1308.96 |
| Aggregated | Passed ✅ | 230.0 | 244.23 | 6.21 | 0.0 | 1859 | 0 | 195.40 | 1308.96 |

v1.72.2.rc

08 Jun 02:02

v1.72.2-nightly

07 Jun 06:04

Full Changelog: v1.72.1.dev8...v1.72.2-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.72.2-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 180.0 | 201.26 | 6.25 | 0.0 | 1869 | 0 | 165.16 | 1316.00 |
| Aggregated | Passed ✅ | 180.0 | 201.26 | 6.25 | 0.0 | 1869 | 0 | 165.16 | 1316.00 |

v1.72.0-stable

07 Jun 04:51

Full Changelog: v1.72.0.rc1...v1.72.0-stable

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.72.0-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 180.0 | 202.73 | 6.25 | 0.0 | 1866 | 0 | 166.82 | 1400.06 |
| Aggregated | Passed ✅ | 180.0 | 202.73 | 6.25 | 0.0 | 1866 | 0 | 166.82 | 1400.06 |

v1.72.2.dev_image

06 Jun 15:39

Full Changelog: v1.72.1.dev8...v1.72.2.dev_image

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.72.2.dev_image
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 266.19 | 6.20 | 0.0 | 1853 | 0 | 215.67 | 1307.98 |
| Aggregated | Passed ✅ | 250.0 | 266.19 | 6.20 | 0.0 | 1853 | 0 | 215.67 | 1307.98 |

v1.72.1.dev8

06 Jun 05:12
Commit: c99daef

Full Changelog: v1.72.1-nightly...v1.72.1.dev8

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.72.1.dev8
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 180.0 | 202.73 | 6.28 | 0.0 | 1881 | 0 | 164.29 | 1311.91 |
| Aggregated | Passed ✅ | 180.0 | 202.73 | 6.28 | 0.0 | 1881 | 0 | 164.29 | 1311.91 |