Releases: BerriAI/litellm

v1.40.4

06 Jun 05:17

What's Changed

  • feat: clarify slack alerting message by @nibalizer in #4023
  • [Admin UI] Analytics - fix div by 0 error on /model/metrics by @ishaan-jaff in #4021
  • Use DEBUG level for curl command logging by @grav in #2980
  • feat(create_user_button.tsx): allow admin to invite user to proxy via user-email/pwd invite-links by @krrishdholakia in #4028
  • [FIX] Proxy redirect to PROXY_BASE_URL/ui after logging in by @ishaan-jaff in #4027
  • [Feat] Audit Logs for Key, User, ProxyModel CRUD operations by @ishaan-jaff in #4030

New Contributors

Full Changelog: v1.40.3...v1.40.4

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.4
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
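Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. A minimal sketch of a request against it — the model name and virtual key below are placeholders, and the command assumes a proxy running locally with at least one model configured:

```shell
# Assumes the proxy started above is listening on localhost:4000.
# "gpt-3.5-turbo" and "sk-1234" are placeholders — use your configured
# model name and virtual key.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello, proxy!"}]
  }'
```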


Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 74 | 89.44 | 6.45 | 0.0 | 1930 | 0 | 64.38 | 1143.04 |
| Aggregated | Passed ✅ | 74 | 89.44 | 6.45 | 0.0 | 1930 | 0 | 64.38 | 1143.04 |

v1.40.3-stable

05 Jun 19:41
4b3b1e0

What's Changed

New Contributors

Full Changelog: v1.40.3...v1.40.3-stable

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.3-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 140.0 | 166.82 | 6.31 | 0.0 | 1888 | 0 | 109.54 | 2288.33 |
| Aggregated | Passed ✅ | 140.0 | 166.82 | 6.31 | 0.0 | 1888 | 0 | 109.54 | 2288.33 |

v1.40.3

05 Jun 18:30
d22b0a8

What's Changed

Full Changelog: v1.40.2...v1.40.3

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.3
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 130.0 | 168.35 | 6.39 | 0.0 | 1909 | 0 | 109.51 | 8353.56 |
| Aggregated | Passed ✅ | 130.0 | 168.35 | 6.39 | 0.0 | 1909 | 0 | 109.51 | 8353.56 |

v1.40.2-stable

05 Jun 16:40

Full Changelog: v1.40.1.dev4...v1.40.2-stable

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.2-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 100.0 | 135.26 | 6.40 | 0.0 | 1915 | 0 | 82.62 | 2219.89 |
| Aggregated | Passed ✅ | 100.0 | 135.26 | 6.40 | 0.0 | 1915 | 0 | 82.62 | 2219.89 |

v1.40.2

05 Jun 05:54

What's Changed

Full Changelog: v1.40.1...v1.40.2

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.2
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 72 | 86.03 | 6.39 | 0.0 | 1913 | 0 | 61.27 | 896.48 |
| Aggregated | Passed ✅ | 72 | 86.03 | 6.39 | 0.0 | 1913 | 0 | 61.27 | 896.48 |

v1.40.1.dev4

05 Jun 04:52

What's Changed

Full Changelog: v1.40.1...v1.40.1.dev4

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.1.dev4
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 110.0 | 130.50 | 6.43 | 0.0 | 1925 | 0 | 92.76 | 2155.11 |
| Aggregated | Passed ✅ | 110.0 | 130.50 | 6.43 | 0.0 | 1925 | 0 | 92.76 | 2155.11 |

v1.40.1.dev2

04 Jun 16:56

What's Changed

Full Changelog: v1.40.1...v1.40.1.dev2

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.1.dev2
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 140.0 | 177.04 | 6.33 | 0.0 | 1896 | 0 | 114.14 | 1784.03 |
| Aggregated | Passed ✅ | 140.0 | 177.04 | 6.33 | 0.0 | 1896 | 0 | 114.14 | 1784.03 |

v1.40.1

04 Jun 15:42

What's Changed

New Contributors

Full Changelog: v1.40.0...v1.40.1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 120.0 | 139.78 | 6.40 | 0.0 | 1913 | 0 | 95.29 | 1526.22 |
| Aggregated | Passed ✅ | 120.0 | 139.78 | 6.40 | 0.0 | 1913 | 0 | 95.29 | 1526.22 |

v1.40.0

02 Jun 00:25

What's Changed

Full Changelog: v1.39.6...v1.40.0

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.0
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 120.0 | 133.63 | 6.47 | 0.0 | 1936 | 0 | 94.77 | 801.18 |
| Aggregated | Passed ✅ | 120.0 | 133.63 | 6.47 | 0.0 | 1936 | 0 | 94.77 | 801.18 |

v1.39.6

01 Jun 04:21

We're launching team member invites (no SSO required) on v1.39.6 🔥 Invite team members to view LLM usage and spend per service: https://docs.litellm.ai/docs/proxy/ui

👍 [Fix] Cache Vertex AI clients - Major Perf improvement for VertexAI models

✨ Feat - Send new users invite emails on creation (using 'send_invite_email' on /user/new)

💻 UI - allow users to sign in with email/password

🔓 [UI] Admin UI Invite Links for non SSO

✨ PR - [FEAT] Perf improvements - litellm.completion / litellm.acompletion - Cache OpenAI client
(Screenshot: inviting members UI)
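The `send_invite_email` flag mentioned above is passed to the proxy's `/user/new` endpoint. A minimal sketch of such a call — the host, master key, and email address are placeholders, and it assumes a running proxy with email sending configured:

```shell
# Placeholder values: adjust the host, key, and email for your deployment.
curl http://localhost:4000/user/new \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "user_email": "new-user@example.com",
    "send_invite_email": true
  }'
```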

What's Changed

New Contributors

Full Changelog: v1.39.5...v1.39.6

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.39.6
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 78 | 90.38 | 6.55 | 0.0 | 1958 | 0 | 65.34 | 961.40 |
| Aggregated | Passed ✅ | 78 | 90.38 | 6.55 | 0.0 | 1958 | 0 | 65.34 | 961.40 |