
Releases: BerriAI/litellm

v1.74.4-nightly

17 Jul 03:38
b2080ec

What's Changed

New Contributors

Full Changelog: v1.74.3.rc.1...v1.74.4-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.74.4-nightly
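
Once the container is up, the proxy serves an OpenAI-compatible /chat/completions route on port 4000 (the same endpoint exercised by the load tests below). A minimal smoke-test request, as a sketch: the model name gpt-4o and the virtual key sk-1234 are placeholders for a model and key you have already configured on your proxy.

curl http://localhost:4000/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer sk-1234" \
-d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello from LiteLLM Proxy"}]}'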

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 240.0 | 258.63325774080073 | 6.1786141049802525 | 0.0 | 1848 | 0 | 211.92541800002118 | 1368.992559999981 |
| Aggregated | Passed ✅ | 240.0 | 258.63325774080073 | 6.1786141049802525 | 0.0 | 1848 | 0 | 211.92541800002118 | 1368.992559999981 |

v1.74.3.rc.3

16 Jul 15:52

v1.74.3.dev2

16 Jul 15:53

What's Changed

New Contributors

Full Changelog: v1.74.3.rc.1...v1.74.3.dev2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.74.3.dev2

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 190.0 | 206.89425634609142 | 6.19933434941609 | 0.0 | 1855 | 0 | 168.97698900004343 | 1646.9904610000299 |
| Aggregated | Passed ✅ | 190.0 | 206.89425634609142 | 6.19933434941609 | 0.0 | 1855 | 0 | 168.97698900004343 | 1646.9904610000299 |

v1.74.3.rc.2

13 Jul 02:27

Full Changelog: v1.74.3.rc.1...v1.74.3.rc.2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.74.3.rc.2

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 240.0 | 261.64451627750555 | 6.164255932341009 | 0.0 | 1845 | 0 | 216.88029500000994 | 1499.7778569999696 |
| Aggregated | Passed ✅ | 240.0 | 261.64451627750555 | 6.164255932341009 | 0.0 | 1845 | 0 | 216.88029500000994 | 1499.7778569999696 |

v1.74.3.rc.1

13 Jul 01:08

What's Changed

New Contributors

Full Changelog: v1.74.3-stable-draft...v1.74.3.rc.1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.74.3.rc.1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 230.0 | 255.52282259223816 | 6.28594793060673 | 0.0 | 1881 | 0 | 206.60606100000223 | 1164.8403710000252 |
| Aggregated | Passed ✅ | 230.0 | 255.52282259223816 | 6.28594793060673 | 0.0 | 1881 | 0 | 206.60606100000223 | 1164.8403710000252 |

v1.74.3-nightly

12 Jul 20:59

What's Changed

New Contributors

Full Changelog: v1.74.2-nightly...v1.74.3-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.74.3-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 210.0 | 397.6239629079091 | 5.915434271857456 | 0.0 | 1770 | 0 | 185.11648200001218 | 16174.709219000022 |
| Aggregated | Failed ❌ | 210.0 | 397.6239629079091 | 5.915434271857456 | 0.0 | 1770 | 0 | 185.11648200001218 | 16174.709219000022 |

v1.74.3-stable-draft

12 Jul 21:14
d8ae044
Pre-release

What's Changed


v1.74.2-nightly

11 Jul 04:23

What's Changed

New Contributors

Full Changelog: v1.74.1-nightly...v1.74.2-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.74.2-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 216.6222827699786 | 6.190122373161466 | 0.0 | 1852 | 0 | 172.40756000001056 | 1119.208724000032 |
| Aggregated | Passed ✅ | 200.0 | 216.6222827699786 | 6.190122373161466 | 0.0 | 1852 | 0 | 172.40756000001056 | 1119.208724000032 |

v1.74.1-nightly

10 Jul 16:25
07e8609

What's Changed

New Contributors

Full Changelog: v1.74.0-nightly...v1.74.1-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.74.1-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 271.9479641086367 | 6.15189964037961 | 0.0033416076264962576 | 1841 | 1 | 168.73124300002473 | 2931.748561000063 |
| Aggregated | Passed ✅ | 200.0 | 271.9479641086367 | 6.15189964037961 | 0.0033416076264962576 | 1841 | 1 | 168.73124300002473 | 2931.748561000063 |

v1.74.0-stable

10 Jul 22:38

Full Changelog: v1.74.0.rc.2...v1.74.0-stable

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.74.0-stable
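
The commands in these notes persist model configuration in the proxy's database (STORE_MODEL_IN_DB=True). As an alternative sketch, the proxy can also be started from a mounted config file, assuming the image forwards CLI arguments such as --config to the LiteLLM CLI as described in the LiteLLM Proxy docs; litellm_config.yaml, the gpt-4o entry, and OPENAI_API_KEY below are placeholders for your own setup.

# write a minimal proxy config with one model entry (placeholder values)
cat > litellm_config.yaml <<'EOF'
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
EOF

# start the proxy from the mounted config instead of the database
docker run \
-v $(pwd)/litellm_config.yaml:/app/config.yaml \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.74.0-stable \
--config /app/config.yaml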

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 240.0 | 260.8232618687037 | 6.237602465744322 | 0.0 | 1866 | 0 | 210.26094900003045 | 1392.1758870000076 |
| Aggregated | Passed ✅ | 240.0 | 260.8232618687037 | 6.237602465744322 | 0.0 | 1866 | 0 | 210.26094900003045 | 1392.1758870000076 |