
Releases: BerriAI/litellm

v1.41.11.dev1

08 Jul 03:30
49d7faa

What's Changed

Full Changelog: v1.41.8.dev2...v1.41.11.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.11.dev1
```
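Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. As a minimal sketch (the base URL, the `sk-1234` key, and the model name below are placeholder assumptions — use a key and model configured on your own proxy), a `/chat/completions` request can be built like this:

```python
import json
import urllib.request


def build_chat_request(base_url, api_key, model, messages):
    """Build an OpenAI-compatible /chat/completions request for the proxy."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # placeholder key
        },
    )


req = build_chat_request(
    "http://localhost:4000",   # default port mapped by the docker run above
    "sk-1234",                 # hypothetical key; substitute your own
    "gpt-3.5-turbo",           # any model name routed by your proxy config
    [{"role": "user", "content": "Hello"}],
)
# urllib.request.urlopen(req) would send the request once the proxy is running.
```

The request body follows the OpenAI chat schema, so existing OpenAI SDKs can also be pointed at the proxy by changing only the base URL and key.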

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 77 | 88.81 | 6.54 | 0.0 | 1957 | 0 | 67.48 | 1314.01 |
| Aggregated | Passed ✅ | 77 | 88.81 | 6.54 | 0.0 | 1957 | 0 | 67.48 | 1314.01 |

v1.41.11

07 Jul 01:06

What's Changed

New Contributors

Full Changelog: v1.41.8...v1.41.11

v1.41.8.dev2

07 Jul 01:07

Full Changelog: v1.41.11...v1.41.8.dev2

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.8.dev2
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 145.87 | 6.33 | 0.0 | 1895 | 0 | 106.76 | 387.62 |
| Aggregated | Passed ✅ | 130.0 | 145.87 | 6.33 | 0.0 | 1895 | 0 | 106.76 | 387.62 |

v1.41.8.dev1

07 Jul 00:40

What's Changed

New Contributors

Full Changelog: v1.41.8...v1.41.8.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.8.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 77 | 87.51 | 6.46 | 0.0 | 1934 | 0 | 67.40 | 582.62 |
| Aggregated | Passed ✅ | 77 | 87.51 | 6.46 | 0.0 | 1934 | 0 | 67.40 | 582.62 |

v1.41.8

06 Jul 04:08

🔥 Excited to launch support for Logging LLM I/O on 🔭 Galileo through LiteLLM (YC W23) Proxy https://docs.litellm.ai/docs/proxy/logging#logging-llm-io-to-galielo

📈 [docs] New example Grafana Dashboards https://github.com/BerriAI/litellm/tree/main/cookbook/litellm_proxy_server/grafana_dashboard

🛡️ feat - control guardrails per api key https://docs.litellm.ai/docs/proxy/guardrails#switch-guardrails-onoff-per-api-key

🛠️ fix - raise and report Anthropic streaming errors (thanks David Manouchehri)

✨ [Fix] Add NVIDIA NIM parameter mapping based on the model passed


What's Changed

New Contributors

Full Changelog: v1.41.7...v1.41.8

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.8
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 148.49 | 6.38 | 0.0 | 1909 | 0 | 109.11 | 1689.41 |
| Aggregated | Passed ✅ | 120.0 | 148.49 | 6.38 | 0.0 | 1909 | 0 | 109.11 | 1689.41 |

v1.41.7

05 Jul 05:01

What's Changed

New Contributors

Full Changelog: v1.41.6...v1.41.7

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.7
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 152.07 | 6.42 | 0.0 | 1921 | 0 | 111.60 | 1678.76 |
| Aggregated | Passed ✅ | 130.0 | 152.07 | 6.42 | 0.0 | 1921 | 0 | 111.60 | 1678.76 |

v1.41.6

04 Jul 06:48

What's Changed

Full Changelog: v1.41.5...v1.41.6

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.6
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 99 | 120.04 | 6.43 | 0.0 | 1924 | 0 | 83.30 | 1524.84 |
| Aggregated | Passed ✅ | 99 | 120.04 | 6.43 | 0.0 | 1924 | 0 | 83.30 | 1524.84 |

v1.41.5.dev1

04 Jul 05:21

What's Changed

Full Changelog: v1.41.5...v1.41.5.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.5.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 150.70 | 6.34 | 0.0 | 1897 | 0 | 115.56 | 1375.47 |
| Aggregated | Passed ✅ | 130.0 | 150.70 | 6.34 | 0.0 | 1897 | 0 | 115.56 | 1375.47 |

v1.41.5

04 Jul 03:24

What's Changed

New Contributors

Full Changelog: v1.41.4...v1.41.5

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.5
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 158.84 | 6.42 | 0.03 | 1923 | 9 | 83.97 | 2746.21 |
| Aggregated | Passed ✅ | 140.0 | 158.84 | 6.42 | 0.03 | 1923 | 9 | 83.97 | 2746.21 |

v1.41.4.dev1

03 Jul 19:38

What's Changed

Full Changelog: v1.41.4...v1.41.4.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.4.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 159.75 | 6.42 | 0.0 | 1922 | 0 | 117.16 | 1925.88 |
| Aggregated | Passed ✅ | 140.0 | 159.75 | 6.42 | 0.0 | 1922 | 0 | 117.16 | 1925.88 |