
Releases: BerriAI/litellm

v1.22.8

06 Feb 21:14

What's Changed

  • [Fix] UI - Security - Litellm UI Keys meant for litellm-dashboard shouldn't be allowed to make non-management related requests by @ishaan-jaff in #1836
  • Fix admin UI title and description by @ushuz in #1842
  • fix(langfuse.py): support logging failed llm api calls to langfuse by @krrishdholakia in #1837 (see the sketch after this list)
  • [Feat] Proxy set upperbound params for key/generate by @ishaan-jaff in #1844
  • build(requirements.txt): update the proxy requirements.txt by @krrishdholakia in #1846
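
The Langfuse fix in #1837 means failed LLM API calls are now logged, not just successful ones. A minimal caller-side sketch, assuming Langfuse keys are set in the environment (the model name and the deliberately invalid API key below are placeholders):

import litellm

# Assumes LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY are set in the environment.
litellm.success_callback = ["langfuse"]   # log successful calls
litellm.failure_callback = ["langfuse"]   # with this release, failed calls are logged too

try:
    # A deliberately invalid key forces the call to fail, so the failure is logged.
    litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "hello"}],
        api_key="sk-invalid",  # placeholder
    )
except Exception as e:
    print(f"call failed and was logged to Langfuse: {e}")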

Full Changelog: v1.22.5...v1.22.8

v1.22.5

05 Feb 23:31

What's Changed

  • Re-raise exception in async ollama streaming by @vanpelt in #1750
  • Add a Helm chart for deploying LiteLLM Proxy by @ShaunMaher in #1602
  • Update Perplexity models in model_prices_and_context_window.json by @toniengelhardt in #1826
  • (feat) Add sessionId for Langfuse by @Manouchehri in #1828 (see the sketch after this list)
  • [Feat] Sync model_prices_and_context_window.json and litellm/model_prices_and_context_window_backup.json by @ishaan-jaff in #1834
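
For the Langfuse sessionId addition in #1828, a minimal sketch: it assumes the session identifier is passed through the per-request metadata dict (the session_id key name and the model are assumptions for illustration):

import litellm

litellm.success_callback = ["langfuse"]  # requires Langfuse keys in the environment

# Assumption: the Langfuse logger reads a session identifier from metadata,
# so related calls are grouped into one Langfuse session.
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "first message of this session"}],
    metadata={"session_id": "chat-session-123"},  # hypothetical session id
)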

New Contributors

Full Changelog: v1.22.3...v1.22.5

v1.22.3

04 Feb 06:46

What's Changed

  • feat(utils.py): support cost tracking for openai/azure image gen models by @krrishdholakia in #1805
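
A sketch of what #1805 enables, assuming completion_cost can now price an image-generation response (the model, prompt, and the call_type argument are assumptions for illustration; an OpenAI key is required):

import litellm

# Generate an image with an OpenAI image model.
response = litellm.image_generation(
    model="dall-e-3",
    prompt="a watercolor painting of a lighthouse",
)

# Assumption: after this change the response can be priced against the model's
# entry in model_prices_and_context_window.json.
cost = litellm.completion_cost(completion_response=response, call_type="image_generation")
print(f"image generation cost: ${cost:.4f}")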

Full Changelog: v1.22.2...v1.22.3

v1.22.2

04 Feb 05:24
7ee5d2e

Admin UI 🤠

  • view spend, budget for signed in user
  • view daily spend, top users for a key

What's Changed

Full Changelog: v1.21.7...v1.22.2

v1.21.7

03 Feb 05:23

Full Changelog: v1.21.6...v1.21.7

v1.21.6

03 Feb 04:22
6950f99

What's Changed

Full Changelog: v1.21.5...v1.21.6

v1.21.5

03 Feb 03:06

What's Changed

⭐️ [Feat] Show correct provider in exceptions - for Mistral API, PerplexityAPI, Anyscale, XInference by @ishaan-jaff in #1765, #1776

(Thanks @dhruv-anand-aintech for the issue/help)
Exceptions for Mistral API, Perplexity API, Anyscale, and XInference now show the correct provider name; previously they would report that OPENAI_API_KEY was missing even when, for example, Perplexity AI was the provider in use. The traceback below shows the corrected attribution, and a caller-side sketch follows it.

exception:  PerplexityException - Traceback (most recent call last):
  File "/Users/ishaanjaffer/Github/litellm/litellm/llms/perplexity.py", line 349, in completion
    raise e
  File "/Users/ishaanjaffer/Github/litellm/litellm/llms/perplexity.py", line 292, in completion
    perplexity_client = perplexity(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/perplexity/_client.py", line 98, in __init__
    raise perplexityError(
perplexity.perplexityError: The api_key client option must be set either by passing api_key to the client or by setting the PERPLEXITY_API_KEY environment variable
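
A minimal caller-side sketch of the new behaviour, assuming a Perplexity model is requested via the perplexity/ prefix with no PERPLEXITY_API_KEY set (the model name is a placeholder):

import litellm

try:
    litellm.completion(
        model="perplexity/pplx-7b-online",  # placeholder Perplexity model name
        messages=[{"role": "user", "content": "hi"}],
    )
except litellm.exceptions.AuthenticationError as e:
    # The error now names the actual provider (PerplexityException / PERPLEXITY_API_KEY)
    # instead of complaining about a missing OPENAI_API_KEY.
    print(e)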

Full Changelog: v1.21.4...v1.21.5

v1.21.4

02 Feb 22:56

What's Changed

  • s3 logger - Improve filename to allow easier sorting by @Manouchehri in #1766
  • fix(utils.py): override default success callbacks with dynamic callbacks if set by @krrishdholakia in #1761
  • Litellm security fix allow user auth by @krrishdholakia in #1781

Full Changelog: v1.21.1...v1.21.4

v1.21.1

02 Feb 16:38

What's Changed

  • Allow to specify user email created via /user/new by @scampion in #1759
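
A sketch of what #1759 allows, assuming a locally running proxy (adjust host/port), a master key of sk-1234, and that the field is named user_email in the /user/new request body:

import requests

resp = requests.post(
    "http://localhost:4000/user/new",
    headers={"Authorization": "Bearer sk-1234"},  # proxy master key (placeholder)
    json={"user_email": "jane@example.com"},      # new: email can be set at creation time
)
print(resp.json())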

Full Changelog: v1.21.0...v1.21.1

v1.21.0

02 Feb 06:15
7fc03bf

What's Changed

New Contributors

Full Changelog: v1.20.9...v1.21.0