Issues: BerriAI/litellm
#11030 [Feature]: Support Grok search_parameters param (enhancement), opened May 21, 2025 by colesmcintosh
#11025 [Feature]: Allow to configure database host from secrets (enhancement, mlops user request), opened May 21, 2025 by hpedrorodrigues
#11024 [Feature]: Allow to configure a different port for service in Helm chart (enhancement, mlops user request), opened May 21, 2025 by hpedrorodrigues
#11021 [Feature]: Support Modal Proxy Authentication (enhancement), opened May 21, 2025 by stevef1uk
#11019 [Bug]: AWS Sagemaker embedding calls are failing with a Jina endpoint (bug), opened May 21, 2025 by keyzou
#11017 [Bug]: Gemini 2.5 Pro models cost evaluation is incorrect (bug), opened May 21, 2025 by daarko10
#11010 [Feature]: MCP CRUD DB Operations (enhancement), opened May 21, 2025 by wagnerjt
#11009 [Bug]: Security checks too strict for custom handlers (bug, mlops user request), opened May 21, 2025 by JohnPaton
#11002 [Bug]: max_tokens is ignored when invoking a Fireworks AI model from the LiteLLM Proxy Server (bug, mlops user request), opened May 21, 2025 by AlessandroMondin
#11001 Groq not working well with the LiteLLM definition of tools, opened May 21, 2025 by jeroenvermunt
#11000 [Bug]: Gemini gemini-2.5-flash-preview-05-20 stream mode concatenates reasoning data with main content due to missing thought/reasoning chunk identification (bug), opened May 21, 2025 by fengjiajie
#10999 [Bug]: Invitation email is failing with an error (bug), opened May 21, 2025 by mknet3
#10997 [Bug]: Timeout message included a traceback (bug), opened May 21, 2025 by marcus09310
#10994 [Feature]: Does LiteLLM support Redis Sentinel for router_settings? (enhancement), opened May 21, 2025 by KyleZhang0536
#10988 [Bug]: Request truncated in spend logs (bug), opened May 20, 2025 by mrexodia
#10976 [Bug]: Upgrade to 0.70.1-stable deleted my usage history (bug), opened May 20, 2025 by Otts86
#10973 [Bug]: Max budget reached also blocks zero-cost models (bug), opened May 20, 2025 by andigehle
#10972 [Bug]: Fallback models don't work for CustomLLM on streaming endpoints (bug, mlops user request), opened May 20, 2025 by hahamark1
#10971 [Bug]: PII masking with Bedrock (bug, mlops user request), opened May 20, 2025 by superpoussin22
#10967 [Bug]: Bug about model management (bug, mlops user request), opened May 20, 2025 by unrealandychan
#10954 [Feature]: Improve OpenAPI schema (enhancement), opened May 19, 2025 by mickvangelderen
#10952 [Bug]: When calling a remote Ollama model, LiteLLM tries to access localhost for cost calculation (bug), opened May 19, 2025 by FlorianVal