LCORE-1715: Fixes in LiteLLM package #1466
Walkthrough: The Changes
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~2 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 inconclusive)
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@pyproject.toml`:
- Line 61: The pyproject.toml bump to litellm==1.83.0 wasn’t propagated to
requirements.hashes.source.txt; find the existing entry "litellm==1.82.6" in
requirements.hashes.source.txt (around the current litellm block) and update it
to "litellm==1.83.0" and replace its accompanying hash(es) with the correct hash
values for 1.83.0 (regenerate or copy the official wheel/source hashes), then
save so builds using requirements.hashes.source.txt install the updated litellm.
- Line 61: The upgrade to litellm==1.83.0 introduces a behavioral change
requiring the proxy_admin role for the /config/update endpoint; search for any
code paths, scheduled jobs, integration tests, or deployment scripts that call
/config/update (or use functions/methods that proxy to it) and confirm they
either run as proxy_admin or are updated to use an admin credential to avoid
post-upgrade authorization failures; also update documentation/config or add
migrations as needed to ensure callers have proxy_admin. Finally, clarify the
ticket reference by confirming whether this change maps to LCORE-1565 or
LCORE-1715 and update the PR title/body to reference the correct issue ID so the
changelog and reviewers are consistent.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 53e402bf-0a27-4b21-823f-6b6e0cbaf017
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (1)
pyproject.toml
📜 Review details
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: build-pr
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E Tests for Lightspeed Evaluation job
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:43-45
Timestamp: 2025-08-18T10:55:18.914Z
Learning: The lightspeed-stack project updates dependencies every sprint as part of their regular maintenance cycle, which explains their preference for exact dependency pins rather than version ranges.
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-05T12:19:36.009Z
Learning: Always verify current library versions in `pyproject.toml` rather than assuming versions
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.
Applied to files:
pyproject.toml
🔇 Additional comments (1)
pyproject.toml (1)
185-185: LGTM: Consistent version pinning across dependency groups. The litellm version is now consistently pinned to 1.83.0 in both the main dependencies and the llslibdev group, which prevents version conflicts.
  "psycopg2-binary>=2.9.10",
  # LCORE-1565
- "litellm>=1.75.5.post1,<=1.82.6",
+ "litellm==1.83.0",
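The consistency claim above can be checked mechanically. Below is a minimal sketch that extracts every litellm pin from a pyproject.toml and asserts they all agree; the sample file is illustrative, not the project's real one.

```shell
# Sketch: verify that all litellm pins in pyproject.toml agree across
# dependency groups. The heredoc content is a made-up stand-in.
tmp=$(mktemp -d)
cat > "$tmp/pyproject.toml" <<'EOF'
[project]
dependencies = ["litellm==1.83.0"]
[dependency-groups]
llslibdev = ["litellm==1.83.0"]
EOF
# Collect the distinct pins; exactly one distinct value means consistent.
pins=$(grep -o 'litellm==[0-9.]*' "$tmp/pyproject.toml" | sort -u)
count=$(printf '%s\n' "$pins" | wc -l)
[ "$count" -eq 1 ] && echo "consistent pin: $pins"
```

Running the same grep against the real pyproject.toml gives a quick regression guard whenever the pin is bumped in one group but not the other.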
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Description: Verify the litellm version in requirements.hashes.source.txt
echo "=== Checking litellm version in requirements.hashes.source.txt ==="
rg -n "^litellm==" requirements.hashes.source.txt
echo -e "\n=== Checking if other lock files need updates ==="
fd -e txt -e lock . -x rg -l "litellm.*1\.82\.6" {}
Repository: lightspeed-core/lightspeed-stack
Length of output: 248
Update requirements.hashes.source.txt to match the litellm version change.
The update to litellm 1.83.0 in pyproject.toml is incomplete. While pyproject.toml now pins litellm==1.83.0, the file requirements.hashes.source.txt at line 509 still contains litellm==1.82.6.
Builds that use requirements.hashes.source.txt will install the outdated version, preventing the security fix from being applied. Update the file to reference litellm==1.83.0 with the correct hash.
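The mechanical part of that fix can be sketched as follows. This is only an illustration of the edit's shape on a throwaway file laid out like a pip/uv `--generate-hashes` requirements file; regenerating the file with the project's lock tooling is the safer route, and the placeholder hash must of course be replaced with the official 1.83.0 wheel/sdist hashes from PyPI.

```shell
# Sketch: bump the pinned litellm entry in a hashes file (sample file,
# placeholder hash). Prefer full regeneration over hand-editing.
tmp=$(mktemp -d)
cat > "$tmp/requirements.hashes.source.txt" <<'EOF'
litellm==1.82.6 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
EOF
# Bump the version pin; the stale hash line must then be replaced with
# the real 1.83.0 hashes, otherwise --require-hashes installs will fail.
sed -i 's/^litellm==1\.82\.6/litellm==1.83.0/' "$tmp/requirements.hashes.source.txt"
grep 'litellm==' "$tmp/requirements.hashes.source.txt"
```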
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Description: Verify litellm 1.83.0 exists and check for security advisories
echo "=== Checking PyPI for litellm 1.83.0 ==="
curl -s https://pypi.org/pypi/litellm/1.83.0/json | jq -r '.info.version // "Version not found"'
echo -e "\n=== Checking for security advisories for litellm ==="
gh api graphql -f query='
{
securityVulnerabilities(first: 10, ecosystem: PIP, package: "litellm") {
nodes {
advisory {
summary
severity
publishedAt
}
vulnerableVersionRange
firstPatchedVersion {
identifier
}
}
}
}'
echo -e "\n=== Checking litellm changelog for breaking changes ==="
curl -s https://pypi.org/pypi/litellm/json | jq -r '.releases["1.83.0"][0].upload_time_iso_8601 // "Upload time not found"'
Repository: lightspeed-core/lightspeed-stack
Length of output: 422
🌐 Web query:
What CVEs or security vulnerabilities were fixed in litellm version 1.83.0? Are there any breaking changes between litellm 1.82.6 and 1.83.0?
💡 Result:
CVEs / security vulnerabilities fixed in litellm==1.83.0
- CVE-2026-35029 (GHSA-53mr-6c8q-9789) – Privilege escalation via unrestricted proxy configuration endpoint.
  The /config/update endpoint did not enforce admin authorization; an authenticated user could modify config/env vars and potentially reach RCE paths. Fixed in v1.83.0 by requiring the proxy_admin role. [1]
- CVE-2026-35030 (GHSA-jjhc-v7c2-5hh6) – Authentication bypass via OIDC userinfo cache key collision (when JWT auth is enabled).
  With enable_jwt_auth: true, the OIDC userinfo cache key used token[:20], enabling crafted-token cache collisions and identity/permission takeover. Fixed in v1.83.0 by using the full hash of the JWT token. [2]
litellm 1.83.0 was released on Mar 31, 2026 (PyPI). [3]
Breaking changes from 1.82.6 → 1.83.0?
No “breaking changes” are explicitly called out in the upstream GitHub release notes around the 1.83.0 bump. [4]
However, the CVE-2026-35029 fix is a behavior change that can break setups that relied on non-proxy_admin authenticated users being able to call /config/update (it will now require proxy_admin). [1]
Sources
[1] GHSA-53mr-6c8q-9789
[2] GHSA-jjhc-v7c2-5hh6
[3] https://pypi.org/project/litellm/
[4] https://github.com/BerriAI/litellm/releases/tag/v1.83.0-nightly
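The cache-key flaw behind CVE-2026-35030 is easy to demonstrate in a few lines. The tokens below are invented, but they share the property the advisory describes: distinct JWTs commonly share their first 20 characters (the base64 header), so a `token[:20]` cache key collides where a full-hash key does not.

```shell
# Illustration (made-up tokens): truncated cache keys collide,
# full sha256 keys do not.
t1="eyJhbGciOiJIUzI1NiJ9.payloadA.sigA"
t2="eyJhbGciOiJIUzI1NiJ9.payloadB.sigB"
k1="${t1:0:20}"; k2="${t2:0:20}"   # the pre-fix cache key scheme
h1=$(printf '%s' "$t1" | sha256sum | cut -d' ' -f1)
h2=$(printf '%s' "$t2" | sha256sum | cut -d' ' -f1)
[ "$k1" = "$k2" ]  && echo "truncated keys collide"
[ "$h1" != "$h2" ] && echo "full-hash keys differ"
```

This is why the 1.83.0 fix switches the cache key to the full hash of the JWT.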
Confirm behavioral change impact from CVE fix.
Version 1.83.0 is confirmed and fixes two security vulnerabilities (CVE-2026-35029 and CVE-2026-35030). However, the fix for CVE-2026-35029 introduces a behavioral change: the /config/update endpoint now requires the proxy_admin role, whereas it previously allowed any authenticated user access. Verify that no part of the codebase or deployment relies on this endpoint being accessible to non-admin authenticated users, as it will fail with authorization errors post-upgrade.
Additionally, the comment references LCORE-1565 but the PR title references LCORE-1715—clarify which ticket this fix addresses.
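The caller audit suggested above can be sketched with a plain recursive grep. The sample tree and the `sync_config.sh` script are invented stand-ins; in practice the search would run over the real repository and deployment scripts.

```shell
# Sketch: flag every call site of /config/update before upgrading.
# Sample tree only; each real hit must authenticate as proxy_admin
# once litellm 1.83.0 is in place.
tmp=$(mktemp -d)
mkdir -p "$tmp/scripts"
cat > "$tmp/scripts/sync_config.sh" <<'EOF'
curl -X POST "$PROXY_URL/config/update" -H "Authorization: Bearer $USER_KEY"
EOF
grep -rln '/config/update' "$tmp"
```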
Description
LCORE-1715: Fixes in LiteLLM package
Type of change
Tools used to create PR
Related Tickets & Documents
Summary by CodeRabbit
Updated the litellm dependency to version 1.83.0 for consistent builds