LCORE-695: bump-up Llama Stack to version 0.2.22 #611
Conversation
Walkthrough: Updated Llama Stack dependency pins in pyproject.toml.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Pre-merge checks and finishing touches: ✅ Passed checks (3 passed)
📜 Recent review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
✅ Files skipped from review due to trivial changes (1)
🧰 Additional context used
📓 Path-based instructions (1): tests/e2e/features/**/*.feature
📄 CodeRabbit inference engine (CLAUDE.md)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
🔇 Additional comments (1)
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (1)
pyproject.toml (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml
📄 CodeRabbit inference engine (CLAUDE.md)
pyproject.toml: ALWAYS check pyproject.toml for existing dependencies before adding new ones
ALWAYS verify current library versions in pyproject.toml rather than assuming versions
Prefer reading supported Python versions and tool configs from pyproject.toml
Files:
pyproject.toml
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: build-pr
- GitHub Check: e2e_tests
| "llama-stack==0.2.22", | ||
| "llama-stack-client==0.2.22", |
🧩 Analysis chain
Confirm availability of 0.2.22 artifacts.
Pin looks fine, but please double-check that both llama-stack and llama-stack-client 0.2.22 are published and installable to avoid broken builds.
You can verify straight from PyPI with:
🏁 Script executed:
#!/bin/bash
set -euo pipefail
for pkg in llama-stack llama-stack-client; do
  echo "Checking $pkg…"
  uv pip index versions "$pkg" | head -n 20
done

Length of output: 62
🏁 Script executed:
#!/bin/bash
set -euo pipefail
for pkg in llama-stack llama-stack-client; do
echo "Checking $pkg…"
python3 -m pip index versions "$pkg" 2>&1 | head -n20
doneLength of output: 1176
Correct the dependency versions: llama-stack==0.2.22 and llama-stack-client==0.2.22 are not available on PyPI (the latest published release is 0.2.12). Update the pins to an existing version or publish the 0.2.22 release first.
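As a quick cross-check, the PyPI JSON API can be queried directly to see which versions are actually published. This is only a sketch, not part of the original review scripts, and it assumes network access to pypi.org:

#!/bin/bash
set -euo pipefail
# Print the latest published version reported by the PyPI JSON API for each package.
for pkg in llama-stack llama-stack-client; do
  echo "Latest published version of $pkg:"
  curl -s "https://pypi.org/pypi/$pkg/json" | python3 -c 'import json, sys; print(json.load(sys.stdin)["info"]["version"])'
done

If either package reports a version below 0.2.22, the pin cannot resolve until a newer release is published.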
🤖 Prompt for AI Agents
In pyproject.toml around lines 31-32 the dependencies "llama-stack==0.2.22" and
"llama-stack-client==0.2.22" reference a non-existent PyPI release; change them
to an existing published version (e.g. "llama-stack==0.2.12" and
"llama-stack-client==0.2.12" or use a compatible range like "^0.2.12"), then run
your dependency manager (poetry lock/install or pip) to verify resolution and
update the lockfile accordingly.
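Since the path filters above show a uv.lock in this repository, one possible remediation flow with uv is sketched below; the fallback version named here is an assumption and should match whatever PyPI actually publishes:

#!/bin/bash
set -euo pipefail
# 1. Edit pyproject.toml so both pins point at a published release (e.g. 0.2.12, if that is the latest).
# 2. Refresh the lockfile and the local environment.
uv lock --upgrade-package llama-stack --upgrade-package llama-stack-client
uv sync
# 3. Confirm that resolution picked up the intended versions.
uv pip show llama-stack llama-stack-client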
Force-pushed from 11a42e7 to 1c5c862
Description
LCORE-695: bump-up Llama Stack to version 0.2.22
Type of change
Related Tickets & Documents
Summary by CodeRabbit