LCORE-380: Lock Llama-stack client to version 0.2.14 #297
Conversation
Walkthrough
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~2 minutes
Actionable comments posted: 0
🧹 Nitpick comments (1)
pyproject.toml (1)
14-14: Pinning to an exact version locks out patch-level fixes

Adding an exact pin (==0.2.14) guarantees reproducible installs, but it also prevents you from receiving any backward-compatible bug-fix releases (e.g. 0.2.15). Unless there is a known regression in later patches, consider loosening the constraint to a compatible-release specifier:

- "llama-stack-client==0.2.14",
+ "llama-stack-client~=0.2.14",

If you must stay fully locked, verify that 0.2.14 is published for all target platforms and that it supports Python 3.12–3.13, matching the project's requires-python range.
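To make the difference between the two specifiers concrete, here is a small sketch using the packaging library (the same PEP 440 semantics pip and uv apply); the version strings are illustrative, not releases verified to exist:

```python
from packaging.specifiers import SpecifierSet

# Exact pin as merged in this PR
exact = SpecifierSet("==0.2.14")

# Compatible-release alternative: equivalent to >=0.2.14, ==0.2.*
compatible = SpecifierSet("~=0.2.14")

print("0.2.15" in exact)       # exact pin rejects a later patch release
print("0.2.15" in compatible)  # compatible release accepts patch releases
print("0.3.0" in compatible)   # minor bumps are still excluded
print("0.2.13" in compatible)  # older patches are excluded too
```

Either way, a lockfile such as uv.lock still records the exact resolved version, so `~=` loosens only what a fresh resolution may pick, not what an existing lock installs.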
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (1)
pyproject.toml (1 hunks)
Description
LCORE-380: Lock Llama-stack client to version 0.2.14
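For context, the pin described above would sit in the project's pyproject.toml dependency list, roughly as in this sketch (the project name is a placeholder; only the dependency line and the Python range mentioned in the review are taken from this PR):

```toml
[project]
name = "example-project"  # placeholder, not the real project name
requires-python = ">=3.12,<3.14"
dependencies = [
    # Exact pin from this PR; see review note about ~=0.2.14 as an alternative
    "llama-stack-client==0.2.14",
]
```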
Type of change
Related Tickets & Documents
Summary by CodeRabbit