LCORE-576: Add sentence-transformers dependency for llama-stack library mode #427
Conversation
Walkthrough
Updated pyproject.toml to add a new dependency, adjust ordering of existing dependencies, and insert new category comments. No changes to exported/public APIs.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~8 minutes
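For orientation, a minimal sketch of the kind of change the walkthrough describes is shown below. The sentence-transformers, transformers and torch specs are quoted from the review comments further down; the surrounding layout and the position of the new entry are assumptions for illustration, not taken from the actual diff.

[project]
# ...other project metadata elided...
dependencies = [
    # ...
    "transformers>=4.34.0",          # existing pin, cited in the review below
    "torch==2.7.1",                  # existing pin, cited in the review below
    "sentence-transformers>=5.0.0",  # new: added for llama-stack library mode
    # ...
]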
Actionable comments posted: 0
🧹 Nitpick comments (2)
pyproject.toml (2)
137-137: Reordering peft after trl is fine; consider adding accelerate for post-training flows
The reorder is harmless and arguably clearer grouped with TRL. Many TRL/PEFT workflows also expect accelerate; if you rely on trainer utilities, consider adding it to this group.
Proposed addition (if needed by your usage):
    "trl>=0.18.2",
    "peft>=0.15.2",
  + "accelerate>=0.33.0",
116-117: Add upper bound to sentence-transformers dependency
Quick compatibility check results:
- PyPI has v5.x (latest 5.1.0).
- sentence-transformers requires transformers>=4.41.0,<5.0.0 and torch>=1.11.0.
- Our specs (transformers>=4.34.0, torch==2.7.1) are compatible.
Please update pyproject.toml:
  - "sentence-transformers>=5.0.0",
  + "sentence-transformers>=5.0.0,<6",
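If the suggested upper bound is adopted, the affected entries would look roughly as follows. The transformers and torch pins are quoted from the check above, the comments merely restate its findings, and the exact placement in the list is assumed.

# pyproject.toml (fragment, illustrative)
dependencies = [
    # ...
    "transformers>=4.34.0",             # resolver can still pick transformers>=4.41.0,<5.0.0 for sentence-transformers
    "torch==2.7.1",                     # satisfies sentence-transformers' torch>=1.11.0 floor
    "sentence-transformers>=5.0.0,<6",  # upper bound guards against a future breaking 6.x release
    # ...
]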
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (1)
pyproject.toml (2 hunks)
🔇 Additional comments (2)
pyproject.toml (2)
134-134: Helpful category comment, consistent with existing taxonomy
The "API post_training: inline::huggingface" comment improves readability and aligns with other category tags. LGTM.
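For readers without the diff open, the taxonomy being referred to presumably looks something like this; the trl and peft specs come from the earlier nitpick, and the exact grouping under the comment is an assumption.

dependencies = [
    # ...
    # API post_training: inline::huggingface
    "trl>=0.18.2",
    "peft>=0.15.2",
    # ...
]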
138-140: autoevals compatibility verified
autoevals>=0.0.129 declares requires_python >=3.8.0, which encompasses Python 3.12. No changes needed.
tisnik left a comment:
LGTM
Description
LCORE-576: Add sentence-transformers dependency for llama-stack library mode
Type of change
Related Tickets & Documents
Checklist before requesting a review
Testing