
Conversation

matysek (Contributor) commented on Aug 20, 2025

Description

LCORE-576: Add sentence-transformers dependency for llama-stack library mode

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Related Tickets & Documents

  • Related Issue #
  • Closes #

Checklist before requesting a review

  • I have performed a self-review of my code.
  • PR has passed all pre-merge test jobs.
  • If it is a core feature, I have added thorough tests.

Testing

  • Please provide detailed steps to perform tests related to this code change.
  • How were the fix/results from this change verified? Please provide relevant screenshots or results.

Summary by CodeRabbit

  • Chores
    • Added sentence-transformers dependency (API inference).
    • Introduced new dependency category comments to improve organization (e.g., API post-training, Other).
    • Reordered several dependencies for clearer grouping; no version changes to unaffected packages.
    • No changes to public interfaces or runtime behavior expected.

coderabbitai bot (Contributor) commented on Aug 20, 2025

Walkthrough

Updated pyproject.toml to add a new dependency, adjust ordering of existing dependencies, and insert new category comments. No changes to exported/public APIs.

Changes

Cohort: Dependency management
File(s): pyproject.toml
Summary (a sketch of the resulting layout follows this list):
- Added sentence-transformers>=5.0.0 under llslibdev with comment “API inference: inline::sentence-transformers”.
- Inserted category comment before torch==2.7.1: “API post_training: inline::huggingface”.
- Reordered: trl>=0.18.2 now precedes peft>=0.15.2; autoevals>=0.0.129 moved after peft under a new “Other” comment.
- No version changes to existing deps aside from the new addition.
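
For reference, a sketch of how the affected slice of the llslibdev group reads after this change, assembled from the entries and comments listed above (the enclosing table name and the relative order of the category blocks are assumptions, not verified against the file):

    [dependency-groups]  # assumed table; adjust to the project's actual layout
    llslibdev = [
        # API inference: inline::sentence-transformers
        "sentence-transformers>=5.0.0",
        # API post_training: inline::huggingface
        "torch==2.7.1",
        "trl>=0.18.2",
        "peft>=0.15.2",
        # Other
        "autoevals>=0.0.129",
    ]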

Sequence Diagram(s)

(omitted)

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~8 minutes

Poem

A rabbit taps dependencies in neat rows,
Adds Transformers for sentences—on it goes.
Comments hop to label torch’s lane,
Peft and Trl swap spots again.
All tidy now in TOML’s glade—
I thump approval: changes made! 🐇✨


coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Nitpick comments (2)
pyproject.toml (2)

137-137: Reordering peft after trl is fine; consider adding accelerate for post-training flows

The reorder is harmless and arguably clearer grouped with TRL. Many TRL/PEFT workflows also expect accelerate; if you rely on trainer utilities, consider adding it to this group.

Proposed addition (if needed by your usage):

     "trl>=0.18.2",
     "peft>=0.15.2",
+    "accelerate>=0.33.0",

116-117: Add upper bound to sentence-transformers dependency

Quick compatibility check results:

  • PyPI has v5.x (latest 5.1.0).
  • sentence-transformers requires transformers>=4.41.0,<5.0.0 and torch>=1.11.0.
  • Our specs (transformers>=4.34.0, torch==2.7.1) are compatible.

Please update pyproject.toml:

-    "sentence-transformers>=5.0.0",
+    "sentence-transformers>=5.0.0,<6",
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro


📥 Commits

Reviewing files that changed from the base of the PR and between c97daaf and fa2266d.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (1)
  • pyproject.toml (2 hunks)
🔇 Additional comments (2)
pyproject.toml (2)

134-134: Helpful category comment — consistent with existing taxonomy

The “API post_training: inline::huggingface” comment improves readability and aligns with other category tags. LGTM.


138-140: autoevals compatibility verified

autoevals>=0.0.129 declares requires_python >=3.8.0, which encompasses Python 3.12. No changes needed.

tisnik (Contributor) left a comment

LGTM

tisnik merged commit d492c2b into lightspeed-core:main on Aug 20, 2025
18 checks passed