
Conversation

@tisnik (Contributor) commented Sep 16, 2025

Description

LCORE-634: bump up Llama Stack version to 0.2.20

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Related Tickets & Documents

  • Related Issue #LCORE-634

Summary by CodeRabbit

  • Chores
    • Updated Llama Stack dependencies to the latest patch release, ensuring compatibility with version 0.2.20.
    • Relaxed the OpenAI SDK requirement to allow versions >=1.99.9, enabling smoother updates and broader compatibility with newer SDK releases.
    • Aligned internal version constraints to reflect the newly supported Llama Stack version.
    • No functional changes expected; these updates focus on maintaining compatibility and reducing upgrade friction.


coderabbitai bot commented Sep 16, 2025

Walkthrough

Bumps llama-stack and llama-stack-client from 0.2.19 to 0.2.20 in pyproject.toml, relaxes the openai pin from an exact version to a minimum-only constraint, and updates MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION to "0.2.20" in src/constants.py.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| Dependency version updates: `pyproject.toml` | Updated `llama-stack` and `llama-stack-client` from 0.2.19 to 0.2.20; relaxed the `openai` constraint from `==1.99.9` to `>=1.99.9`. |
| Constants alignment: `src/constants.py` | Set `MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION` from `"0.2.19"` to `"0.2.20"`. |
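
In diff form, the change amounts to roughly the following (reconstructed from the summary above, not the literal hunks; surrounding context is omitted):

```diff
 # pyproject.toml
-    "llama-stack==0.2.19",
-    "llama-stack-client==0.2.19",
-    "openai==1.99.9",
+    "llama-stack==0.2.20",
+    "llama-stack-client==0.2.20",
+    "openai>=1.99.9",

 # src/constants.py
-MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.19"
+MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.20"
```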

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

Suggested reviewers

  • umago

Poem

I nudge my nose at version snow,
Patch hops forward, steady flow.
Stacks align, constants chime,
Open gates for future time.
Tip-tap paws sign off the log—
0.2.20, cleared the fog. 🐇✨

Pre-merge checks

✅ Passed checks (3 passed)
| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped: CodeRabbit's high-level summary is enabled. |
| Title Check | ✅ Passed | The title "LCORE-634: bump up Llama Stack version to 0.2.20" is a concise, single-sentence summary that directly describes the primary change (bumping the Llama Stack dependency to 0.2.20), includes the ticket reference, and matches the modifications in pyproject.toml and src/constants.py. |
| Docstring Coverage | ✅ Passed | No functions found in the changes; docstring coverage check skipped. |


coderabbitai bot (Contributor) left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d8025ac and a8ccb47.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
  • pyproject.toml (2 hunks)
  • src/constants.py (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: e2e_tests
🔇 Additional comments (2)
src/constants.py (1)

5-5: Confirm 0.2.20 client is backward-compatible with servers 0.2.17–0.2.19 or raise MINIMAL_SUPPORTED_LLAMA_STACK_VERSION to 0.2.20.

Definitions: src/constants.py (lines 4–5). Enforcement: src/utils/llama_stack_version.py (compare_versions). Tests referencing these: tests/unit/utils/test_llama_stack_version.py.
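
For reference, a minimal sketch of what that enforcement could look like, assuming compare_versions performs an inclusive range check against the two constants (the actual implementation in src/utils/llama_stack_version.py may differ):

```python
# Sketch only: the constant names mirror src/constants.py, but this body is
# an assumption, not the project's actual compare_versions implementation.
from packaging.version import Version

MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"  # assumed, per the range above
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.20"  # value set by this PR

def check_llama_stack_version(server_version: str) -> None:
    """Raise if the connected Llama Stack server is outside the supported range."""
    minimal = Version(MINIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    maximal = Version(MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    current = Version(server_version)
    if not (minimal <= current <= maximal):
        raise RuntimeError(
            f"Llama Stack version {current} is outside the supported range "
            f"{minimal} to {maximal}"
        )
```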

pyproject.toml (1)

31-32: Pinned llama-stack 0.2.20 not found upstream — verify and adjust

pyproject.toml (lines 31–32): "llama-stack==0.2.20", "llama-stack-client==0.2.20". GitHub releases show no v0.2.20; latest public tag is v0.2.18 (Aug 20, 2025). Confirm whether 0.2.20 is a private/pre-release or a typo. If unpublished, either pin to 0.2.18 or document the private source and add a follow-up to review 0.2.20 release notes and run tests once it is published.
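
One way to check what is actually published (a sketch using PyPI's JSON API; assumes network access and that these packages are distributed on PyPI):

```python
import json
import urllib.request

# Query PyPI for the published releases of each pinned package.
for package in ("llama-stack", "llama-stack-client"):
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url) as response:
        releases = json.load(response)["releases"]
    print(package, "has 0.2.20:", "0.2.20" in releases)
```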

```diff
     # OpenAPI exporter
     "email-validator>=2.2.0",
-    "openai==1.99.9",
+    "openai>=1.99.9",
```

⚠️ Potential issue

Unbounded OpenAI dep can break builds; add an upper bound.

Use a safe range to avoid accidental 2.x upgrades and ensure reproducibility.

Apply:

-    "openai>=1.99.9",
+    "openai>=1.99.9,<2.0.0",
🤖 Prompt for AI Agents
In pyproject.toml around line 46, the OpenAI dependency is unbounded on the
upper side which can allow accidental 2.x upgrades; update the requirement to
include an explicit upper bound (for example constrain it to be <2.0.0) so the
spec reads something like openai>=1.99.9,<2.0.0, then regenerate/update your
lockfile (poetry.lock or equivalent) to ensure reproducible installs.
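
As a quick illustration of what the bounded range admits (a sketch using the packaging library, which is not part of this PR):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The bounded range suggested in the review comment above.
bounded = SpecifierSet(">=1.99.9,<2.0.0")

print(Version("1.99.10") in bounded)  # True: patch releases still allowed
print(Version("1.120.0") in bounded)  # True: newer 1.x minors still allowed
print(Version("2.0.0") in bounded)    # False: a potentially breaking 2.x is excluded
```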

@tisnik closed this Sep 16, 2025
