
Conversation

@tisnik
Contributor

@tisnik commented Oct 2, 2025

Description

LCORE-695: bump-up Llama Stack to version 0.2.22

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Related Tickets & Documents

  • Related Issue #LCORE-695

Summary by CodeRabbit

  • Chores
    • Updated two underlying Llama Stack dependencies to version 0.2.22 for compatibility and stability.
    • Automated tests updated to reflect the new dependency version.
    • No user-facing functionality changes are expected; report any unexpected behavior so we can investigate upstream changes.

@coderabbitai
Contributor

coderabbitai bot commented Oct 2, 2025

Walkthrough

Updated Llama Stack dependency pins in pyproject.toml from 0.2.21 → 0.2.22, bumped MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION in src/constants.py, and adjusted the info endpoint test expectation in tests/e2e/features/info.feature accordingly. No other logic or API changes.

Changes

  • Dependency version bump (pyproject.toml): updated two Llama Stack dependency pins from 0.2.21 → 0.2.22.
  • Constants update (src/constants.py): bumped the MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION constant from "0.2.21" to "0.2.22".
  • Test expectation (tests/e2e/features/info.feature): updated the info endpoint test expectation to Llama Stack version 0.2.22 (was 0.2.21).
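For reviewers who want to confirm the bump landed consistently, a minimal local check is sketched below (an illustration, not part of this PR; assumes it is run from the repository root):

#!/bin/bash
# Each of the three touched files should report at least one match for the new pin.
grep -n "0\.2\.22" pyproject.toml src/constants.py tests/e2e/features/info.feature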

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Possibly related PRs

Suggested reviewers

  • umago

Poem

A rabbit hops in, pins all aligned,
Versions nudged forward, neat and refined.
No tangled code, no frantic ado,
Just a small bump—hello 0.2.22! 🥕🐇

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
  • Description Check ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed: the title succinctly describes the primary change, indicating a bump of the Llama Stack dependency to version 0.2.22 and referencing the associated ticket. It is concise, specific, and free of extraneous information or ambiguity.
  • Docstring Coverage ✅ Passed: no functions found in the changes; docstring coverage check skipped.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 85d776b and 1c5c862.

📒 Files selected for processing (2)
  • src/constants.py (1 hunks)
  • tests/e2e/features/info.feature (1 hunks)
✅ Files skipped from review due to trivial changes (1)
  • src/constants.py
🧰 Additional context used
📓 Path-based instructions (1)
tests/e2e/features/**/*.feature

📄 CodeRabbit inference engine (CLAUDE.md)

Write E2E tests as Gherkin feature files for behave

Files:

  • tests/e2e/features/info.feature
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: build-pr
  • GitHub Check: e2e_tests
🔇 Additional comments (1)
tests/e2e/features/info.feature (1)

21-21: LGTM!

The test expectation has been correctly updated to match the bumped llama-stack version 0.2.22, aligning with the dependency update in pyproject.toml and the constant update in src/constants.py.



Contributor

@coderabbitai bot left a comment

Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2898f8e and 85d776b.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (1)
  • pyproject.toml (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml

📄 CodeRabbit inference engine (CLAUDE.md)

pyproject.toml: ALWAYS check pyproject.toml for existing dependencies before adding new ones
ALWAYS verify current library versions in pyproject.toml rather than assuming versions
Prefer reading supported Python versions and tool configs from pyproject.toml

Files:

  • pyproject.toml
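Following that guideline, the current pins can be read straight from pyproject.toml rather than assumed; a one-line sketch (illustrative only):

grep -nE '"llama-stack(-client)?==' pyproject.toml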
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: build-pr
  • GitHub Check: e2e_tests

Comment on lines +31 to +32
"llama-stack==0.2.22",
"llama-stack-client==0.2.22",
Contributor

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Confirm availability of 0.2.22 artifacts.

Pin looks fine, but please double-check that both llama-stack and llama-stack-client 0.2.22 are published and installable to avoid broken builds.

You can verify straight from PyPI with:


🏁 Script executed:

#!/bin/bash
set -euo pipefail
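
# Note: this first attempt yields almost no data (see "Length of output: 62" below);
# a pip-based fallback is run in the next script.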

for pkg in llama-stack llama-stack-client; do
  echo "Checking $pkg"
  uv pip index versions "$pkg" | head -n 20
done

Length of output: 62


🏁 Script executed:

#!/bin/bash
set -euo pipefail

for pkg in llama-stack llama-stack-client; do
  echo "Checking $pkg"
  python3 -m pip index versions "$pkg" 2>&1 | head -n20
done

Length of output: 1176


Correct dependency versions: llama-stack==0.2.22 and llama-stack-client==0.2.22 are not available on PyPI (latest is 0.2.12). Update to existing versions or publish the 0.2.22 release.
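
As an independent cross-check that does not depend on pip's "index" subcommand, PyPI's JSON API can be queried directly; a minimal sketch (assumes curl and jq are available; not part of the bot's actual checks):

for pkg in llama-stack llama-stack-client; do
  echo -n "$pkg 0.2.22 published: "
  curl -s "https://pypi.org/pypi/${pkg}/json" | jq '.releases | has("0.2.22")'
done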

🤖 Prompt for AI Agents
In pyproject.toml around lines 31-32 the dependencies "llama-stack==0.2.22" and
"llama-stack-client==0.2.22" reference a non-existent PyPI release; change them
to an existing published version (e.g. "llama-stack==0.2.12" and
"llama-stack-client==0.2.12" or use a compatible range like "^0.2.12"), then run
your dependency manager (poetry lock/install or pip) to verify resolution and
update the lockfile accordingly.
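
Whichever pins end up in pyproject.toml, re-resolving with the project's dependency manager is the quickest installability check; since this repository carries a uv.lock, a sketch with uv:

uv lock   # re-resolve dependencies and rewrite uv.lock with the edited pins
uv sync   # install the resolved set into the project environment to verify it resolves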

@tisnik force-pushed the lcore-695-bump-up-llama-stack-to-version-0.2.22 branch from 11a42e7 to 1c5c862 on October 2, 2025 at 07:14
@tisnik merged commit 4978e6a into lightspeed-core:main Oct 2, 2025
18 of 19 checks passed