
Conversation


@tisnik tisnik commented Sep 9, 2025

Description

LCORE-633: bump-up Llama Stack version to 0.2.19

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Related Tickets & Documents

  • Related Issue #LCORE-633

Summary by CodeRabbit

  • Chores
    • Updated dependencies: bumped llama-stack and llama-stack-client from 0.2.18 to 0.2.19.
    • Minor dependency version bump intended to maintain compatibility with recent tooling and services.
    • No changes to application features, configuration behavior, or public APIs/exported entities.
    • No user-facing behavior or settings adjustments required.


coderabbitai bot commented Sep 9, 2025

Warning

Rate limit exceeded

@tisnik has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 8 minutes and 0 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between 892c26f and 854b6a1.

📒 Files selected for processing (2)
  • .github/workflows/e2e_tests.yaml (1 hunks)
  • run.yaml (1 hunks)

Walkthrough

Bumps Llama Stack dependency from 0.2.18 to 0.2.19 in pyproject.toml and updates the repository constant MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION to "0.2.19". No other code, API, or behavior changes.

Changes

Cohort / File(s) Summary
Dependency manifest
pyproject.toml
Updated dependencies: llama-stack 0.2.18 → 0.2.19; llama-stack-client 0.2.18 → 0.2.19. No other edits.
Version constant
src/constants.py
Updated MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION from "0.2.18" → "0.2.19". No logic or public API changes.
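
The manifest change presumably amounts to pins like the following (an illustrative excerpt only; the actual pyproject.toml layout and specifier syntax are not shown in this PR):

```toml
# Illustrative excerpt; the real pyproject.toml may differ in structure.
[project]
dependencies = [
    "llama-stack==0.2.19",
    "llama-stack-client==0.2.19",
]
```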

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Suggested reviewers

  • umago

Pre-merge checks (3 passed)

✅ Passed checks (3 passed)
Check name Status Explanation
Title Check ✅ Passed The title concisely describes the primary change—upgrading the Llama Stack dependency to version 0.2.19—and directly aligns with the modifications in the pull request without including unnecessary details.
Description Check ✅ Passed The description clearly outlines the purpose of the pull request by specifying the dependency bump, categorizing the type of change, and linking to the related issue, making it directly relevant to the changeset.
Docstring Coverage ✅ Passed No functions found in the changes. Docstring coverage check skipped.

Poem

I hop a notch, a tiny thing,
From .18 I give a spring.
A carrot bump, dependencies sing—
Soft paws, small change, a brighter ring. 🥕




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 4812089 and 892c26f.

📒 Files selected for processing (1)
  • src/constants.py (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-pr

 # Minimal and maximal supported Llama Stack version
 MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
-MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.18"
+MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.19"

💡 Verification agent

🧩 Analysis chain

LGTM: max supported version bumped to 0.2.19 and matches the PR intent.
Looks consistent with the dependency upgrade. Consider verifying no stale 0.2.18 references and that min/max ordering remains valid.

Run:


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "Check pyproject for llama-stack/client pins (expect 0.2.19):"
rg -nP 'llama-stack(-client)?\s*=?\s*["'\'']?0\.2\.\d+["'\'']?' pyproject.toml || true

echo "Find stale references to 0.2.18 anywhere:"
rg -n '0\.2\.18\b' -g '!**/dist/**' -g '!**/build/**' || true

echo "Validate MIN<=MAX in src/constants.py:"
python - <<'PY'
import re
min_v=max_v=None
for line in open("src/constants.py", encoding="utf-8"):
    if "MINIMAL_SUPPORTED_LLAMA_STACK_VERSION" in line:
        min_v=re.search(r'"([^"]+)"', line).group(1)
    if "MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION" in line:
        max_v=re.search(r'"([^"]+)"', line).group(1)
def parse(v): return tuple(map(int, v.split(".")))
assert parse(min_v)<=parse(max_v), f"Invalid range: {min_v}>{max_v}"
print(f"OK: {min_v}..{max_v}")
PY

Length of output: 309


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "Dependencies in pyproject.toml:"
rg -n '^\s*llama-stack(-client)?' -C2 pyproject.toml || true

echo "Pin versions in pyproject.toml:"
rg -n '0\.2\.\d+' -C0 pyproject.toml || true

echo "Find stale refs to 0.2.18:"
rg -n '0\.2\.18\b' -C2 . || true

Length of output: 3615


Update stale 0.2.18 references in documentation and examples
Multiple example and doc files still pin “0.2.18” (e.g. examples/pyproject.llamastack.toml, src/models/responses.py examples, README.md, docs/openapi.md, docs/openapi.json, docs/output.md, docs/getting_started.md, docs/deployment_guide.md); bump all of these to “0.2.19”. Version range in src/constants.py (0.2.17..0.2.19) is valid.
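
One way to apply that bump is a bulk find-and-replace; the sketch below demonstrates the substitution on a temporary file so it is self-contained, with the assumed repo-wide command shown as a comment (the file list there is taken from the review above, but the exact layout is an assumption):

```shell
# Sketch: rewrite stale 0.2.18 pins to 0.2.19.
# In the repository this would target the files the review lists, e.g.:
#   grep -rl '0\.2\.18' README.md docs examples | xargs sed -i 's/0\.2\.18/0.2.19/g'
# Demonstrated here on a temporary file instead.
tmp=$(mktemp)
printf 'llama-stack==0.2.18\nllama-stack-client==0.2.18\n' > "$tmp"
sed -i.bak 's/0\.2\.18/0.2.19/g' "$tmp"
cat "$tmp"   # both lines now pin 0.2.19
rm -f "$tmp" "$tmp.bak"
```

The `-i.bak` form keeps a backup and works with both GNU and BSD sed; review the diff before committing, since docs may intentionally reference older versions.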

@tisnik
Copy link
Contributor Author

tisnik commented Sep 9, 2025

@radofuchs Hi, where's the problem there? I cannot see the logs from the llama-stack container, so I can't tell what's wrong.

@tisnik tisnik merged commit fdd00e1 into lightspeed-core:main Sep 10, 2025
18 of 19 checks passed