Merged
4 changes: 2 additions & 2 deletions pyproject.toml
@@ -28,8 +28,8 @@ dependencies = [
     # Used by authentication/k8s integration
     "kubernetes>=30.1.0",
     # Used to call Llama Stack APIs
-    "llama-stack==0.2.21",
-    "llama-stack-client==0.2.21",
+    "llama-stack==0.2.22",
+    "llama-stack-client==0.2.22",
Comment on lines +31 to +32
⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Confirm availability of 0.2.22 artifacts.

Pin looks fine, but please double-check that both llama-stack and llama-stack-client 0.2.22 are published and installable to avoid broken builds.

You can verify straight from PyPI with:


🏁 Script executed:

#!/bin/bash
set -euo pipefail

for pkg in llama-stack llama-stack-client; do
  echo "Checking $pkg"
  uv pip index versions "$pkg" | head -n 20
done

Length of output: 62 (no usable version listing from the uv invocation, so the check was retried with pip)


🏁 Script executed:

#!/bin/bash
set -euo pipefail

for pkg in llama-stack llama-stack-client; do
  echo "Checking $pkg"
  python3 -m pip index versions "$pkg" 2>&1 | head -n20
done

Length of output: 1176


Correct dependency versions: llama-stack==0.2.22 and llama-stack-client==0.2.22 are not available on PyPI (latest is 0.2.12). Update to existing versions or publish the 0.2.22 release.

🤖 Prompt for AI Agents
In pyproject.toml around lines 31-32 the dependencies "llama-stack==0.2.22" and
"llama-stack-client==0.2.22" reference a non-existent PyPI release; change them
to an existing published version (e.g. "llama-stack==0.2.12" and
"llama-stack-client==0.2.12" or use a compatible range like "^0.2.12"), then run
your dependency manager (poetry lock/install or pip) to verify resolution and
update the lockfile accordingly.
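
The same check can also be reproduced without pip by querying PyPI's JSON API directly; a minimal Python sketch, assuming network access to pypi.org (illustrative, not part of the executed analysis chain above):

#!/usr/bin/env python3
# Query PyPI's JSON API and report whether the pinned version of each
# package has actually been published. The pin mirrors pyproject.toml.
import json
import urllib.request

PINNED = "0.2.22"

for pkg in ("llama-stack", "llama-stack-client"):
    with urllib.request.urlopen(f"https://pypi.org/pypi/{pkg}/json") as resp:
        releases = json.load(resp)["releases"]
    status = "available" if PINNED in releases else "NOT published"
    print(f"{pkg}=={PINNED}: {status}")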

     # Used by Logger
     "rich>=14.0.0",
     # Used by JWK token auth handler
2 changes: 1 addition & 1 deletion src/constants.py
@@ -2,7 +2,7 @@
 # Minimal and maximal supported Llama Stack version
 MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
-MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.21"
+MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.22"

 UNABLE_TO_PROCESS_RESPONSE = "Unable to process this request"
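
For context, these two constants bound the llama-stack versions the service accepts. A minimal sketch of how such a range check could look, assuming the service compares the version reported by llama-stack against both bounds (the check_version helper is hypothetical, not code from this repository):

# Hypothetical range check built on the two constants above,
# using packaging's PEP 440-aware version comparison.
from packaging.version import Version

MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.22"

def check_version(detected: str) -> None:
    """Raise if the detected llama-stack version falls outside the supported range."""
    low = Version(MINIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    high = Version(MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    if not low <= Version(detected) <= high:
        raise RuntimeError(
            f"llama-stack {detected} is outside the supported range {low}..{high}"
        )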
2 changes: 1 addition & 1 deletion tests/e2e/features/info.feature
@@ -18,7 +18,7 @@ Feature: Info tests
     When I access REST API endpoint "info" using HTTP GET method
     Then The status code of the response is 200
     And The body of the response has proper name Lightspeed Core Service (LCS) and version 0.2.0
-    And The body of the response has llama-stack version 0.2.21
+    And The body of the response has llama-stack version 0.2.22

   Scenario: Check if info endpoint reports error when llama-stack connection is not working
     Given The system is in default state
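
The version step in this scenario maps to a step definition in the e2e suite; a minimal behave-style sketch, assuming the info response exposes the version in a llama_stack_version field (field name and step wiring are assumptions, not taken from this repository):

# Hypothetical behave step asserting the llama-stack version in /info.
from behave import then

@then('The body of the response has llama-stack version {version}')
def check_llama_stack_version(context, version):
    body = context.response.json()
    assert body.get("llama_stack_version") == version, (
        f"expected llama-stack {version}, got {body.get('llama_stack_version')}"
    )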
16 changes: 8 additions & 8 deletions uv.lock

Some generated files are not rendered by default.