4 changes: 2 additions & 2 deletions pyproject.toml
@@ -25,8 +25,8 @@ dependencies = [
     "fastapi>=0.115.6",
     "uvicorn>=0.34.3",
     "kubernetes>=30.1.0",
-    "llama-stack==0.2.16",
-    "llama-stack-client==0.2.16",
+    "llama-stack==0.2.17",
+    "llama-stack-client==0.2.17",
Comment on lines +28 to +29

💡 Verification agent

🧩 Analysis chain

Verify 0.2.17 exists & still supports Python 3.12, and consider relaxing the hard pin

Good to see both libs kept in sync, but two quick checks would de-risk the bump:

  1. Make sure llama-stack==0.2.17 and llama-stack-client==0.2.17 are published on PyPI and list Python 3.12 in their classifiers.
  2. Hard-pinning (==) stops you from receiving future patch releases (0.2.18, 0.2.19…) that may deliver security fixes. A more typical spec is “at least this version but stay on the 0.2 line”.

Example diff:

-    "llama-stack==0.2.17",
-    "llama-stack-client==0.2.17",
+    "llama-stack>=0.2.17,<0.3",
+    "llama-stack-client>=0.2.17,<0.3",

Quick verification script:


🏁 Script executed:

#!/usr/bin/env bash
set -euo pipefail

for pkg in llama-stack llama-stack-client; do
  echo "Checking $pkg 0.2.17 on PyPI…"
  curl -s "https://pypi.org/pypi/${pkg}/json" | jq -r --arg v "0.2.17" \
    '.releases[$v] | if . == null then "NOT FOUND" else "Found" end'
done

Length of output: 580


Let’s verify that both packages declare Python 3.12 in their PyPI classifiers:


🏁 Script executed:

#!/usr/bin/env bash
set -euo pipefail

for pkg in llama-stack llama-stack-client; do
  echo "Checking Python 3.12 support for $pkg 0.2.17…"
  curl -s "https://pypi.org/pypi/${pkg}/json" \
    | jq -r '.info.classifiers[]' \
    | grep -q "Programming Language :: Python :: 3.12" \
    && echo "✔️ $pkg supports Python 3.12" \
    || echo "$pkg does NOT list Python 3.12"
done

Length of output: 802
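
Note that the unversioned endpoint https://pypi.org/pypi/<pkg>/json queried above returns metadata for the latest release rather than 0.2.17 specifically. A version-pinned check could look like the sketch below; it assumes PyPI's per-release JSON endpoint (/pypi/<package>/<version>/json) and is illustrative only, not part of the executed analysis:

#!/usr/bin/env python3
# Sketch: check whether a specific release declares Python 3.12 support.
# Assumes PyPI's per-release JSON endpoint; not part of the original review run.
import json
import urllib.request

VERSION = "0.2.17"

for pkg in ("llama-stack", "llama-stack-client"):
    url = f"https://pypi.org/pypi/{pkg}/{VERSION}/json"
    with urllib.request.urlopen(url) as resp:
        info = json.load(resp)["info"]
    has_312 = "Programming Language :: Python :: 3.12" in info.get("classifiers", [])
    print(f"{pkg} {VERSION}: requires_python={info.get('requires_python')!r}, "
          f"3.12 classifier={'present' if has_312 else 'missing'}")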


Address Python 3.12 support in llama-stack & relax hard pins

Both packages are published at 0.2.17, but the llama-stack release doesn’t list Python 3.12 as supported, which may block installations on 3.12 environments:

• In pyproject.toml (lines 28–29):
  – “llama-stack==0.2.17” exists but ❌ lacks a 3.12 classifier
  – “llama-stack-client==0.2.17” ✅ supports Python 3.12

Please either ensure that llama-stack 0.2.17 actually declares Python 3.12 support or bump to a version that does, and consider loosening the pins so you continue to receive patch releases:

-    "llama-stack==0.2.17",
-    "llama-stack-client==0.2.17",
+    "llama-stack>=0.2.17,<0.3",
+    "llama-stack-client>=0.2.17,<0.3",
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
"llama-stack==0.2.17",
"llama-stack-client==0.2.17",
"llama-stack>=0.2.17,<0.3",
"llama-stack-client>=0.2.17,<0.3",
🤖 Prompt for AI Agents
In pyproject.toml around lines 28 to 29, the dependency "llama-stack==0.2.17"
lacks declared support for Python 3.12, which may cause installation issues. To
fix this, verify if a newer version of "llama-stack" exists that includes Python
3.12 support and update the version accordingly. Also, relax the version pinning
by changing from exact "==" pins to a compatible range (e.g., "~=") to allow
patch updates while ensuring compatibility.
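
For illustration, the compatible-release operator mentioned above would read as follows; per PEP 440, "~=0.2.17" is equivalent to ">=0.2.17, ==0.2.*", so it accepts 0.2.18 but not 0.3.0. This is a sketch of the alternative spec, not part of the committable suggestion above:

-    "llama-stack==0.2.17",
-    "llama-stack-client==0.2.17",
+    "llama-stack~=0.2.17",
+    "llama-stack-client~=0.2.17",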

"rich>=14.0.0",
"cachetools>=6.1.0",
"prometheus-client>=0.22.1",
Expand Down
2 changes: 1 addition & 1 deletion src/client.py
@@ -4,7 +4,7 @@

 from typing import Optional

-from llama_stack.distribution.library_client import (
+from llama_stack import (
     AsyncLlamaStackAsLibraryClient,  # type: ignore
     LlamaStackAsLibraryClient,  # type: ignore
 )
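For context, the changed import path re-exports llama-stack's in-process "library client" classes at the package root. A minimal usage sketch with the new import is shown below; the "ollama" distribution name is a placeholder and this is not code from the repository:

# Minimal sketch using the new top-level import (assumes llama-stack >= 0.2.17
# re-exports these names at the package root, as this PR relies on).
from llama_stack import LlamaStackAsLibraryClient  # type: ignore

client = LlamaStackAsLibraryClient("ollama")  # placeholder distribution/config name
client.initialize()  # build the in-process stack before making API calls
models = client.models.list()  # same surface as the remote LlamaStackClient
print([m.identifier for m in models])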
2 changes: 1 addition & 1 deletion src/utils/common.py
@@ -6,7 +6,7 @@
 from logging import Logger

 from llama_stack_client import LlamaStackClient, AsyncLlamaStackClient
-from llama_stack.distribution.library_client import (
+from llama_stack import (
     AsyncLlamaStackAsLibraryClient,
 )

5 changes: 2 additions & 3 deletions test.containerfile
@@ -8,7 +8,6 @@ ENV PATH="$PATH:/root/.local/bin"
 WORKDIR ${APP_ROOT}
 COPY run.yaml ./

-
 RUN microdnf install -y --nodocs --setopt=keepcache=0 --setopt=tsflags=nodocs \
     python3.12 python3.12-devel python3.12-pip git tar

@@ -17,7 +16,7 @@ RUN curl -LsSf https://astral.sh/uv/install.sh | sh
 RUN uv -h

 RUN uv venv && \
-    uv pip install llama-stack==0.2.16 \
+    uv pip install llama-stack==0.2.17 \
     fastapi \
     opentelemetry-sdk \
     opentelemetry-exporter-otlp \
@@ -36,4 +35,4 @@ RUN uv venv && \
     peft \
     trl

-CMD ["uv", "run", "llama", "stack", "run", "run.yaml"]
+CMD ["uv", "run", "llama", "stack", "run", "run.yaml"]
16 changes: 8 additions & 8 deletions uv.lock

Some generated files are not rendered by default.