Merged
2 changes: 1 addition & 1 deletion README.md
```diff
@@ -244,7 +244,7 @@ version = "0.1.0"
 description = "Llama Stack runner"
 authors = []
 dependencies = [
-"llama-stack==0.2.14",
+"llama-stack==0.2.18",
```
💡 Verification agent

🧩 Analysis chain

Version bump to llama-stack 0.2.18 looks correct.

Matches the PR goal and is consistent with other updated snippets.

Run to ensure no stale references remain:


🏁 Script executed:

```bash
#!/bin/bash
# Find any leftover 0.2.14 references or mismatched llama-stack pins
rg -nP '\b0\.2\.14\b|llama-stack==[0-9]+\.[0-9]+\.[0-9]+' -g '!**/dist/**' -g '!**/.venv/**' -S

# Optional: list all version JSON examples to confirm they read 0.2.18
rg -nP '"version"\s*:\s*"0\.2\.\d+"' docs README.md -S
```

Length of output: 629


Fix stale llama-stack version references

  • examples/pyproject.llamastack.toml: update llama-stack==0.2.17 (line 7) to 0.2.18.
  • docs/getting_started.md: update llama-stack==0.2.17 (line 27) to 0.2.18.
🤖 Prompt for AI Agents
In README.md around line 247, update the stale llama-stack reference by ensuring
consistency across the repo: change any occurrences of "llama-stack==0.2.17" to
"llama-stack==0.2.18" — specifically edit examples/pyproject.llamastack.toml
(line 7) and docs/getting_started.md (line 27) to replace 0.2.17 with 0.2.18 so
all files match the README.
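
The repo-wide replacement this prompt describes can be scripted. A minimal sketch, demonstrated on a scratch file so it is safe to run anywhere (a real run would target the repo files named in the prompt; `sed -i` as written assumes GNU sed):

```shell
# Create a scratch file with a stale pin, rewrite it, and verify the result.
tmpdir=$(mktemp -d)
printf '%s\n' 'dependencies = [' '    "llama-stack==0.2.17",' ']' \
  > "$tmpdir/pyproject.llamastack.toml"

# Rewrite any 0.2.14-0.2.17 pin to 0.2.18 (on macOS/BSD use: sed -i '').
sed -i 's/llama-stack==0\.2\.1[4-7]/llama-stack==0.2.18/g' \
  "$tmpdir/pyproject.llamastack.toml"

grep 'llama-stack==' "$tmpdir/pyproject.llamastack.toml"   # prints the 0.2.18 pin
rm -rf "$tmpdir"
```

Across a real checkout, the same `sed` expression can be applied to every file that `rg -l 'llama-stack=='` reports.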

```diff
 "fastapi>=0.115.12",
 "opentelemetry-sdk>=1.34.0",
 "opentelemetry-exporter-otlp>=1.34.0",
```
6 changes: 3 additions & 3 deletions docs/deployment_guide.md
````diff
@@ -390,7 +390,7 @@ cp examples/run.yaml /tmp/llama-stack-server
 The output should be in this form:
 ```json
 {
-"version": "0.2.14"
+"version": "0.2.18"
 }
 ```
````

```diff
@@ -676,7 +676,7 @@ a4982f43195537b9eb1cec510fe6655f245d6d4b7236a4759808115d5d719972
 description = "Default template for PDM package"
 authors = []
 dependencies = [
-"llama-stack==0.2.14",
+"llama-stack==0.2.18",
 "fastapi>=0.115.12",
 "opentelemetry-sdk>=1.34.0",
 "opentelemetry-exporter-otlp>=1.34.0",
```
````diff
@@ -1071,7 +1071,7 @@ models:
 The output should be in this form:
 ```json
 {
-"version": "0.2.14"
+"version": "0.2.18"
 }
 ```
````

2 changes: 1 addition & 1 deletion docs/getting_started.md
````diff
@@ -24,7 +24,7 @@ It is possible to run Lightspeed Core Stack service with Llama Stack "embedded"
 1. Add and install all required dependencies
 ```bash
 uv add \
-"llama-stack==0.2.17" \
+"llama-stack==0.2.18" \
````
💡 Verification agent

🧩 Analysis chain

Version bump to llama-stack==0.2.18: LGTM; verify repo-wide consistency.

Looks good. Please also scan for any lingering 0.2.14–0.2.17 references in docs/examples and confirm 0.2.18 is on PyPI.

Run:


🏁 Script executed:

```bash
#!/bin/bash
set -euo pipefail

echo "Current PyPI version:"
curl -s https://pypi.org/pypi/llama-stack/json | jq -r '.info.version'

echo
echo "Stale version references (expect none):"
rg -nP --glob '!**/dist/**' --glob '!**/build/**' 'llama-stack==0\.2\.(1[0-7]|0?\d)' || true

echo
echo "All pinned occurrences (should mostly be 0.2.18):"
rg -n 'llama-stack=='
```

Length of output: 1003


Bump llama-stack dependency to 0.2.20 across docs and examples.

PyPI's latest llama-stack release is 0.2.20; update every llama-stack==0.2.18 occurrence (README.md:247, pyproject.toml:31, examples/pyproject.llamastack.toml:7, docs/getting_started.md:27, docs/deployment_guide.md:679) to ==0.2.20 and verify that no pins to 0.2.14–0.2.17 remain.

🤖 Prompt for AI Agents
In docs/getting_started.md around line 27, the llama-stack dependency is pinned
to "llama-stack==0.2.18"; update this occurrence to "llama-stack==0.2.20" and
save the file. After making this change, search the repo for any remaining pins
to 0.2.14–0.2.18 and replace them with 0.2.20 (specifically check README.md line
~247, pyproject.toml line ~31, examples/pyproject.llamastack.toml line ~7, and
docs/deployment_guide.md line ~679) to ensure all references are consistent.
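
The consistency check the prompt asks for can be sketched as a small pipeline that extracts every pin and flags anything differing from a target version (the 0.2.20 target comes from the review comment; the sample input line is illustrative):

```shell
target="0.2.20"
sample='"llama-stack==0.2.18",'

# Extract each pinned version and compare it to the target.
echo "$sample" \
  | grep -oE 'llama-stack==[0-9]+\.[0-9]+\.[0-9]+' \
  | sed 's/.*==//' \
  | while read -r v; do
      if [ "$v" = "$target" ]; then echo "ok: $v"; else echo "stale: $v"; fi
    done
# prints: stale: 0.2.18
```

Feeding it real repo content instead of `$sample` (e.g. via `rg -N 'llama-stack=='`) turns it into the repo-wide scan the prompt describes.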

```diff
 "fastapi>=0.115.12" \
 "opentelemetry-sdk>=1.34.0" \
 "opentelemetry-exporter-otlp>=1.34.0" \
```
2 changes: 1 addition & 1 deletion examples/pyproject.llamastack.toml
```diff
@@ -4,7 +4,7 @@ version = "0.1.0"
 description = "Default template for PDM package"
 authors = []
 dependencies = [
-"llama-stack==0.2.17",
+"llama-stack==0.2.18",
 "fastapi>=0.115.12",
 "opentelemetry-sdk>=1.34.0",
 "opentelemetry-exporter-otlp>=1.34.0",
```