2 changes: 1 addition & 1 deletion README.md
@@ -245,7 +245,7 @@ version = "0.1.0"
description = "Llama Stack runner"
authors = []
dependencies = [
"llama-stack==0.2.19",
"llama-stack==0.2.20",
"fastapi>=0.115.12",
"opentelemetry-sdk>=1.34.0",
"opentelemetry-exporter-otlp>=1.34.0",
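Since every touched manifest pins the same release, a runtime check is a quick way to confirm an environment actually resolved the new pin. A minimal sketch using only the standard library (not part of this PR):

```python
# Minimal sanity check: confirm the installed llama-stack distribution
# matches the pin introduced in this PR. Standard library only.
from importlib.metadata import version

installed = version("llama-stack")
assert installed == "0.2.20", f"expected 0.2.20, found {installed}"
```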
2 changes: 1 addition & 1 deletion docs/deployment_guide.md
@@ -390,7 +390,7 @@ cp examples/run.yaml /tmp/llama-stack-server
The output should be in this form:
```json
{
"version": "0.2.19"
"version": "0.2.20"
}
```
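One way to reproduce that output is to query the server's version route. A sketch, assuming the Llama Stack server is running locally on its default port 8321 and exposes a `/v1/version` route (both are assumptions, not stated in this diff):

```python
# Query the (assumed) /v1/version route on the (assumed) default port
# 8321 and check the reported release. Standard library only.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8321/v1/version") as resp:
    payload = json.load(resp)

assert payload["version"] == "0.2.20", payload
```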

2 changes: 1 addition & 1 deletion docs/getting_started.md
@@ -24,7 +24,7 @@ It is possible to run Lightspeed Core Stack service with Llama Stack "embedded"
1. Add and install all required dependencies
```bash
uv add \
"llama-stack==0.2.19" \
"llama-stack==0.2.20" \
"fastapi>=0.115.12" \
"opentelemetry-sdk>=1.34.0" \
"opentelemetry-exporter-otlp>=1.34.0" \
2 changes: 1 addition & 1 deletion docs/openapi.json
@@ -1716,7 +1716,7 @@
"llama_stack_version"
],
"title": "InfoResponse",
"description": "Model representing a response to an info request.\n\nAttributes:\n name: Service name.\n service_version: Service version.\n llama_stack_version: Llama Stack version.\n\nExample:\n ```python\n info_response = InfoResponse(\n name=\"Lightspeed Stack\",\n service_version=\"1.0.0\",\n llama_stack_version=\"0.2.19\",\n )\n ```",
"description": "Model representing a response to an info request.\n\nAttributes:\n name: Service name.\n service_version: Service version.\n llama_stack_version: Llama Stack version.\n\nExample:\n ```python\n info_response = InfoResponse(\n name=\"Lightspeed Stack\",\n service_version=\"1.0.0\",\n llama_stack_version=\"0.2.20\",\n )\n ```",
"examples": [
{
"llama_stack_version": "1.0.0",
4 changes: 2 additions & 2 deletions docs/openapi.md
@@ -779,7 +779,7 @@ Example:
llm_response="You need to use Docker and Kubernetes for everything.",
user_feedback="This response is too general and doesn't provide specific steps.",
sentiment=-1,
-categories=["incomplete", "not_relevant"]
+categories=[FeedbackCategory.INCORRECT, FeedbackCategory.INCOMPLETE]
)
```
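The updated example replaces free-form category strings with `FeedbackCategory` enum members. For orientation, a hypothetical sketch of what such an enum might look like; the actual definition lives in the service's models and may differ in members and values:

```python
from enum import Enum


# Hypothetical sketch of FeedbackCategory; the member names INCORRECT
# and INCOMPLETE appear in the updated example above, the rest is assumed.
class FeedbackCategory(str, Enum):
    INCORRECT = "incorrect"
    INCOMPLETE = "incomplete"
    NOT_RELEVANT = "not_relevant"
```

Using enum members instead of raw strings lets the request model validate categories up front rather than deferring typos to downstream consumers.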

@@ -907,7 +907,7 @@ Example:
info_response = InfoResponse(
name="Lightspeed Stack",
service_version="1.0.0",
llama_stack_version="0.2.19",
llama_stack_version="0.2.20",
)
```

2 changes: 1 addition & 1 deletion docs/output.md
@@ -898,7 +898,7 @@ Example:
info_response = InfoResponse(
name="Lightspeed Stack",
service_version="1.0.0",
llama_stack_version="0.2.19",
llama_stack_version="0.2.20",
)
```

2 changes: 1 addition & 1 deletion examples/pyproject.llamastack.toml
@@ -4,7 +4,7 @@ version = "0.1.0"
description = "Default template for PDM package"
authors = []
dependencies = [
"llama-stack==0.2.19",
"llama-stack==0.2.20",
"fastapi>=0.115.12",
"opentelemetry-sdk>=1.34.0",
"opentelemetry-exporter-otlp>=1.34.0",
2 changes: 1 addition & 1 deletion src/models/responses.py
@@ -92,7 +92,7 @@ class InfoResponse(BaseModel):
info_response = InfoResponse(
name="Lightspeed Stack",
service_version="1.0.0",
llama_stack_version="0.2.19",
llama_stack_version="0.2.20",
)
```
"""
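A short usage sketch of the documented example; the import path is hypothetical, and `model_dump_json()` assumes the project is on Pydantic v2 (suggested by its FastAPI pin, but not confirmed by this diff):

```python
# Hypothetical import path; adjust to wherever InfoResponse lives.
from models.responses import InfoResponse

info = InfoResponse(
    name="Lightspeed Stack",
    service_version="1.0.0",
    llama_stack_version="0.2.20",
)
# Pydantic v2 serialization of the model to a JSON string.
print(info.model_dump_json())
```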
2 changes: 1 addition & 1 deletion tests/e2e/features/info.feature
@@ -18,7 +18,7 @@ Feature: Info tests
When I access REST API endpoint "info" using HTTP GET method
Then The status code of the response is 200
And The body of the response has proper name Lightspeed Core Service (LCS) and version 0.2.0
-And The body of the response has llama-stack version 0.2.19
+And The body of the response has llama-stack version 0.2.20

Scenario: Check if info endpoint reports error when llama-stack connection is not working
Given The system is in default state
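For context, a hypothetical behave step that could back the updated Gherkin line; the real step implementation in tests/e2e may differ, and `context.response` is assumed to have been stored by the preceding When step:

```python
from behave import then


# Hypothetical step definition; behave's default "parse" matcher binds
# {expected} to the version literal from the feature file.
@then('The body of the response has llama-stack version {expected}')
def check_llama_stack_version(context, expected):
    body = context.response.json()  # response assumed set by a When step
    assert body["llama_stack_version"] == expected
```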