Commit ce13e0f

release: 0.1.0-alpha.3 (#4)
Automated Release PR

---

## 0.1.0-alpha.3 (2025-06-27)

Full Changelog: [v0.1.0-alpha.2...v0.1.0-alpha.3](v0.1.0-alpha.2...v0.1.0-alpha.3)

### Features

* **api:** update via SDK Studio ([e87f225](e87f225))
* make custom code changes ([#3](#3)) ([83fa371](83fa371))

---

This pull request is managed by Stainless's [GitHub App](https://github.com/apps/stainless-app). The [semver version number](https://semver.org/#semantic-versioning-specification-semver) is based on included [commit messages](https://www.conventionalcommits.org/en/v1.0.0/). Alternatively, you can manually set the version number in the title of this pull request.

For a better experience, it is recommended to use either rebase-merge or squash-merge when merging this pull request.

🔗 Stainless [website](https://www.stainlessapi.com)
📚 Read the [docs](https://app.stainlessapi.com/docs)
🙋 [Reach out](mailto:support@stainlessapi.com) for help or questions

Co-authored-by: stainless-app[bot] <142633134+stainless-app[bot]@users.noreply.github.com>
1 parent 83fa371 commit ce13e0f

File tree

10 files changed: +27 additions, −18 deletions


.github/workflows/publish-pypi.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -28,4 +28,4 @@ jobs:
       run: |
         bash ./bin/publish-pypi
       env:
-        PYPI_TOKEN: ${{ secrets.LLAMA_STACK_PYPI_TOKEN || secrets.PYPI_TOKEN }}
+        PYPI_TOKEN: ${{ secrets.LLAMA_STACK_CLIENT_PYPI_TOKEN || secrets.PYPI_TOKEN }}
```

.github/workflows/release-doctor.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -18,4 +18,4 @@ jobs:
       run: |
         bash ./bin/check-release-environment
       env:
-        PYPI_TOKEN: ${{ secrets.LLAMA_STACK_PYPI_TOKEN || secrets.PYPI_TOKEN }}
+        PYPI_TOKEN: ${{ secrets.LLAMA_STACK_CLIENT_PYPI_TOKEN || secrets.PYPI_TOKEN }}
```
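Both workflow changes rename the preferred secret while keeping the old one as a fallback via the Actions `||` operator. A minimal Python sketch of that first-non-empty-wins behavior (the function name is illustrative, not part of the repo):

```python
def effective_token(client_token: str, legacy_token: str) -> str:
    # Mirrors `${{ secrets.LLAMA_STACK_CLIENT_PYPI_TOKEN || secrets.PYPI_TOKEN }}`:
    # in GitHub Actions expressions, `||` evaluates to the first truthy
    # (non-empty) operand, so repos that only define the legacy PYPI_TOKEN
    # secret keep publishing without any changes.
    return client_token or legacy_token


print(effective_token("", "legacy-token"))           # legacy-token
print(effective_token("new-token", "legacy-token"))  # new-token
```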

.release-please-manifest.json

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,3 +1,3 @@
 {
-    ".": "0.1.0-alpha.2"
+    ".": "0.1.0-alpha.3"
 }
```

.stats.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
 configured_endpoints: 96
 openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/llamastack%2Fllama-stack-client-df7a19394e9124c18ec4e888e2856d22b5ebfd6fe6fe6e929ff6cfadb2ae7e2a.yml
 openapi_spec_hash: 9428682672fdd7e2afee7af9ef849dc9
-config_hash: c2377844063fe8b7c43d8b79522fa6fc
+config_hash: 3e9fdf542184399384ed713426a8065c
```

CHANGELOG.md

Lines changed: 9 additions & 0 deletions

```diff
@@ -1,5 +1,14 @@
 # Changelog
 
+## 0.1.0-alpha.3 (2025-06-27)
+
+Full Changelog: [v0.1.0-alpha.2...v0.1.0-alpha.3](https://github.com/llamastack/llama-stack-client-python/compare/v0.1.0-alpha.2...v0.1.0-alpha.3)
+
+### Features
+
+* **api:** update via SDK Studio ([e87f225](https://github.com/llamastack/llama-stack-client-python/commit/e87f2257b00a287dd34dc95f4d39661728075891))
+* make custom code changes ([#3](https://github.com/llamastack/llama-stack-client-python/issues/3)) ([83fa371](https://github.com/llamastack/llama-stack-client-python/commit/83fa37124133aab73bf2bbbdcd39338b9a192475))
+
 ## 0.1.0-alpha.2 (2025-06-27)
 
 Full Changelog: [v0.1.0-alpha.1...v0.1.0-alpha.2](https://github.com/llamastack/llama-stack-client-python/compare/v0.1.0-alpha.1...v0.1.0-alpha.2)
```

README.md

Lines changed: 4 additions & 4 deletions

````diff
@@ -39,7 +39,7 @@ print(model.identifier)
 
 While you can provide an `api_key` keyword argument,
 we recommend using [python-dotenv](https://pypi.org/project/python-dotenv/)
-to add `LLAMA_STACK_API_KEY="My API Key"` to your `.env` file
+to add `LLAMA_STACK_CLIENT_API_KEY="My API Key"` to your `.env` file
 so that your API Key is not stored in source control.
 
 ## Async usage
@@ -309,10 +309,10 @@ Note that requests that time out are [retried twice by default](#retries).
 
 We use the standard library [`logging`](https://docs.python.org/3/library/logging.html) module.
 
-You can enable logging by setting the environment variable `LLAMA_STACK_LOG` to `info`.
+You can enable logging by setting the environment variable `LLAMA_STACK_CLIENT_LOG` to `info`.
 
 ```shell
-$ export LLAMA_STACK_LOG=info
+$ export LLAMA_STACK_CLIENT_LOG=info
 ```
 
 Or to `debug` for more verbose logging.
@@ -425,7 +425,7 @@ import httpx
 from llama_stack_client import LlamaStackClient, DefaultHttpxClient
 
 client = LlamaStackClient(
-    # Or use the `LLAMA_STACK_BASE_URL` env var
+    # Or use the `LLAMA_STACK_CLIENT_BASE_URL` env var
     base_url="http://my.test.server.example.com:8083",
     http_client=DefaultHttpxClient(
         proxy="http://my.test.proxy.example.com",
````
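The README now recommends putting `LLAMA_STACK_CLIENT_API_KEY` in a `.env` file. A rough, stdlib-only sketch of what loading that entry amounts to (a real project would use python-dotenv as the README suggests; the file name and the by-hand parsing here are purely illustrative):

```python
import os
from pathlib import Path

# Write a one-line .env file with the renamed key (hypothetical file name).
env_file = Path("example.env")
env_file.write_text('LLAMA_STACK_CLIENT_API_KEY="My API Key"\n')

# Split on the first "=" and strip the surrounding quotes.
key, _, value = env_file.read_text().strip().partition("=")
os.environ[key] = value.strip('"')

print(os.environ["LLAMA_STACK_CLIENT_API_KEY"])  # My API Key
env_file.unlink()
```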

pyproject.toml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,6 +1,6 @@
 [project]
 name = "llama_stack_client"
-version = "0.1.0-alpha.2"
+version = "0.1.0-alpha.3"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "Apache-2.0"
```

src/llama_stack_client/_client.py

Lines changed: 6 additions & 6 deletions

```diff
@@ -131,14 +131,14 @@ def __init__(
     ) -> None:
         """Construct a new synchronous LlamaStackClient client instance.
 
-        This automatically infers the `api_key` argument from the `LLAMA_STACK_API_KEY` environment variable if it is not provided.
+        This automatically infers the `api_key` argument from the `LLAMA_STACK_CLIENT_API_KEY` environment variable if it is not provided.
         """
         if api_key is None:
-            api_key = os.environ.get("LLAMA_STACK_API_KEY")
+            api_key = os.environ.get("LLAMA_STACK_CLIENT_API_KEY")
         self.api_key = api_key
 
         if base_url is None:
-            base_url = os.environ.get("LLAMA_STACK_BASE_URL")
+            base_url = os.environ.get("LLAMA_STACK_CLIENT_BASE_URL")
         if base_url is None:
             base_url = f"http://any-hosted-llama-stack.com"
 
@@ -355,14 +355,14 @@ def __init__(
     ) -> None:
         """Construct a new async AsyncLlamaStackClient client instance.
 
-        This automatically infers the `api_key` argument from the `LLAMA_STACK_API_KEY` environment variable if it is not provided.
+        This automatically infers the `api_key` argument from the `LLAMA_STACK_CLIENT_API_KEY` environment variable if it is not provided.
         """
         if api_key is None:
-            api_key = os.environ.get("LLAMA_STACK_API_KEY")
+            api_key = os.environ.get("LLAMA_STACK_CLIENT_API_KEY")
         self.api_key = api_key
 
         if base_url is None:
-            base_url = os.environ.get("LLAMA_STACK_BASE_URL")
+            base_url = os.environ.get("LLAMA_STACK_CLIENT_BASE_URL")
         if base_url is None:
             base_url = f"http://any-hosted-llama-stack.com"
 
```

src/llama_stack_client/_utils/_logs.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -16,7 +16,7 @@ def _basic_config() -> None:
 
 
 def setup_logging() -> None:
-    env = os.environ.get("LLAMA_STACK_LOG")
+    env = os.environ.get("LLAMA_STACK_CLIENT_LOG")
     if env == "debug":
         _basic_config()
         logger.setLevel(logging.DEBUG)
```
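A self-contained sketch of the renamed logging switch, approximating what `setup_logging` does (the `_basic_config` format string from the real module is omitted; `logging.basicConfig()` stands in for it):

```python
import logging
import os

logger = logging.getLogger("llama_stack_client")


def setup_logging() -> None:
    # The level comes from LLAMA_STACK_CLIENT_LOG: "debug" or "info";
    # any other value leaves logging configuration untouched.
    env = os.environ.get("LLAMA_STACK_CLIENT_LOG")
    if env == "debug":
        logging.basicConfig()
        logger.setLevel(logging.DEBUG)
    elif env == "info":
        logging.basicConfig()
        logger.setLevel(logging.INFO)


os.environ["LLAMA_STACK_CLIENT_LOG"] = "info"
setup_logging()
print(logging.getLevelName(logger.level))  # INFO
```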

tests/test_client.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -523,7 +523,7 @@ def test_base_url_setter(self) -> None:
         assert client.base_url == "https://example.com/from_setter/"
 
     def test_base_url_env(self) -> None:
-        with update_env(LLAMA_STACK_BASE_URL="http://localhost:5000/from/env"):
+        with update_env(LLAMA_STACK_CLIENT_BASE_URL="http://localhost:5000/from/env"):
             client = LlamaStackClient(_strict_response_validation=True)
             assert client.base_url == "http://localhost:5000/from/env/"
 
@@ -1337,7 +1337,7 @@ def test_base_url_setter(self) -> None:
         assert client.base_url == "https://example.com/from_setter/"
 
     def test_base_url_env(self) -> None:
-        with update_env(LLAMA_STACK_BASE_URL="http://localhost:5000/from/env"):
+        with update_env(LLAMA_STACK_CLIENT_BASE_URL="http://localhost:5000/from/env"):
             client = AsyncLlamaStackClient(_strict_response_validation=True)
             assert client.base_url == "http://localhost:5000/from/env/"
 
```
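The tests scope the environment change through an `update_env` helper, which is why the rename must also land in the test file: the helper sets exactly the names it is given and restores the previous state on exit. A hypothetical stand-in with the same shape:

```python
import os
from contextlib import contextmanager


@contextmanager
def update_env(**new_vars: str):
    # Stand-in for the test suite's `update_env` helper: temporarily set
    # environment variables, then restore (or remove) them afterwards so
    # each test sees a clean environment.
    saved = {name: os.environ.get(name) for name in new_vars}
    os.environ.update(new_vars)
    try:
        yield
    finally:
        for name, old in saved.items():
            if old is None:
                os.environ.pop(name, None)
            else:
                os.environ[name] = old


os.environ.pop("LLAMA_STACK_CLIENT_BASE_URL", None)
with update_env(LLAMA_STACK_CLIENT_BASE_URL="http://localhost:5000/from/env"):
    print(os.environ["LLAMA_STACK_CLIENT_BASE_URL"])  # http://localhost:5000/from/env
print("LLAMA_STACK_CLIENT_BASE_URL" in os.environ)  # False
```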
