From 8d1801bd78b00ff08d25331a9e0b8b11848b95f0 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 11:51:04 +0200 Subject: [PATCH 01/13] Reorganize LLM documentation around Agent SDK Add Agent SDK language model pages and link OpenHands guides to them. --- docs.json | 10 +++- openhands/usage/llms/llms.mdx | 17 ++++-- sdk/llms/configuration.mdx | 96 ++++++++++++++++++++++++++++++ sdk/llms/index.mdx | 108 ++++++++++++++++++++++++++++++++++ sdk/llms/providers.mdx | 27 +++++++++ 5 files changed, 253 insertions(+), 5 deletions(-) create mode 100644 sdk/llms/configuration.mdx create mode 100644 sdk/llms/index.mdx create mode 100644 sdk/llms/providers.mdx diff --git a/docs.json b/docs.json index ffc68d62..cb8a661b 100644 --- a/docs.json +++ b/docs.json @@ -176,7 +176,15 @@ { "tab": "Agent SDK (v1)", "pages": [ - "sdk/index" + "sdk/index", + { + "group": "Language Models", + "pages": [ + "sdk/llms/index", + "sdk/llms/configuration", + "sdk/llms/providers" + ] + } ] }, { diff --git a/openhands/usage/llms/llms.mdx b/openhands/usage/llms/llms.mdx index b92bf49b..8f027e6e 100644 --- a/openhands/usage/llms/llms.mdx +++ b/openhands/usage/llms/llms.mdx @@ -7,6 +7,12 @@ description: OpenHands can connect to any LLM supported by LiteLLM. However, it This section is for users who want to connect OpenHands to different LLMs. + +OpenHands now delegates all LLM orchestration to the Agent SDK. The guidance on this +page focuses on how the OpenHands interfaces surface those capabilities. When in doubt, refer to the SDK documentation +for the canonical list of supported parameters. + + ## Model Recommendations Based on our evaluations of language models for coding tasks (using the SWE-bench dataset), we can provide some @@ -54,16 +60,16 @@ models driving it. However, if you do find ones that work, please add them to th ## LLM Configuration -The following can be set in the OpenHands UI through the Settings: +The following can be set in the OpenHands UI through the Settings. 
Each option is mapped onto the
+[SDK `LLM` configuration schema](/sdk/llms/configuration) before being passed to the Agent SDK:
 
 - `LLM Provider`
 - `LLM Model`
 - `API Key`
 - `Base URL` (through `Advanced` settings)
 
-There are some settings that may be necessary for some LLMs/providers that cannot be set through the UI. Instead, these
-can be set through environment variables passed to the docker run command when starting the app
-using `-e`:
+There are some settings that may be necessary for certain providers that cannot be set directly through the UI. Set them
+as environment variables (or add them to your `config.toml`) so the SDK can ingest them on startup:
 
 - `LLM_API_VERSION`
 - `LLM_EMBEDDING_MODEL`
@@ -86,6 +92,9 @@ We have a few guides for running OpenHands with specific model providers:
 - [OpenHands](/openhands/usage/llms/openhands-llms)
 - [OpenRouter](/openhands/usage/llms/openrouter)
 
+These pages remain the authoritative provider references for both the Agent SDK
+and the OpenHands interfaces.
+
 ## Model Customization
 
 LLM providers have specific settings that can be customized to optimize their performance with OpenHands, such as:
diff --git a/sdk/llms/configuration.mdx b/sdk/llms/configuration.mdx
new file mode 100644
index 00000000..f9831f7e
--- /dev/null
+++ b/sdk/llms/configuration.mdx
@@ -0,0 +1,96 @@
+---
+title: LLM configuration details
+description: Configure LLM objects in the Agent SDK and reuse them across OpenHands interfaces.
+---
+
+This page expands on how to configure `openhands.sdk.llm.LLM` instances. The
+same options back every interface built on top of the Agent SDK—including the
+OpenHands UI—so documenting them here keeps the configuration story consistent.
+
+## Environment variable layout
+
+The SDK reads environment variables prefixed with `LLM_`. `LLM.load_from_env()`
+lowercases each name, strips the prefix, and maps it onto the matching `LLM` field.
+ +```bash +export LLM_MODEL="anthropic/claude-sonnet-4.1" +export LLM_API_KEY="sk-ant-123" +export LLM_SERVICE_ID="primary" +export LLM_TIMEOUT="120" +export LLM_NUM_RETRIES="5" +``` + +Then, in Python: + +```python +from openhands.sdk import LLM + +llm = LLM.load_from_env() +``` + +The loader automatically casts values into integers, floats, booleans, JSON, or +`SecretStr` where appropriate. Unknown variables are ignored, so it is safe to +store extra settings alongside LLM keys in `.env` files or secret managers. + +## JSON configuration + +For declarative deployments you can persist the SDK model schema to JSON: + +```python +from pydantic import SecretStr + +llm = LLM( + model="openai/gpt-4o", + api_key=SecretStr("sk-openai"), + temperature=0.1, +) +with open("config/llm.json", "w") as fp: + fp.write(llm.model_dump_json(exclude_none=True, indent=2)) + +reloaded = LLM.load_from_json("config/llm.json") +``` + +Serialized structures redact secrets (API keys, AWS credentials). Combine the +JSON file with environment variables for secrets when your runtime requires +human review of committed configuration. + +## Commonly tuned parameters + +- **Latency & retry controls**: `timeout`, `num_retries`, `retry_min_wait`, + `retry_max_wait`, and `retry_multiplier` govern LiteLLM retry behavior across + providers. +- **Prompt shaping**: `temperature`, `top_p`, `top_k`, `reasoning_effort`, and + `extended_thinking_budget` adjust sampling characteristics and Anthropic + reasoning budgets. +- **Cost reporting**: `input_cost_per_token` and `output_cost_per_token` flow + into SDK telemetry so downstream interfaces can display usage estimates. +- **Provider specific knobs**: fields such as `api_version` (Azure), + `ollama_base_url`, `custom_llm_provider`, and `openrouter_site_url` allow + compatibility with LiteLLM's provider adapters without custom glue code. 
+ +Refer to the docstring within +[`openhands.sdk.llm.LLM`](https://github.com/All-Hands-AI/agent-sdk/blob/main/openhands-sdk/openhands/sdk/llm/llm.py) +for the full schema. All fields can be set programmatically or via environment +variables using the naming rule `field -> LLM_FIELD`. + +## Multiple profiles + +Many applications require separate model profiles (for example, a premium coding +model and a cheaper summarizer). Pair environment prefixes with registry +service IDs: + +```bash +export CODER_LLM_MODEL="anthropic/claude-sonnet-4.1" +export CODER_LLM_API_KEY="sk-ant" +export SUMMARIZER_LLM_MODEL="openai/gpt-4o-mini" +export SUMMARIZER_LLM_API_KEY="sk-openai" +``` + +```python +coder = LLM.load_from_env(prefix="CODER_LLM_").model_copy(update={"service_id": "coder"}) +summarizer = LLM.load_from_env(prefix="SUMMARIZER_LLM_").model_copy(update={"service_id": "summarizer"}) +``` + +These instances can then be registered with `LLMRegistry` and shared across +agents or microservices. The OpenHands UI uses the same pattern when end users +create named profiles. diff --git a/sdk/llms/index.mdx b/sdk/llms/index.mdx new file mode 100644 index 00000000..9feb6da4 --- /dev/null +++ b/sdk/llms/index.mdx @@ -0,0 +1,108 @@ +--- +title: Language Models +description: Understand how the Agent SDK manages language models and how OpenHands interfaces build on top of it. +--- + +The Agent SDK owns all language model (LLM) orchestration in OpenHands **v1**. The +standalone OpenHands repository now focuses on user interfaces (web app, CLI, +cloud) that talk to the SDK. This page explains how LLMs fit into that new +architecture and shows the core APIs you will use when building your own +interfaces or automations. + +## Architecture overview + +- **Agent SDK = source of truth.** The SDK defines the `LLM` model, request + pipeline, retries, telemetry, and registry. 
+- **Interfaces reuse the same LLM objects.** The OpenHands UI or CLI simply + hydrate an SDK `LLM` from persisted settings and pass it to an agent. +- **Consistent configuration.** Whether you launch an agent programmatically or + via the OpenHands UI, the supported parameters and defaults come from the SDK. + +```mermaid +graph LR + User[User or automation] + UI[OpenHands UI / CLI] + SDK[Agent SDK] + LLMService[(External LLM providers)] + + User --> UI + UI --> SDK + SDK -->|LLM instances| SDK + SDK --> LLMService +``` + +## Creating LLM instances + +Use the [`openhands.sdk.llm.LLM`](https://github.com/All-Hands-AI/agent-sdk/blob/main/openhands-sdk/openhands/sdk/llm/llm.py) +class to configure model access. The only required field is the `model` name; +other options (API keys, retry tuning, response tracing) are optional. + +```python +from pydantic import SecretStr +from openhands.sdk import LLM + +llm = LLM( + model="anthropic/claude-sonnet-4.1", + api_key=SecretStr("sk-ant-123"), + service_id="primary", + temperature=0.1, + timeout=120, +) +``` + +Key concepts: + +- **`service_id`** identifies an LLM configuration when storing it in a registry + or persisting it between runs. +- **Retry settings** (`num_retries`, `retry_min_wait`, etc.) apply uniformly to + all providers through LiteLLM. +- **Cost metadata** (`input_cost_per_token`, `output_cost_per_token`) feeds into + SDK telemetry and logs for downstream UIs. + +## Loading configuration from environments or files + +Use helper constructors when you need to rehydrate an LLM from configuration +state: + +```python +llm = LLM.load_from_env(prefix="LLM_") +``` + +This reads environment variables such as `LLM_MODEL`, `LLM_API_KEY`, or +`LLM_TIMEOUT` and casts them into the appropriate types. Interfaces like the +OpenHands UI persist settings using this convention so that the SDK can read them +without additional glue code. For JSON based workflows the SDK also exposes +`LLM.load_from_json("config/llm.json")`. 
+ +Learn more about configuration options in [LLM configuration details](./configuration). + +## Managing multiple LLMs + +When an application needs several LLM profiles (for example, a high-quality +model for coding and a cheaper summarizer) use the +[`LLMRegistry`](https://github.com/All-Hands-AI/agent-sdk/blob/main/openhands-sdk/openhands/sdk/llm/llm_registry.py): + +```python +from openhands.sdk.llm import LLM, LLMRegistry + +registry = LLMRegistry() +registry.add(LLM(model="anthropic/claude-sonnet-4.1", service_id="primary")) +registry.add(LLM(model="openai/gpt-4o-mini", service_id="summaries")) + +code_llm = registry.get("primary") +summarizer = registry.get("summaries") +``` + +Registries ensure that agents launched by different parts of your application +reuse the same configuration objects, making retries and telemetry consistent. + +## Relationship with OpenHands interfaces + +The OpenHands repository (UI/CLI) now consumes these SDK APIs. When you save LLM +settings in the UI they are stored using the same environment variable schema +and serialized into `LLM` objects before each run. That means any customization +documented here automatically applies to the OpenHands products. + +Provider-specific guidance—pricing summaries, required parameters, or proxy +setups—remains valid and lives alongside the existing OpenHands documentation. +See [LLM provider guides](./providers) for links to those pages. diff --git a/sdk/llms/providers.mdx b/sdk/llms/providers.mdx new file mode 100644 index 00000000..6e38bb3b --- /dev/null +++ b/sdk/llms/providers.mdx @@ -0,0 +1,27 @@ +--- +title: LLM provider guides +description: Provider-specific notes for configuring Agent SDK and OpenHands interfaces. +--- + +Provider integrations remain shared between the Agent SDK and the OpenHands UI. +The pages linked below live under the historical OpenHands section but apply +verbatim to SDK applications because both layers wrap the same +`openhands.sdk.llm.LLM` interface. 
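+
+As a sketch of that reuse, the parameters a guide documents pass straight into
+an `LLM` instance. The model name, API key, and base URL below are placeholders
+rather than values from any particular guide:
+
+```python
+from pydantic import SecretStr
+
+from openhands.sdk import LLM
+
+# Placeholder values; substitute the ones documented by your provider guide.
+llm = LLM(
+    model="provider/model-name",
+    api_key=SecretStr("your-api-key"),
+    base_url="https://api.example-provider.com/v1",
+)
+```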
+ +| Provider / scenario | Documentation | +| --- | --- | +| OpenHands hosted models | [/openhands/usage/llms/openhands-llms](/openhands/usage/llms/openhands-llms) | +| OpenAI | [/openhands/usage/llms/openai-llms](/openhands/usage/llms/openai-llms) | +| Azure OpenAI | [/openhands/usage/llms/azure-llms](/openhands/usage/llms/azure-llms) | +| Google Gemini / Vertex | [/openhands/usage/llms/google-llms](/openhands/usage/llms/google-llms) | +| Groq | [/openhands/usage/llms/groq](/openhands/usage/llms/groq) | +| OpenRouter | [/openhands/usage/llms/openrouter](/openhands/usage/llms/openrouter) | +| Moonshot | [/openhands/usage/llms/moonshot](/openhands/usage/llms/moonshot) | +| LiteLLM proxy | [/openhands/usage/llms/litellm-proxy](/openhands/usage/llms/litellm-proxy) | +| Local LLMs (Ollama, SGLang, vLLM, LM Studio) | [/openhands/usage/llms/local-llms](/openhands/usage/llms/local-llms) | +| Custom LLM configurations | [/openhands/usage/llms/custom-llm-configs](/openhands/usage/llms/custom-llm-configs) | + +When you follow any of those guides while building with the SDK, create an +`LLM` object using the documented parameters (for example, API keys, base URLs, +or custom headers) and pass it into your agent or registry. The OpenHands UI +surfacing is simply a convenience layer on top of the same configuration model. From e35a0f3365c87e9ac2ec8bb4a06b145f947666b5 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:02:45 +0200 Subject: [PATCH 02/13] Tweak env wording in LLM docs --- openhands/usage/llms/llms.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/openhands/usage/llms/llms.mdx b/openhands/usage/llms/llms.mdx index 8f027e6e..ea09443f 100644 --- a/openhands/usage/llms/llms.mdx +++ b/openhands/usage/llms/llms.mdx @@ -69,7 +69,7 @@ The following can be set in the OpenHands UI through the Settings. 
Each option i - `Base URL` (through `Advanced` settings) There are some settings that may be necessary for certain providers that cannot be set directly through the UI. Set them -as environment variables (or add them to your `config.toml`) so the SDK can ingest them on startup: +as environment variables (or add them to your `config.toml`) so the SDK picks them up during startup: - `LLM_API_VERSION` - `LLM_EMBEDDING_MODEL` From 0bb496058e1882d705b8eeb9a80b9aea2814747f Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:04:43 +0200 Subject: [PATCH 03/13] Remove extra env note --- sdk/llms/configuration.mdx | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/sdk/llms/configuration.mdx b/sdk/llms/configuration.mdx index f9831f7e..586839fb 100644 --- a/sdk/llms/configuration.mdx +++ b/sdk/llms/configuration.mdx @@ -29,8 +29,7 @@ llm = LLM.load_from_env() ``` The loader automatically casts values into integers, floats, booleans, JSON, or -`SecretStr` where appropriate. Unknown variables are ignored, so it is safe to -store extra settings alongside LLM keys in `.env` files or secret managers. +`SecretStr` where appropriate. ## JSON configuration From 2762a92b70b7a14b46b01868f54eb32a3c8d3f92 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:06:03 +0200 Subject: [PATCH 04/13] Clarify retry behavior wording --- sdk/llms/configuration.mdx | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/sdk/llms/configuration.mdx b/sdk/llms/configuration.mdx index 586839fb..d7b5fd20 100644 --- a/sdk/llms/configuration.mdx +++ b/sdk/llms/configuration.mdx @@ -56,8 +56,8 @@ human review of committed configuration. ## Commonly tuned parameters - **Latency & retry controls**: `timeout`, `num_retries`, `retry_min_wait`, - `retry_max_wait`, and `retry_multiplier` govern LiteLLM retry behavior across - providers. + `retry_max_wait`, and `retry_multiplier` govern the SDK's LLM retry behavior + across providers. 
- **Prompt shaping**: `temperature`, `top_p`, `top_k`, `reasoning_effort`, and `extended_thinking_budget` adjust sampling characteristics and Anthropic reasoning budgets. From 5e2221604beceb94f03eb405ebc05b62753c324f Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:09:52 +0200 Subject: [PATCH 05/13] Drop unused provider knobs note --- sdk/llms/configuration.mdx | 3 --- 1 file changed, 3 deletions(-) diff --git a/sdk/llms/configuration.mdx b/sdk/llms/configuration.mdx index d7b5fd20..639296c1 100644 --- a/sdk/llms/configuration.mdx +++ b/sdk/llms/configuration.mdx @@ -63,9 +63,6 @@ human review of committed configuration. reasoning budgets. - **Cost reporting**: `input_cost_per_token` and `output_cost_per_token` flow into SDK telemetry so downstream interfaces can display usage estimates. -- **Provider specific knobs**: fields such as `api_version` (Azure), - `ollama_base_url`, `custom_llm_provider`, and `openrouter_site_url` allow - compatibility with LiteLLM's provider adapters without custom glue code. Refer to the docstring within [`openhands.sdk.llm.LLM`](https://github.com/All-Hands-AI/agent-sdk/blob/main/openhands-sdk/openhands/sdk/llm/llm.py) From 26a498002256c1700e7391f63d5f0d38b1450494 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:14:18 +0200 Subject: [PATCH 06/13] Clarify UI persistence wording --- sdk/llms/index.mdx | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/sdk/llms/index.mdx b/sdk/llms/index.mdx index 9feb6da4..7606034d 100644 --- a/sdk/llms/index.mdx +++ b/sdk/llms/index.mdx @@ -98,10 +98,10 @@ reuse the same configuration objects, making retries and telemetry consistent. ## Relationship with OpenHands interfaces -The OpenHands repository (UI/CLI) now consumes these SDK APIs. When you save LLM -settings in the UI they are stored using the same environment variable schema -and serialized into `LLM` objects before each run. 
That means any customization -documented here automatically applies to the OpenHands products. +The OpenHands repository (UI/CLI) now consumes these SDK APIs. When you adjust +LLM settings in the interfaces they are persisted and reloaded into SDK `LLM` +objects before each run, so any customization documented here carries over to +the product experiences. Provider-specific guidance—pricing summaries, required parameters, or proxy setups—remains valid and lives alongside the existing OpenHands documentation. From 8aec043d591c5cce660996600f5b41aa62f69cdd Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:19:30 +0200 Subject: [PATCH 07/13] Adjust wording to avoid product term --- sdk/llms/index.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/sdk/llms/index.mdx b/sdk/llms/index.mdx index 7606034d..fb71c5b2 100644 --- a/sdk/llms/index.mdx +++ b/sdk/llms/index.mdx @@ -101,7 +101,7 @@ reuse the same configuration objects, making retries and telemetry consistent. The OpenHands repository (UI/CLI) now consumes these SDK APIs. When you adjust LLM settings in the interfaces they are persisted and reloaded into SDK `LLM` objects before each run, so any customization documented here carries over to -the product experiences. +those application experiences. Provider-specific guidance—pricing summaries, required parameters, or proxy setups—remains valid and lives alongside the existing OpenHands documentation. 
From cf6dda4439b468f141a708b3646d65115b943074 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:24:20 +0200 Subject: [PATCH 08/13] Rename LLM nav titles --- sdk/llms/configuration.mdx | 2 +- sdk/llms/index.mdx | 2 +- sdk/llms/providers.mdx | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/sdk/llms/configuration.mdx b/sdk/llms/configuration.mdx index 639296c1..74027f49 100644 --- a/sdk/llms/configuration.mdx +++ b/sdk/llms/configuration.mdx @@ -1,5 +1,5 @@ --- -title: LLM configuration details +title: Configuration description: Configure LLM objects in the Agent SDK and reuse them across OpenHands interfaces. --- diff --git a/sdk/llms/index.mdx b/sdk/llms/index.mdx index fb71c5b2..d2df4387 100644 --- a/sdk/llms/index.mdx +++ b/sdk/llms/index.mdx @@ -1,5 +1,5 @@ --- -title: Language Models +title: Overview description: Understand how the Agent SDK manages language models and how OpenHands interfaces build on top of it. --- diff --git a/sdk/llms/providers.mdx b/sdk/llms/providers.mdx index 6e38bb3b..574aad79 100644 --- a/sdk/llms/providers.mdx +++ b/sdk/llms/providers.mdx @@ -1,5 +1,5 @@ --- -title: LLM provider guides +title: Provider Guides description: Provider-specific notes for configuring Agent SDK and OpenHands interfaces. --- From a0850287873f528721a6b97ec4de881fe721d74d Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:25:34 +0200 Subject: [PATCH 09/13] Refine overview description --- sdk/llms/index.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/sdk/llms/index.mdx b/sdk/llms/index.mdx index d2df4387..25ba4a67 100644 --- a/sdk/llms/index.mdx +++ b/sdk/llms/index.mdx @@ -1,6 +1,6 @@ --- title: Overview -description: Understand how the Agent SDK manages language models and how OpenHands interfaces build on top of it. +description: Architecture of Agent SDK language models and their relationship with OpenHands interfaces. 
--- The Agent SDK owns all language model (LLM) orchestration in OpenHands **v1**. The From 25d911f73d8bc86ad4ffe23f06bcd0df8491c184 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:27:56 +0200 Subject: [PATCH 10/13] Clarify configuration intro --- sdk/llms/configuration.mdx | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/sdk/llms/configuration.mdx b/sdk/llms/configuration.mdx index 74027f49..fbf852cd 100644 --- a/sdk/llms/configuration.mdx +++ b/sdk/llms/configuration.mdx @@ -3,9 +3,9 @@ title: Configuration description: Configure LLM objects in the Agent SDK and reuse them across OpenHands interfaces. --- -This page expands on how to configure `openhands.sdk.llm.LLM` instances. The -same options back every interface built on top of the Agent SDK—including the -OpenHands UI—so documenting them here keeps the configuration story consistent. +Configure `openhands.sdk.llm.LLM` instances with the parameters described in this +section. These fields apply identically whether you launch agents through your +own code or via the OpenHands interfaces built on the SDK. ## Environment variable layout From d128f3955f421ab2d82a9f830aea54e6b9225354 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:29:16 +0200 Subject: [PATCH 11/13] Tune overview intro tone --- sdk/llms/index.mdx | 11 +++++------ 1 file changed, 5 insertions(+), 6 deletions(-) diff --git a/sdk/llms/index.mdx b/sdk/llms/index.mdx index 25ba4a67..9ae2a01b 100644 --- a/sdk/llms/index.mdx +++ b/sdk/llms/index.mdx @@ -1,13 +1,12 @@ --- title: Overview -description: Architecture of Agent SDK language models and their relationship with OpenHands interfaces. +description: How Agent SDK language models work and how OpenHands interfaces connect to them. --- -The Agent SDK owns all language model (LLM) orchestration in OpenHands **v1**. The -standalone OpenHands repository now focuses on user interfaces (web app, CLI, -cloud) that talk to the SDK. 
This page explains how LLMs fit into that new -architecture and shows the core APIs you will use when building your own -interfaces or automations. +Agent SDK handles all language model (LLM) orchestration in OpenHands **v1**. The +OpenHands repository now provides the interfaces—web app, CLI, and cloud—that +call into the SDK. Use this overview to understand the architecture and the core +APIs available when you build your own integrations or automations. ## Architecture overview From 445a2f03bb1126cc0a10781c6912fdd247106fd3 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 12:32:13 +0200 Subject: [PATCH 12/13] Redraw LLM client flow diagram --- sdk/llms/index.mdx | 14 ++++++++++---- 1 file changed, 10 insertions(+), 4 deletions(-) diff --git a/sdk/llms/index.mdx b/sdk/llms/index.mdx index 9ae2a01b..cdbbabd7 100644 --- a/sdk/llms/index.mdx +++ b/sdk/llms/index.mdx @@ -19,14 +19,20 @@ APIs available when you build your own integrations or automations. ```mermaid graph LR - User[User or automation] - UI[OpenHands UI / CLI] + subgraph Interfaces + UI[OpenHands UI] + CLI[OpenHands CLI] + Automation[Automations & workflows] + Custom[Your client] + end + SDK[Agent SDK] LLMService[(External LLM providers)] - User --> UI UI --> SDK - SDK -->|LLM instances| SDK + CLI --> SDK + Automation --> SDK + Custom --> SDK SDK --> LLMService ``` From 678a3fad10fb3cf0b42ec4169bf6f8f9d8cc7773 Mon Sep 17 00:00:00 2001 From: Engel Nyst Date: Mon, 20 Oct 2025 13:12:28 +0200 Subject: [PATCH 13/13] Remove multi-profile docs --- sdk/llms/configuration.mdx | 21 --------------------- sdk/llms/index.mdx | 20 -------------------- 2 files changed, 41 deletions(-) diff --git a/sdk/llms/configuration.mdx b/sdk/llms/configuration.mdx index fbf852cd..f6b3bf55 100644 --- a/sdk/llms/configuration.mdx +++ b/sdk/llms/configuration.mdx @@ -69,24 +69,3 @@ Refer to the docstring within for the full schema. 
All fields can be set programmatically or via environment variables using the naming rule `field -> LLM_FIELD`. -## Multiple profiles - -Many applications require separate model profiles (for example, a premium coding -model and a cheaper summarizer). Pair environment prefixes with registry -service IDs: - -```bash -export CODER_LLM_MODEL="anthropic/claude-sonnet-4.1" -export CODER_LLM_API_KEY="sk-ant" -export SUMMARIZER_LLM_MODEL="openai/gpt-4o-mini" -export SUMMARIZER_LLM_API_KEY="sk-openai" -``` - -```python -coder = LLM.load_from_env(prefix="CODER_LLM_").model_copy(update={"service_id": "coder"}) -summarizer = LLM.load_from_env(prefix="SUMMARIZER_LLM_").model_copy(update={"service_id": "summarizer"}) -``` - -These instances can then be registered with `LLMRegistry` and shared across -agents or microservices. The OpenHands UI uses the same pattern when end users -create named profiles. diff --git a/sdk/llms/index.mdx b/sdk/llms/index.mdx index cdbbabd7..23de5cfc 100644 --- a/sdk/llms/index.mdx +++ b/sdk/llms/index.mdx @@ -81,26 +81,6 @@ without additional glue code. For JSON based workflows the SDK also exposes Learn more about configuration options in [LLM configuration details](./configuration). -## Managing multiple LLMs - -When an application needs several LLM profiles (for example, a high-quality -model for coding and a cheaper summarizer) use the -[`LLMRegistry`](https://github.com/All-Hands-AI/agent-sdk/blob/main/openhands-sdk/openhands/sdk/llm/llm_registry.py): - -```python -from openhands.sdk.llm import LLM, LLMRegistry - -registry = LLMRegistry() -registry.add(LLM(model="anthropic/claude-sonnet-4.1", service_id="primary")) -registry.add(LLM(model="openai/gpt-4o-mini", service_id="summaries")) - -code_llm = registry.get("primary") -summarizer = registry.get("summaries") -``` - -Registries ensure that agents launched by different parts of your application -reuse the same configuration objects, making retries and telemetry consistent. 
- ## Relationship with OpenHands interfaces The OpenHands repository (UI/CLI) now consumes these SDK APIs. When you adjust