From 1c56d57686bae473668ff63c3f1a185ea7efa087 Mon Sep 17 00:00:00 2001 From: Liam Thompson <32779855+leemthompo@users.noreply.github.com> Date: Mon, 6 Oct 2025 15:14:06 +0200 Subject: [PATCH 1/9] [Agent Builder] Add page about models --- solutions/search/agent-builder/get-started.md | 6 + .../agent-builder/limitations-known-issues.md | 4 +- solutions/search/agent-builder/models.md | 154 ++++++++++++++++++ solutions/search/elastic-agent-builder.md | 7 + solutions/toc.yml | 1 + 5 files changed, 170 insertions(+), 2 deletions(-) create mode 100644 solutions/search/agent-builder/models.md diff --git a/solutions/search/agent-builder/get-started.md b/solutions/search/agent-builder/get-started.md index 56272f2636..b26cf80bf5 100644 --- a/solutions/search/agent-builder/get-started.md +++ b/solutions/search/agent-builder/get-started.md @@ -75,6 +75,12 @@ Learn more in [Agent Chat](chat.md). :::: +::::{step} Configure model (optional) + +By default, {{agent-builder}} uses the Elastic Managed LLM. To use a different model, refer to [model selection and configuration](models.md). + +:::: + ::::{step} Begin building agents and tools Once you've tested the default **Elastic AI Agent** with the [built-in Elastic tools](tools.md), you can begin [building your own agents](agent-builder-agents.md#create-a-new-agent) with custom instructions and [creating your own tools](tools.md#create-custom-tools) to assign them. diff --git a/solutions/search/agent-builder/limitations-known-issues.md b/solutions/search/agent-builder/limitations-known-issues.md index 0d509ea9bf..357b5ef2a7 100644 --- a/solutions/search/agent-builder/limitations-known-issues.md +++ b/solutions/search/agent-builder/limitations-known-issues.md @@ -22,9 +22,9 @@ While in private technical preview, {{agent-builder}} is not enabled by default. 
### Model selection -Initially, {{agent-builder}} only supports working with the [Elastic Managed LLM](kibana://reference/connectors-kibana/elastic-managed-llm.md) running on the [Elastic Inference Service](/explore-analyze/elastic-inference/eis.md) {applies_to}`serverless: preview` {applies_to}`ess: preview 9.2`. +Initially, {{agent-builder}} defaults to working with the [Elastic Managed LLM](kibana://reference/connectors-kibana/elastic-managed-llm.md) running on the [Elastic Inference Service](/explore-analyze/elastic-inference/eis.md) {applies_to}`serverless: preview` {applies_to}`ess: preview 9.2`. -Learn about [pricing](https://www.elastic.co/pricing/serverless-search) for the Elastic Managed LLM. +Learn more on the [models page](models.md). ## Known issues diff --git a/solutions/search/agent-builder/models.md b/solutions/search/agent-builder/models.md new file mode 100644 index 0000000000..b794284c7f --- /dev/null +++ b/solutions/search/agent-builder/models.md @@ -0,0 +1,154 @@ +--- +navigation_title: "Use different models" +applies_to: + stack: preview 9.2 + serverless: + elasticsearch: preview +--- + +:::{warning} +These pages are currently hidden from the docs TOC and have `noindexed` meta headers. + +**Go to the docs [landing page](/solutions/search/elastic-agent-builder.md).** +::: + +# Using different models in {{agent-builder}} + +{{agent-builder}} uses large language models (LLMs) to power agent reasoning and decision-making. By default, agents use the Elastic Managed LLM, but you can configure other models through Kibana connectors. + +## Default model configuration + +By default, {{agent-builder}} uses the Elastic Managed LLM connector running on the [Elastic Inference Service](/explore-analyze/elastic-inference/eis.md) {applies_to}`serverless: preview` {applies_to}`ess: preview 9.2`. + +This managed service requires zero setup and no additional API key management. 
+ +Learn more about the [Elastic Managed LLM connector](kibana://reference/connectors-kibana/elastic-managed-llm) and [pricing](https://www.elastic.co/pricing). + +## Change the default model + +By default, {{agent-builder}} uses the Elastic Managed LLM. To use a different model, you'll need a configured connector and then set it as the default. + +### Use a pre-configured connector + +1. Search for **GenAI Settings** in the global search field +2. Select your preferred connector from the **Default AI Connector** dropdown +3. Save your changes + +### Create a new connector in the UI + +1. Find connectors under **Alerts and Insights / Connectors** in the [global search bar](/explore-analyze/find-and-organize/find-apps-and-objects.md) +2. Select **Create Connector** and select your model provider +3. Configure the connector with your API credentials and preferred model +4. Search for **GenAI Settings** in the global search field +5. Select your new connector from the **Default AI Connector** dropdown +6. Save your changes + +For detailed instructions on creating connectors, refer to [Connectors](https://www.elastic.co/docs/deploy-manage/manage-connectors). + +Learn more about [preconfigured connectors](https://www.elastic.co/docs/reference/kibana/connectors-kibana/pre-configured-connectors). + +## Connectors API + +For programmatic access to connector management, refer to the [Connectors API documentation]({{kib-serverless-apis}}group/endpoint-connectors). + +## Recommended models + +{{agent-builder}} requires models with strong reasoning and tool-calling capabilities. State-of-the-art models perform significantly better than smaller or older models. 
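To make "tool calling" concrete, the following sketch shows the OpenAI-style shapes that chat-completion connectors generally exchange: a tool definition advertised to the model, and the structured tool call a capable model is expected to return. This is an illustration of the wire format only; the `search_index` tool and every value in it are hypothetical, not Elastic code.

```python
import json

# Hypothetical tool definition advertised to the model (OpenAI-style "tools" entry):
tools = [{
    "type": "function",
    "function": {
        "name": "search_index",
        "description": "Run a search against an index",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

# A capable model replies with a structured tool call whose "arguments"
# field is a JSON-encoded string matching the declared parameter schema:
assistant_message = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {
            "name": "search_index",
            "arguments": "{\"query\": \"error logs last 24h\"}",
        },
    }],
}

# The agent framework must be able to parse the arguments back into an object.
# Weaker models often emit malformed or missing tool calls at exactly this step.
call = assistant_message["tool_calls"][0]["function"]
args = json.loads(call["arguments"])
print(call["name"], args)
```

When a model cannot produce this structure reliably, the agent loop breaks down, which is why state-of-the-art models are recommended.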
+ +### Recommended model families + +The following models are known to work well with {{agent-builder}}: + +- **OpenAI**: GPT-4.1, GPT-4o +- **Anthropic**: Claude Sonnet 4.5, Claude Sonnet 4, Claude Sonnet 3.7 +- **Google**: Gemini 2.5 Pro + +### Why model quality matters + +Agent Builder relies on advanced LLM capabilities including: + +- **Function calling**: Models must accurately select appropriate tools and construct valid parameters from natural language requests +- **Multi-step reasoning**: Agents need to plan, execute, and adapt based on tool results across multiple iterations +- **Structured output**: Models must produce properly formatted responses that the agent framework can parse + +Smaller or less capable models may produce errors like: + +```console-response +Error: Invalid function call syntax +``` + +```console-response +Error executing agent: No tool calls found in the response. +``` + +While any chat-completion-compatible connector can technically be configured, we strongly recommend using state-of-the-art models for reliable agent performance. + +:::{note} +GPT-4o-mini and similar smaller models are not recommended for {{agent-builder}} as they lack the necessary capabilities for reliable agent workflows. +::: + +## Connect a local LLM + +You can connect a locally hosted LLM to Elastic using the OpenAI connector. This requires your local LLM to be compatible with the OpenAI API format. 
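A quick way to reason about that compatibility is the chat-completions request/response shape itself. The sketch below is illustrative only: the `localhost:1234` URL and `local-model` name mirror the example values used in the configuration steps that follow, and the sample response shows the minimal layout a compatible server should return.

```python
import json

# Example values matching the connector configuration steps below
# (adjust host, port, and model name to your local server):
CHAT_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Minimal OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def extract_reply(response: dict) -> str:
    """Pull the assistant text out of an OpenAI-style response."""
    return response["choices"][0]["message"]["content"]

# Shape of the response a compatible server is expected to return:
sample_response = {
    "choices": [{"message": {"role": "assistant", "content": "OK"}}],
}

body = build_chat_request("local-model", "Reply with the single word OK")
print(json.dumps(body))
print(extract_reply(sample_response))
```

Posting `body` to `CHAT_URL` (with `curl` or any HTTP client) and successfully extracting a reply is a reasonable smoke test before wiring up the connector.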
+ +### Requirements + +**Model selection:** +- Must include "instruct" in the model name to work with Elastic +- Download from trusted sources only +- Consider parameter size, context window, and quantization format for your needs + +**Integration setup:** +- For Elastic Cloud: Requires a reverse proxy (such as Nginx) to authenticate requests using a bearer token and forward them to your local LLM endpoint +- For self-managed deployments on the same host as your LLM: Can connect directly without a reverse proxy +- Your local LLM server must use the OpenAI SDK for API compatibility + +### Configure the connector + +:::::{stepper} +::::{step} Set up your local LLM server + +Ensure your local LLM is running and accessible via an OpenAI-compatible API endpoint. + +:::: + +::::{step} Create the OpenAI connector + +1. Log in to your Elastic deployment +2. Find connectors under **Alerts and Insights / Connectors** in the [global search bar](/explore-analyze/find-and-organize/find-apps-and-objects.md) +3. Select **Create Connector** and select **OpenAI** +4. Name your connector to help track the model version you're using +5. Under **Select an OpenAI provider**, select **Other (OpenAI Compatible Service)** + +:::: + +::::{step} Configure connection details + +1. Under **URL**, enter: + - For Elastic Cloud: Your reverse proxy domain + `/v1/chat/completions` + - For same-host self-managed: `http://localhost:1234/v1/chat/completions` (adjust port as needed) +2. Under **Default model**, enter `local-model` +3. Under **API key**, enter: + - For Elastic Cloud: Your reverse proxy authentication token + - For same-host self-managed: Your LLM server's API key +4. Select **Save** + +:::: + +::::{step} Set as default (optional) + +To use your local model as the default for {{agent-builder}}: + +1. Search for **GenAI Settings** in the global search field +2. Select your local LLM connector from the **Default AI Connector** dropdown +3. 
Save your changes + +:::: + +::::: + +## Related pages + +- [Limitations and known issues](limitations-known-issues.md): Current limitations around model selection +- [Get started](get-started.md): Initial setup and configuration +- [Connectors](/deploy-manage/manage-connectors): Detailed connector configuration guide \ No newline at end of file diff --git a/solutions/search/elastic-agent-builder.md b/solutions/search/elastic-agent-builder.md index 4a3534cd71..dad2e18e2d 100644 --- a/solutions/search/elastic-agent-builder.md +++ b/solutions/search/elastic-agent-builder.md @@ -54,6 +54,13 @@ To get started you need an Elastic deployment and you must enable the feature. [**Get started with {{agent-builder}}**](agent-builder/get-started.md) +## Model selection + +By default, agents use the Elastic Managed LLM, but you can configure other model providers using connectors, including local LLMs deployed on your infrastructure. + +[**Learn more about model selection**](agent-builder/models.md) + + ## Programmatic interfaces {{agent-builder}} provides APIs and LLM integration options for programmatic access and automation. 
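As one example of that programmatic access, connectors can be created through the Kibana Connectors API rather than the UI. The following sketch assembles such a request for an OpenAI-compatible model. Treat the details as assumptions to verify against the Connectors API reference for your version: the `/api/actions/connector` path, the `.gen-ai` type ID, and the config and secrets field names are not taken from this document.

```python
import json

KIBANA_URL = "https://my-kibana.example.com"  # placeholder deployment URL

def build_create_connector_request(name: str, api_url: str, model: str, api_key: str):
    """Assemble URL, headers, and body for a 'create connector' call.

    Field names are assumptions based on the Kibana OpenAI connector;
    verify them against the Connectors API reference for your version.
    """
    url = f"{KIBANA_URL}/api/actions/connector"
    headers = {
        "Content-Type": "application/json",
        "kbn-xsrf": "true",  # Kibana rejects write requests without this header
    }
    req_body = {
        "name": name,
        "connector_type_id": ".gen-ai",  # OpenAI-compatible connector type
        "config": {"apiProvider": "OpenAI", "apiUrl": api_url, "defaultModel": model},
        "secrets": {"apiKey": api_key},
    }
    return url, headers, req_body

url, headers, body = build_create_connector_request(
    name="my-local-llm",
    api_url="http://localhost:1234/v1/chat/completions",
    model="local-model",
    api_key="changeme",
)
print(url)
print(json.dumps(body, indent=2))
```

Sending this body as a `POST` (with your own authentication) would register the connector, which can then be selected as the default in **GenAI Settings** as described above.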
diff --git a/solutions/toc.yml b/solutions/toc.yml index 7c3c9ad6af..8eeb53cbf3 100644 --- a/solutions/toc.yml +++ b/solutions/toc.yml @@ -46,6 +46,7 @@ toc: - file: search/using-openai-compatible-models.md - hidden: search/elastic-agent-builder.md - hidden: search/agent-builder/get-started.md + - hidden: search/agent-builder/models.md - hidden: search/agent-builder/chat.md - hidden: search/agent-builder/agent-builder-agents.md - hidden: search/agent-builder/tools.md From b3b02c7a21fd56d3ff55dcdc2ac76c929f4cd7cb Mon Sep 17 00:00:00 2001 From: Liam Thompson <32779855+leemthompo@users.noreply.github.com> Date: Mon, 6 Oct 2025 15:25:01 +0200 Subject: [PATCH 2/9] fix links --- solutions/search/agent-builder/models.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/solutions/search/agent-builder/models.md b/solutions/search/agent-builder/models.md index b794284c7f..3468d3092c 100644 --- a/solutions/search/agent-builder/models.md +++ b/solutions/search/agent-builder/models.md @@ -22,7 +22,7 @@ By default, {{agent-builder}} uses the Elastic Managed LLM connector running on This managed service requires zero setup and no additional API key management. -Learn more about the [Elastic Managed LLM connector](kibana://reference/connectors-kibana/elastic-managed-llm) and [pricing](https://www.elastic.co/pricing). +Learn more about the [Elastic Managed LLM connector](kibana://reference/connectors-kibana/elastic-managed-llm.md) and [pricing](https://www.elastic.co/pricing). 
## Change the default model @@ -151,4 +151,4 @@ To use your local model as the default for {{agent-builder}}: - [Limitations and known issues](limitations-known-issues.md): Current limitations around model selection - [Get started](get-started.md): Initial setup and configuration -- [Connectors](/deploy-manage/manage-connectors): Detailed connector configuration guide \ No newline at end of file +- [Connectors](/deploy-manage/manage-connectors.md): Detailed connector configuration guide \ No newline at end of file From 7369a3aa5d5b4a5b3fd587f55ccee779503633f9 Mon Sep 17 00:00:00 2001 From: Liam Thompson <32779855+leemthompo@users.noreply.github.com> Date: Mon, 6 Oct 2025 16:59:26 +0200 Subject: [PATCH 3/9] clarify instruct variant guidance --- solutions/search/agent-builder/models.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/solutions/search/agent-builder/models.md b/solutions/search/agent-builder/models.md index 3468d3092c..8154629800 100644 --- a/solutions/search/agent-builder/models.md +++ b/solutions/search/agent-builder/models.md @@ -94,9 +94,9 @@ You can connect a locally hosted LLM to Elastic using the OpenAI connector. 
This ### Requirements **Model selection:** -- Must include "instruct" in the model name to work with Elastic - Download from trusted sources only - Consider parameter size, context window, and quantization format for your needs +- Prefer "instruct" variants over "base" or "chat" versions when multiple variants are available, as instruct models are typically better tuned for following instructions **Integration setup:** - For Elastic Cloud: Requires a reverse proxy (such as Nginx) to authenticate requests using a bearer token and forward them to your local LLM endpoint From 1353c3eefadca58231f382d760084be8c1ab487e Mon Sep 17 00:00:00 2001 From: Liam Thompson <32779855+leemthompo@users.noreply.github.com> Date: Wed, 15 Oct 2025 15:27:35 +0200 Subject: [PATCH 4/9] Factor out local LLM instructions --- solutions/search/agent-builder/models.md | 58 +----------------------- 1 file changed, 2 insertions(+), 56 deletions(-) diff --git a/solutions/search/agent-builder/models.md b/solutions/search/agent-builder/models.md index 194f27b25b..f2c950dee7 100644 --- a/solutions/search/agent-builder/models.md +++ b/solutions/search/agent-builder/models.md @@ -26,7 +26,7 @@ Learn more about the [Elastic Managed LLM connector](kibana://reference/connecto ## Change the default model -By default, {{agent-builder}} uses the Elastic Managed LLM. To use a different model, you'll need a configured connector and then set it as the default. +By default, {{agent-builder}} uses the Elastic Managed LLM. To use a different model, select a configured connector and set it as the default. ### Use a pre-configured connector @@ -91,61 +91,7 @@ GPT-4o-mini and similar smaller models are not recommended for {{agent-builder}} You can connect a locally hosted LLM to Elastic using the OpenAI connector. This requires your local LLM to be compatible with the OpenAI API format. 
-### Requirements - -**Model selection:** -- Download from trusted sources only -- Consider parameter size, context window, and quantization format for your needs -- Prefer "instruct" variants over "base" or "chat" versions when multiple variants are available, as instruct models are typically better tuned for following instructions - -**Integration setup:** -- For Elastic Cloud: Requires a reverse proxy (such as Nginx) to authenticate requests using a bearer token and forward them to your local LLM endpoint -- For self-managed deployments on the same host as your LLM: Can connect directly without a reverse proxy -- Your local LLM server must use the OpenAI SDK for API compatibility - -### Configure the connector - -:::::{stepper} -::::{step} Set up your local LLM server - -Ensure your local LLM is running and accessible via an OpenAI-compatible API endpoint. - -:::: - -::::{step} Create the OpenAI connector - -1. Log in to your Elastic deployment -2. Find connectors under **Alerts and Insights / Connectors** in the [global search bar](/explore-analyze/find-and-organize/find-apps-and-objects.md) -3. Select **Create Connector** and select **OpenAI** -4. Name your connector to help track the model version you're using -5. Under **Select an OpenAI provider**, select **Other (OpenAI Compatible Service)** - -:::: - -::::{step} Configure connection details - -1. Under **URL**, enter: - - For Elastic Cloud: Your reverse proxy domain + `/v1/chat/completions` - - For same-host self-managed: `http://localhost:1234/v1/chat/completions` (adjust port as needed) -2. Under **Default model**, enter `local-model` -3. Under **API key**, enter: - - For Elastic Cloud: Your reverse proxy authentication token - - For same-host self-managed: Your LLM server's API key -4. Select **Save** - -:::: - -::::{step} Set as default (optional) - -To use your local model as the default for {{agent-builder}}: - -1. Search for **GenAI Settings** in the global search field -2. 
Select your local LLM connector from the **Default AI Connector** dropdown -3. Save your changes - -:::: - -::::: +Refer to the [OpenAI connector documentation](kibana://reference/connectors-kibana/openai.md) for detailed setup instructions. ## Related pages From a0d4ed4e5c0a759de10bb57828ec6a2ebe067ec6 Mon Sep 17 00:00:00 2001 From: Liam Thompson Date: Thu, 16 Oct 2025 11:59:36 +0200 Subject: [PATCH 5/9] Fix OpenAI connector link --- solutions/search/agent-builder/models.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/solutions/search/agent-builder/models.md b/solutions/search/agent-builder/models.md index f2c950dee7..06e45d8baa 100644 --- a/solutions/search/agent-builder/models.md +++ b/solutions/search/agent-builder/models.md @@ -91,7 +91,7 @@ GPT-4o-mini and similar smaller models are not recommended for {{agent-builder}} You can connect a locally hosted LLM to Elastic using the OpenAI connector. This requires your local LLM to be compatible with the OpenAI API format. -Refer to the [OpenAI connector documentation](kibana://reference/connectors-kibana/openai.md) for detailed setup instructions. +Refer to the [OpenAI connector documentation](kibana://reference/connectors-kibana/openai-action-type.md) for detailed setup instructions. ## Related pages From cf6c1e6f824c58a00f04079000c7ad14a6b657f5 Mon Sep 17 00:00:00 2001 From: Liam Thompson <32779855+leemthompo@users.noreply.github.com> Date: Mon, 20 Oct 2025 12:05:08 +0200 Subject: [PATCH 6/9] clarify dropdown --- solutions/search/agent-builder/models.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/solutions/search/agent-builder/models.md b/solutions/search/agent-builder/models.md index 06e45d8baa..736fd808ac 100644 --- a/solutions/search/agent-builder/models.md +++ b/solutions/search/agent-builder/models.md @@ -40,7 +40,7 @@ By default, {{agent-builder}} uses the Elastic Managed LLM. To use a different m 2. Select **Create Connector** and select your model provider 3. 
Configure the connector with your API credentials and preferred model 4. Search for **GenAI Settings** in the global search field -5. Select your new connector from the **Default AI Connector** dropdown +5. Select your new connector from the **Default AI Connector** dropdown under **Custom connectors** 6. Save your changes For detailed instructions on creating connectors, refer to [Connectors](https://www.elastic.co/docs/deploy-manage/manage-connectors). From f2b4f657ce9a82154e868c27fe10e720bcf22fcb Mon Sep 17 00:00:00 2001 From: Liam Thompson <32779855+leemthompo@users.noreply.github.com> Date: Mon, 20 Oct 2025 12:29:41 +0200 Subject: [PATCH 7/9] move local LLM section --- solutions/search/agent-builder/models.md | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/solutions/search/agent-builder/models.md b/solutions/search/agent-builder/models.md index 736fd808ac..06d386384a 100644 --- a/solutions/search/agent-builder/models.md +++ b/solutions/search/agent-builder/models.md @@ -47,6 +47,12 @@ For detailed instructions on creating connectors, refer to [Connectors](https:// Learn more about [preconfigured connectors](https://www.elastic.co/docs/reference/kibana/connectors-kibana/pre-configured-connectors). +#### Connect a local LLM + +You can connect a locally hosted LLM to Elastic using the OpenAI connector. This requires your local LLM to be compatible with the OpenAI API format. + +Refer to the [OpenAI connector documentation](kibana://reference/connectors-kibana/openai-action-type.md) for detailed setup instructions. + ## Connectors API For programmatic access to connector management, refer to the [Connectors API documentation]({{kib-serverless-apis}}group/endpoint-connectors). @@ -87,12 +93,6 @@ While any chat-completion-compatible connector can technically be configured, we GPT-4o-mini and similar smaller models are not recommended for {{agent-builder}} as they lack the necessary capabilities for reliable agent workflows. 
::: -## Connect a local LLM - -You can connect a locally hosted LLM to Elastic using the OpenAI connector. This requires your local LLM to be compatible with the OpenAI API format. - -Refer to the [OpenAI connector documentation](kibana://reference/connectors-kibana/openai-action-type.md) for detailed setup instructions. - ## Related pages - [Limitations and known issues](limitations-known-issues.md): Current limitations around model selection From f1466ef2438e42d242401b211ad2e288d706bcd5 Mon Sep 17 00:00:00 2001 From: Liam Thompson Date: Mon, 20 Oct 2025 15:20:47 +0200 Subject: [PATCH 8/9] Apply suggestions --- solutions/search/agent-builder/models.md | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/solutions/search/agent-builder/models.md b/solutions/search/agent-builder/models.md index 06d386384a..ff31cc3b6a 100644 --- a/solutions/search/agent-builder/models.md +++ b/solutions/search/agent-builder/models.md @@ -61,8 +61,6 @@ For programmatic access to connector management, refer to the [Connectors API do {{agent-builder}} requires models with strong reasoning and tool-calling capabilities. State-of-the-art models perform significantly better than smaller or older models. -### Recommended model families - The following models are known to work well with {{agent-builder}}: - **OpenAI**: GPT-4.1, GPT-4o @@ -93,7 +91,7 @@ While any chat-completion-compatible connector can technically be configured, we GPT-4o-mini and similar smaller models are not recommended for {{agent-builder}} as they lack the necessary capabilities for reliable agent workflows. 
::: -## Related pages +## Related resources - [Limitations and known issues](limitations-known-issues.md): Current limitations around model selection - [Get started](get-started.md): Initial setup and configuration From a370f1812c10c5fc25ced10f905cbda167f61575 Mon Sep 17 00:00:00 2001 From: Liam Thompson Date: Mon, 20 Oct 2025 15:21:44 +0200 Subject: [PATCH 9/9] Update limitations-known-issues.md for clarity Removed model selection section and added a reference to the models page for more information. --- .../search/agent-builder/limitations-known-issues.md | 10 ++-------- 1 file changed, 2 insertions(+), 8 deletions(-) diff --git a/solutions/search/agent-builder/limitations-known-issues.md b/solutions/search/agent-builder/limitations-known-issues.md index 6b29016e1b..39ef117c0c 100644 --- a/solutions/search/agent-builder/limitations-known-issues.md +++ b/solutions/search/agent-builder/limitations-known-issues.md @@ -20,17 +20,11 @@ These pages are currently hidden from the docs TOC and have `noindexed` meta hea While in private technical preview, {{agent-builder}} is not enabled by default. Refer to [Get started](get-started.md#enable-agent-builder) for instructions. -### Model selection - -Initially, {{agent-builder}} defaults to working with the [Elastic Managed LLM](kibana://reference/connectors-kibana/elastic-managed-llm.md) running on the [Elastic Inference Service](/explore-analyze/elastic-inference/eis.md) {applies_to}`serverless: preview` {applies_to}`ess: preview 9.2`. - -Learn more on the [models page](models.md). - ## Known issues ### Incompatible LLMs -While Elastic offers LLM [connectors](kibana://reference/connectors-kibana.md) for many different vendors and models, not all LLMs are robust enough to be used with {{agent-builder}}. We recommend using the [Elastic Managed LLM](kibana://reference/connectors-kibana/elastic-managed-llm.md) (the default). 
+While Elastic offers LLM [connectors](kibana://reference/connectors-kibana.md) for many different vendors and models, not all LLMs are robust enough to be used with {{agent-builder}}. We recommend using the [Elastic Managed LLM](kibana://reference/connectors-kibana/elastic-managed-llm.md) (the default). Learn more in [](models.md). The following errors suggest your selected model may not be compatible with {{agent-builder}}: @@ -67,4 +61,4 @@ This results in parsing errors like this: ] ``` - \ No newline at end of file +