6 changes: 6 additions & 0 deletions solutions/search/agent-builder/get-started.md
@@ -88,6 +88,12 @@ Learn more in [Agent Chat](chat.md).

::::

::::{step} Configure model (optional)

By default, {{agent-builder}} uses the Elastic Managed LLM. To use a different model, refer to [model selection and configuration](models.md).

::::

::::{step} Begin building agents and tools

Once you've tested the default **Elastic AI Agent** with the [built-in Elastic tools](tools.md), you can begin [building your own agents](agent-builder-agents.md#create-a-new-agent) with custom instructions and [creating your own tools](tools.md#create-custom-tools) to assign to them.
10 changes: 2 additions & 8 deletions solutions/search/agent-builder/limitations-known-issues.md
@@ -20,17 +20,11 @@ These pages are currently hidden from the docs TOC and have `noindexed` meta hea

{{agent-builder}} must be enabled for non-serverless deployments {applies_to}`stack: preview 9.2`. Refer to [Get started](get-started.md#enable-agent-builder) for instructions.

### Model selection

Initially, {{agent-builder}} only supports working with the [Elastic Managed LLM](kibana://reference/connectors-kibana/elastic-managed-llm.md) running on the [Elastic Inference Service](/explore-analyze/elastic-inference/eis.md) {applies_to}`stack: preview 9.2`.

Learn about [pricing](https://www.elastic.co/pricing/serverless-search) for the Elastic Managed LLM.

## Known issues

### Incompatible LLMs

While Elastic offers LLM [connectors](kibana://reference/connectors-kibana.md) for many different vendors and models, not all LLMs are robust enough to be used with {{agent-builder}}. We recommend using the [Elastic Managed LLM](kibana://reference/connectors-kibana/elastic-managed-llm.md) (the default). Learn more in [](models.md).

The following errors suggest your selected model may not be compatible with {{agent-builder}}:

@@ -69,4 +63,4 @@ This results in parsing errors like this:
]
```



85 changes: 84 additions & 1 deletion solutions/search/agent-builder/models.md
@@ -12,4 +12,87 @@ These pages are currently hidden from the docs TOC and have `noindexed` meta hea
**Go to the docs [landing page](/solutions/search/elastic-agent-builder.md).**
:::

# Using different models in {{agent-builder}}

{{agent-builder}} uses large language models (LLMs) to power agent reasoning and decision-making. By default, agents use the Elastic Managed LLM, but you can configure other models through Kibana connectors.

## Default model configuration

By default, {{agent-builder}} uses the Elastic Managed LLM connector running on the [Elastic Inference Service](/explore-analyze/elastic-inference/eis.md) {applies_to}`serverless: preview` {applies_to}`ess: preview 9.2`.

This managed service requires zero setup and no additional API key management.

Learn more about the [Elastic Managed LLM connector](kibana://reference/connectors-kibana/elastic-managed-llm.md) and [pricing](https://www.elastic.co/pricing).

## Change the default model

To use a model other than the Elastic Managed LLM, select a configured connector and set it as the default.

### Use a pre-configured connector

1. Search for **GenAI Settings** in the global search field
2. Select your preferred connector from the **Default AI Connector** dropdown
3. Save your changes

### Create a new connector in the UI

1. Find connectors under **Alerts and Insights / Connectors** in the [global search bar](/explore-analyze/find-and-organize/find-apps-and-objects.md)
2. Select **Create Connector** and select your model provider
3. Configure the connector with your API credentials and preferred model
4. Search for **GenAI Settings** in the global search field
5. Select your new connector from the **Default AI Connector** dropdown under **Custom connectors**
6. Save your changes

For detailed instructions on creating connectors, refer to [Connectors](https://www.elastic.co/docs/deploy-manage/manage-connectors).

Learn more about [preconfigured connectors](https://www.elastic.co/docs/reference/kibana/connectors-kibana/pre-configured-connectors).
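
If you prefer to script connector creation instead of clicking through the UI, you can also create connectors with the Kibana Connectors API. The following is a minimal sketch in Dev Tools Console syntax for an OpenAI connector; the connector name, model, and API key are placeholders, and the exact `config` fields depend on the connector type you choose.

```console
# Sketch: create an OpenAI connector programmatically (values are placeholders)
POST kbn:/api/actions/connector
{
  "name": "my-openai-connector",
  "connector_type_id": ".gen-ai",
  "config": {
    "apiProvider": "OpenAI",
    "apiUrl": "https://api.openai.com/v1/chat/completions",
    "defaultModel": "gpt-4.1"
  },
  "secrets": {
    "apiKey": "<your-api-key>"
  }
}
```

A connector created this way appears alongside UI-created connectors, including in the **Default AI Connector** dropdown described above.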

#### Connect a local LLM

You can connect a locally hosted LLM to Elastic using the OpenAI connector. This requires your local LLM to be compatible with the OpenAI API format.

Refer to the [OpenAI connector documentation](kibana://reference/connectors-kibana/openai-action-type.md) for detailed setup instructions.
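
As a rough sketch, and assuming your local server exposes an OpenAI-compatible chat completions endpoint (for example, a local Ollama or vLLM instance), the connector configuration mirrors the previous example with `apiUrl` pointed at your own endpoint. The host, port, model name, and key below are placeholders; depending on your Kibana version, an OpenAI-compatible provider option may also be available and better suited to self-hosted endpoints.

```console
# Sketch: OpenAI connector pointed at a locally hosted, OpenAI-compatible server (placeholders)
POST kbn:/api/actions/connector
{
  "name": "my-local-llm",
  "connector_type_id": ".gen-ai",
  "config": {
    "apiProvider": "OpenAI",
    "apiUrl": "http://localhost:11434/v1/chat/completions",
    "defaultModel": "llama3.1:70b"
  },
  "secrets": {
    "apiKey": "placeholder-key"
  }
}
```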

## Connectors API

For programmatic access to connector management, refer to the [Connectors API documentation]({{kib-serverless-apis}}group/endpoint-connectors).
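
For example, a quick way to find the ID of the connector you want agents to use is to list the connectors in the current space. This is a minimal sketch in Dev Tools Console syntax; the same call can be made with curl against your Kibana URL.

```console
# Sketch: list connectors; the response includes each connector's id, name, and connector_type_id
GET kbn:/api/actions/connectors
```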

## Recommended models
> **Reviewer (Contributor):** How about we combine this section with the subsequent one? The opening line can be the one in this section followed by the list of the families.
>
> **Author (Contributor):** absolutely, no need for both headings 👍


{{agent-builder}} requires models with strong reasoning and tool-calling capabilities. State-of-the-art models perform significantly better than smaller or older models.

The following models are known to work well with {{agent-builder}}:

- **OpenAI**: GPT-4.1, GPT-4o
- **Anthropic**: Claude Sonnet 4.5, Claude Sonnet 4, Claude Sonnet 3.7
- **Google**: Gemini 2.5 Pro

### Why model quality matters

{{agent-builder}} relies on advanced LLM capabilities, including:

- **Function calling**: Models must accurately select appropriate tools and construct valid parameters from natural language requests
- **Multi-step reasoning**: Agents need to plan, execute, and adapt based on tool results across multiple iterations
- **Structured output**: Models must produce properly formatted responses that the agent framework can parse (see the sketch after this list)
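
To make the structured-output requirement concrete, the following is a hypothetical sketch of the kind of OpenAI-style chat completion response an agent run depends on: the model returns a `tool_calls` entry with a tool name and JSON-encoded arguments that the framework parses and executes. The tool name and arguments are invented for illustration, and the exact wire format varies by provider and connector.

```json
{
  "choices": [
    {
      "finish_reason": "tool_calls",
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_abc123",
            "type": "function",
            "function": {
              "name": "search_orders_index",
              "arguments": "{\"query\": \"orders from the last 7 days\", \"size\": 10}"
            }
          }
        ]
      }
    }
  ]
}
```

A model that cannot reliably emit this kind of structure, or that wraps it in extra prose, typically triggers the parsing errors shown below.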

Smaller or less capable models may produce errors like:

```console-response
Error: Invalid function call syntax
```

```console-response
Error executing agent: No tool calls found in the response.
```

While any chat-completion-compatible connector can technically be configured, we strongly recommend using state-of-the-art models for reliable agent performance.

:::{note}
GPT-4o-mini and similar smaller models are not recommended for {{agent-builder}} as they lack the necessary capabilities for reliable agent workflows.
:::

## Related resources

- [Limitations and known issues](limitations-known-issues.md): Current limitations around model selection
- [Get started](get-started.md): Initial setup and configuration
- [Connectors](/deploy-manage/manage-connectors.md): Detailed connector configuration guide
7 changes: 7 additions & 0 deletions solutions/search/elastic-agent-builder.md
@@ -54,6 +54,13 @@ To get started you need an Elastic deployment and you must enable the feature.

[**Get started with {{agent-builder}}**](agent-builder/get-started.md)

## Model selection

By default, agents use the Elastic Managed LLM, but you can configure other model providers using connectors, including local LLMs deployed on your infrastructure.

[**Learn more about model selection**](agent-builder/models.md)


## Programmatic interfaces

{{agent-builder}} provides APIs and LLM integration options for programmatic access and automation.