Merged
1 change: 1 addition & 0 deletions docs.json
@@ -121,6 +121,7 @@
     {
       "group": "OpenHands Settings",
       "pages": [
+        "openhands/usage/settings/llm-settings",
         "openhands/usage/settings/secrets-settings",
         "openhands/usage/settings/mcp-settings"
       ]
7 changes: 4 additions & 3 deletions openhands/usage/advanced/search-engine-setup.mdx
@@ -28,12 +28,13 @@ Once you have your Tavily API key, you can configure OpenHands to use it:
 
 #### In the OpenHands UI
 
-1. Open OpenHands and navigate to the Settings page.
-2. Under the `LLM` tab, enter your Tavily API key (starting with `tvly-`) in the `Search API Key (Tavily)` field.
+1. Open OpenHands and navigate to the `Settings > LLM` page.
+2. Enter your Tavily API key (starting with `tvly-`) in the `Search API Key (Tavily)` field.
 3. Click `Save` to apply the changes.
 
 <Note>
-The search API key field is optional. If you don't provide a key, the search functionality will not be available to the agent.
+The search API key field is optional. If you don't provide a key, the search functionality will not be available to
+the agent.
 </Note>
 
 #### Using Configuration Files
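The configuration-file section itself is elided from this diff. As a rough sketch of what that route might look like, the key could be set in `config.toml`; note that the `[core]` section and the `search_api_key` key name are assumptions here, not confirmed by this diff:

```toml
# Hypothetical sketch; the section and key name are assumptions.
[core]
search_api_key = "tvly-your-key-here"
```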
4 changes: 2 additions & 2 deletions openhands/usage/cloud/pro-subscription.mdx
@@ -1,11 +1,11 @@
 ---
 title: "Pro Subscription"
-description: "Learn about OpenHands Cloud Pro Subscription features and pricing."
+description: "Learn about OpenHands Cloud Pro subscription features and pricing."
 ---
 
 ## Overview
 
-The OpenHands Pro Subscription unlocks additional features and better pricing when you run OpenHands conversations in
+The OpenHands Pro subscription unlocks additional features and better pricing when you run OpenHands conversations in
 OpenHands Cloud.
 
 ## Base Features
2 changes: 2 additions & 0 deletions openhands/usage/llms/llms.mdx
@@ -72,6 +72,8 @@ using `-e`:
 - `LLM_DISABLE_VISION`
 - `LLM_CACHING_PROMPT`

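For context, variables like these are passed to the container at launch with `-e`. A minimal sketch, where the image tag is a placeholder and the invocation shape (an interactive `docker run`) is an assumption, not taken from this diff:

```shell
# Hypothetical sketch: pass LLM options into the OpenHands container
# as environment variables. The image tag below is a placeholder.
docker run -it --rm \
  -e LLM_DISABLE_VISION="true" \
  -e LLM_CACHING_PROMPT="true" \
  docker.all-hands.dev/all-hands-ai/openhands:latest
```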
+## LLM Provider Guides
+
 We have a few guides for running OpenHands with specific model providers:
 
 - [Azure](/openhands/usage/llms/azure-llms)
65 changes: 65 additions & 0 deletions openhands/usage/settings/llm-settings.mdx
@@ -0,0 +1,65 @@
---
title: Language Model (LLM) Settings
description: How to set the LLM to use in OpenHands, along with some additional LLM settings.
---

<Note>
In [OpenHands Cloud](/openhands/usage/cloud/openhands-cloud), this tab is only available to
[Pro subscription](/openhands/usage/cloud/pro-subscription) users.
</Note>

## Overview

The LLM settings allow you to bring your own LLM and API key to use with OpenHands. This can be any model supported
by LiteLLM, although a powerful model is required for OpenHands to work properly.
[See our recommended models here](/openhands/usage/llms/llms#model-recommendations). You can also configure some
additional LLM settings on this page.

## Basic LLM Settings

The most popular providers and models are available in the basic settings. Some providers have been verified to
work with OpenHands, such as the [OpenHands provider](/openhands/usage/llms/openhands-llms), Anthropic, OpenAI, and
Mistral AI.

<Steps>
<Step title="Select Provider">
Choose your preferred provider using the `LLM Provider` dropdown.
</Step>
<Step title="Select Model">
    Choose your preferred model using the `LLM Model` dropdown.
</Step>
<Step title="Set API Key">
    Set the API key for your chosen provider and model, then click `Save Changes`.
</Step>
</Steps>

This sets the LLM for all new conversations. To use the new LLM in an older conversation, you must first restart that
conversation.

## Advanced LLM Settings

Toggling the `Advanced` settings allows you to set custom models as well as some additional LLM options. Use this when
your preferred provider or model does not appear in the basic settings dropdowns.

<Steps>
<Step title="Set Custom Model">
    Set your custom model, with the provider as the prefix. For details on how to specify a custom model,
    follow [the provider-specific docs on LiteLLM](https://docs.litellm.ai/docs/providers). We also have
    [some guides for popular providers](/openhands/usage/llms/llms#llm-provider-guides).
</Step>
<Step title="Set Base URL (Optional)">
If your provider has a specific base URL, specify it here.
</Step>
<Step title="Set API Key">
Set the API key for your custom model and click `Save Changes`.
</Step>
</Steps>
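For headless or CLI use, roughly the same options can also be expressed in a `config.toml` file. The sketch below assumes an `[llm]` section with `model`, `base_url`, and `api_key` keys; the values are placeholders:

```toml
# Illustrative sketch; section and key names are assumptions, values are placeholders.
[llm]
model = "openrouter/anthropic/claude-sonnet-4"
base_url = "https://openrouter.ai/api/v1"
api_key = "your-api-key"
```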

### Memory Condensation

The memory condenser manages the language model's context by ensuring only the most important and relevant information
is presented. Keeping the context focused improves latency and reduces token consumption, especially in long-running
conversations.

- `Enable memory condensation` - Turn on this setting to activate this feature.
- `Memory condenser max history size` - The condenser will summarize the history after this many events.
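As a conceptual illustration of what condensation does (this is not OpenHands' actual implementation; the function names and the summarization strategy are invented for the sketch):

```python
def condense(history, max_size, summarize):
    """Keep the event history bounded: once it grows past max_size,
    fold the oldest events into a single summary entry."""
    if len(history) <= max_size:
        return history
    keep = max_size // 2                      # recent events kept verbatim
    old, recent = history[:-keep], history[-keep:]
    return [summarize(old)] + recent          # summary replaces the old events

# Example: summarize by counting the folded events.
events = [f"event-{i}" for i in range(10)]
condensed = condense(events, max_size=6,
                     summarize=lambda e: f"<summary of {len(e)} events>")
```

The key property is that the recent tail of the conversation stays verbatim while older context is compressed, which is what keeps long-running conversations within the model's context window.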