diff --git a/docs.json b/docs.json
index dd93bf1c..9435aeab 100644
--- a/docs.json
+++ b/docs.json
@@ -121,6 +121,7 @@
{
"group": "OpenHands Settings",
"pages": [
+ "openhands/usage/settings/llm-settings",
"openhands/usage/settings/secrets-settings",
"openhands/usage/settings/mcp-settings"
]
diff --git a/openhands/usage/advanced/search-engine-setup.mdx b/openhands/usage/advanced/search-engine-setup.mdx
index d8c7890d..a9e4c9f6 100644
--- a/openhands/usage/advanced/search-engine-setup.mdx
+++ b/openhands/usage/advanced/search-engine-setup.mdx
@@ -28,12 +28,13 @@ Once you have your Tavily API key, you can configure OpenHands to use it:
#### In the OpenHands UI
-1. Open OpenHands and navigate to the Settings page.
-2. Under the `LLM` tab, enter your Tavily API key (starting with `tvly-`) in the `Search API Key (Tavily)` field.
+1. Open OpenHands and navigate to the `Settings > LLM` page.
+2. Enter your Tavily API key (starting with `tvly-`) in the `Search API Key (Tavily)` field.
3. Click `Save` to apply the changes.
- The search API key field is optional. If you don't provide a key, the search functionality will not be available to the agent.
+ The search API key field is optional. If you don't provide a key, the search functionality will not be available to
+ the agent.
#### Using Configuration Files
diff --git a/openhands/usage/cloud/pro-subscription.mdx b/openhands/usage/cloud/pro-subscription.mdx
index 2c4876a6..6027f181 100644
--- a/openhands/usage/cloud/pro-subscription.mdx
+++ b/openhands/usage/cloud/pro-subscription.mdx
@@ -1,11 +1,11 @@
---
title: "Pro Subscription"
-description: "Learn about OpenHands Cloud Pro Subscription features and pricing."
+description: "Learn about OpenHands Cloud Pro subscription features and pricing."
---
## Overview
-The OpenHands Pro Subscription unlocks additional features and better pricing when you run OpenHands conversations in
+The OpenHands Pro subscription unlocks additional features and better pricing when you run OpenHands conversations in
OpenHands Cloud.
## Base Features
diff --git a/openhands/usage/llms/llms.mdx b/openhands/usage/llms/llms.mdx
index 7989b528..b92bf49b 100644
--- a/openhands/usage/llms/llms.mdx
+++ b/openhands/usage/llms/llms.mdx
@@ -72,6 +72,8 @@ using `-e`:
- `LLM_DISABLE_VISION`
- `LLM_CACHING_PROMPT`
+## LLM Provider Guides
+
We have a few guides for running OpenHands with specific model providers:
- [Azure](/openhands/usage/llms/azure-llms)
diff --git a/openhands/usage/settings/llm-settings.mdx b/openhands/usage/settings/llm-settings.mdx
new file mode 100644
index 00000000..2210aba2
--- /dev/null
+++ b/openhands/usage/settings/llm-settings.mdx
@@ -0,0 +1,65 @@
+---
+title: Large Language Model (LLM) Settings
+description: This page goes over how to set the LLM to use in OpenHands, as well as some additional LLM settings.
+---
+
+<Note>
+  In [OpenHands Cloud](/openhands/usage/cloud/openhands-cloud), this tab is only available to
+  [Pro subscription](/openhands/usage/cloud/pro-subscription) users.
+</Note>
+
+## Overview
+
+The LLM settings allow you to bring your own LLM and API key to use with OpenHands. This can be any model
+supported by LiteLLM, but OpenHands requires a powerful model to work properly.
+[See our recommended models here](/openhands/usage/llms/llms#model-recommendations). You can also configure some
+additional LLM settings on this page.
+
+## Basic LLM Settings
+
+The most popular providers and models are available in the basic settings. Some providers have been verified to
+work with OpenHands, such as the [OpenHands provider](/openhands/usage/llms/openhands-llms), Anthropic, OpenAI, and
+Mistral AI.
+
+
+1. Choose your preferred provider using the `LLM Provider` dropdown.
+2. Choose your preferred model using the `LLM Model` dropdown.
+3. Set the API key for your chosen provider and model and click `Save Changes`.
+
+This will set the LLM for all new conversations. If you want to use the new LLM in an existing conversation, you
+must restart that conversation first.
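+
+When running OpenHands locally, a minimal `config.toml` sketch of the equivalent settings could look like this (the
+model name below is illustrative, not a recommendation):
+
+```toml
+# The model is prefixed with its provider name, following the LiteLLM convention.
+[llm]
+model = "anthropic/claude-sonnet-4-20250514"
+api_key = "your-api-key"
+```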
+
+## Advanced LLM Settings
+
+Toggling the `Advanced` settings allows you to set custom models as well as some additional LLM settings. Use this
+when your preferred provider or model does not appear in the basic settings dropdowns.
+
+
+1. Set your custom model, with the provider as the prefix. For information on how to specify the custom model,
+   follow [the specific provider docs on LiteLLM](https://docs.litellm.ai/docs/providers). We also have
+   [some guides for popular providers](/openhands/usage/llms/llms#llm-provider-guides).
+2. If your provider has a specific base URL, specify it here.
+3. Set the API key for your custom model and click `Save Changes`.
+
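+These advanced fields map to `config.toml` as well; as a sketch, a custom OpenAI-compatible endpoint could be
+configured like this (the model name and URL are placeholders):
+
+```toml
+# `base_url` points OpenHands at the provider's API endpoint.
+[llm]
+model = "openai/my-hosted-model"
+base_url = "https://llm.example.com/v1"
+api_key = "your-api-key"
+```
+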
+
+### Memory Condensation
+
+The memory condenser manages the language model's context by ensuring only the most important and relevant information
+is presented. Keeping the context focused improves latency and reduces token consumption, especially in long-running
+conversations.
+
+- `Enable memory condensation` - Turns the memory condenser on or off.
+- `Memory condenser max history size` - The condenser will summarize the history after this many events.