diff --git a/openhands/usage/run-openhands/cli-mode.mdx b/openhands/usage/run-openhands/cli-mode.mdx
index 69c6df3b..a936b35b 100644
--- a/openhands/usage/run-openhands/cli-mode.mdx
+++ b/openhands/usage/run-openhands/cli-mode.mdx
@@ -1,5 +1,5 @@
---
-title: CLI
+title: CLI (V1)
description: The Command-Line Interface (CLI) provides a powerful interface that lets you engage with OpenHands
directly from your terminal.
---
@@ -7,6 +7,10 @@ description: The Command-Line Interface (CLI) provides a powerful interface that
This mode is different from the [headless mode](/openhands/usage/run-openhands/headless-mode), which is non-interactive and better
for scripting.
+
+If you're upgrading from a CLI version before release 1.X.X, you'll need to reconfigure your settings, as the configuration format has changed.
+
VIDEO
+
-If you prefer to use pip:
+Requires Python 3.12 or later and `uv` to be installed.
```bash
-# Install OpenHands
-pip install openhands-ai
+uvx --python 3.12 openhands
```
-Note that you'll still need `uv` installed for the default MCP servers to work properly.
-
@@ -56,8 +54,8 @@ Add the following to your shell configuration file (`.bashrc`, `.zshrc`, etc.):
```bash
# Add OpenHands aliases (recommended)
-alias openhands="uvx --python 3.12 --from openhands-ai openhands"
-alias oh="uvx --python 3.12 --from openhands-ai openhands"
+alias openhands="uvx --python 3.12 openhands"
+alias oh="uvx --python 3.12 openhands"
```
After adding these lines, reload your shell configuration with `source ~/.bashrc` or `source ~/.zshrc` (depending on your shell).
@@ -74,7 +72,7 @@ cd ~
uv venv .openhands-venv --python 3.12
# Install OpenHands in the virtual environment
-uv pip install -t ~/.openhands-venv/lib/python3.12/site-packages openhands-ai
+uv pip install -t ~/.openhands-venv/lib/python3.12/site-packages openhands
# Add the bin directory to your PATH in your shell configuration file
echo 'export PATH="$PATH:$HOME/.openhands-venv/bin"' >> ~/.bashrc # or ~/.zshrc
@@ -87,28 +85,24 @@ source ~/.bashrc # or source ~/.zshrc
-
- If you have cloned the repository, you can also run the CLI directly using Poetry:
-
- poetry run openhands
-
-
3. Set your model, API key, and other preferences using the UI (or alternatively environment variables, below).
This command opens an interactive prompt where you can type tasks or commands and get responses from OpenHands.
The first time you run the CLI, it will take you through configuring the required LLM
settings. These will be saved for future sessions.
-The conversation history will be saved in `~/.openhands/sessions`.
+The conversation history will be saved in `~/.openhands/conversations`.
### Running with Docker
-1. Set the following environment variables in your terminal:
+1. Set the following environment variable in your terminal:
- `SANDBOX_VOLUMES` to specify the directory you want OpenHands to access ([See using SANDBOX_VOLUMES for more info](/openhands/usage/runtimes/docker#using-sandbox_volumes))
- - `LLM_MODEL` - the LLM model to use (e.g. `export LLM_MODEL="anthropic/claude-sonnet-4-20250514"` or `export LLM_MODEL="anthropic/claude-sonnet-4-5-20250929"`)
- - `LLM_API_KEY` - your API key (e.g. `export LLM_API_KEY="sk_test_12345"`)
-2. Run the following command:
+2. Ensure you have configured your settings before starting:
+ - Set up `~/.openhands/settings.json` with your LLM configuration
+ - Optionally configure `~/.openhands/mcp.json` for MCP servers
+
+3. Run the following command:
```bash
docker run -it \
@@ -116,21 +110,14 @@ docker run -it \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.59-nikolaik \
-e SANDBOX_USER_ID=$(id -u) \
-e SANDBOX_VOLUMES=$SANDBOX_VOLUMES \
- -e LLM_API_KEY=$LLM_API_KEY \
- -e LLM_MODEL=$LLM_MODEL \
-v /var/run/docker.sock:/var/run/docker.sock \
- -v ~/.openhands:/.openhands \
+ -v ~/.openhands:/root/.openhands \
--add-host host.docker.internal:host-gateway \
- --name openhands-app-$(date +%Y%m%d%H%M%S) \
- docker.all-hands.dev/all-hands-ai/openhands:0.59 \
- python -m openhands.cli.entry --override-cli-mode true
+ --name openhands-cli-$(date +%Y%m%d%H%M%S) \
+ python:3.12-slim \
+ bash -c "pip install uv && uvx --python 3.12 openhands"
```
-
- If you used OpenHands before version 0.44, you may want to run `mv ~/.openhands-state ~/.openhands` to migrate your
- conversation history to the new location.
-
-
This launches the CLI in Docker, allowing you to interact with OpenHands.
The `-e SANDBOX_USER_ID=$(id -u)` is passed to the Docker command to ensure the sandbox user matches the host user’s
@@ -158,12 +145,11 @@ You can use the following commands whenever the prompt (`>`) is displayed:
|--------------|----------------------------------------------------------------|
| `/help` | Show all available interactive commands and their descriptions |
| `/exit` | Exit the application |
-| `/init` | Initialize a new repository for agent exploration |
| `/status` | Show conversation details and usage metrics |
| `/new` | Start a new conversation |
| `/settings` | View and modify current LLM/agent settings |
| `/resume` | Resume the agent if paused |
-| `/mcp` | Manage MCP server configuration and view connection errors |
+| `/mcp` | View active MCP servers and pending configuration changes |
#### Settings and Configuration
@@ -173,12 +159,7 @@ follow the prompts:
- **Basic settings**: Choose a model/provider and enter your API key.
- **Advanced settings**: Set custom endpoints, enable or disable confirmation mode, and configure memory condensation.
-Settings can also be managed via the `config.toml` file in the current directory or `~/.openhands/config.toml`.
-
-#### Repository Initialization
-
-The `/init` command helps the agent understand your project by creating a `.openhands/microagents/repo.md` file with
-project details and structure. Use this when onboarding the agent to a new codebase.
+Settings can also be managed via the `~/.openhands/settings.json` file.
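+
+As an illustrative sketch only: a `settings.json` might resemble the snippet below. The exact schema depends on your installed version, and the field names shown here (`llm_model`, `llm_api_key`, `llm_base_url`) are hypothetical — run `/settings` in the CLI to generate the file with the correct keys rather than writing it by hand.
+
+```json
+{
+  "llm_model": "anthropic/claude-sonnet-4-5-20250929",
+  "llm_api_key": "sk-your-api-key",
+  "llm_base_url": null
+}
+```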
#### Agent Pause/Resume Feature
@@ -187,38 +168,18 @@ type `/resume` at the prompt.
#### MCP Server Management
-To configure Model Context Protocol (MCP) servers, you can refer to the documentation on [MCP servers](/openhands/usage/settings/mcp-settings) and use the `/mcp` command in the CLI. This command provides an interactive interface for managing Model Context Protocol (MCP) servers:
-
-- **List configured servers**: View all currently configured MCP servers (SSE, Stdio, and SHTTP)
-- **Add new server**: Interactively add a new MCP server with guided prompts
-- **Remove server**: Remove an existing MCP server from your configuration
-- **View errors**: Display any connection errors that occurred during MCP server startup
-
-This command modifies your `~/.openhands/config.toml` file and will prompt you to restart OpenHands for changes to take effect.
-
-By default, the [Fetch MCP server](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch) will be automatically configured for OpenHands. You can also [enable search engine](/openhands/usage/advanced/search-engine-setup) via the [Tavily MCP server](https://github.com/tavily-ai/tavily-mcp) by setting the `search_api_key` under the `[core]` section in the `~/.openhands/config.toml` file.
-
-##### Example of the `config.toml` file with MCP server configuration:
+
+If you're upgrading from a version before release 1.X.X, you'll need to reconfigure your MCP servers, as the configuration format has changed from TOML to JSON.
+
-```toml
-[core]
-search_api_key = "tvly-your-api-key-here"
+To configure Model Context Protocol (MCP) servers, manually create the `~/.openhands/mcp.json` file, following the [configuration format](https://gofastmcp.com/clients/client#configuration-format) documented by FastMCP.
-[mcp]
-stdio_servers = [
- {name="fetch", command="uvx", args=["mcp-server-fetch"]},
-]
+The `/mcp` command in the CLI provides a read-only view of MCP server status:
-sse_servers = [
- # Basic SSE server with just a URL
- "http://example.com:8080/sse",
-]
+- **View active servers**: Shows which MCP servers are currently active in the conversation
+- **View pending changes**: If `mcp.json` has been modified, shows which servers will be mounted when the conversation is restarted
-shttp_servers = [
- # Streamable HTTP server with API key authentication
- {url="https://secure-example.com/mcp", api_key="your-api-key"}
-]
-```
+By default, the [Fetch MCP server](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch) will be automatically configured for OpenHands.
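+
+As a sketch of the format above (the `docs` server name and its URL are placeholders, not defaults), an `mcp.json` combining a stdio server and a remote server might look like:
+
+```json
+{
+  "mcpServers": {
+    "fetch": {
+      "command": "uvx",
+      "args": ["mcp-server-fetch"]
+    },
+    "docs": {
+      "url": "https://example.com/mcp"
+    }
+  }
+}
+```
+
+After editing this file, the `/mcp` command shows which servers will be mounted the next time the conversation is restarted.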
## Tips and Troubleshooting
diff --git a/openhands/usage/run-openhands/gui-mode.mdx b/openhands/usage/run-openhands/gui-mode.mdx
index cbd4058a..f248e6a9 100644
--- a/openhands/usage/run-openhands/gui-mode.mdx
+++ b/openhands/usage/run-openhands/gui-mode.mdx
@@ -15,7 +15,7 @@ You can launch the OpenHands GUI server directly from the command line using the
**Prerequisites**: You need to have the [OpenHands CLI installed](/usage/run-openhands/cli-mode) first, OR have `uv`
-installed and run `uvx --python 3.12 --from openhands-ai openhands serve`. Otherwise, you'll need to use Docker
+installed and run `uvx --python 3.12 openhands serve`. Otherwise, you'll need to use Docker
directly (see the [Docker section](#using-docker-directly) below).
diff --git a/openhands/usage/run-openhands/local-setup.mdx b/openhands/usage/run-openhands/local-setup.mdx
index 3ae6c5cc..b665184b 100644
--- a/openhands/usage/run-openhands/local-setup.mdx
+++ b/openhands/usage/run-openhands/local-setup.mdx
@@ -84,13 +84,13 @@ See the [uv installation guide](https://docs.astral.sh/uv/getting-started/instal
**Launch OpenHands**:
```bash
# Launch the GUI server
-uvx --python 3.12 --from openhands-ai openhands serve
+uvx --python 3.12 openhands serve
# Or with GPU support (requires nvidia-docker)
-uvx --python 3.12 --from openhands-ai openhands serve --gpu
+uvx --python 3.12 openhands serve --gpu
# Or with current directory mounted
-uvx --python 3.12 --from openhands-ai openhands serve --mount-cwd
+uvx --python 3.12 openhands serve --mount-cwd
```
This will automatically handle Docker requirements checking, image pulling, and launching the GUI server. The `--gpu` flag enables GPU support via nvidia-docker, and `--mount-cwd` mounts your current directory into the container.
@@ -101,7 +101,7 @@ If you prefer to use pip and have Python 3.12+ installed:
```bash
# Install OpenHands
-pip install openhands-ai
+pip install openhands
# Launch the GUI server
openhands serve
diff --git a/openhands/usage/windows-without-wsl.mdx b/openhands/usage/windows-without-wsl.mdx
index f09d874a..a1bad745 100644
--- a/openhands/usage/windows-without-wsl.mdx
+++ b/openhands/usage/windows-without-wsl.mdx
@@ -162,7 +162,7 @@ After installation, restart your PowerShell session to ensure the environment va
After installing the prerequisites, you can install and run OpenHands with:
```powershell
-uvx --python 3.12 --from openhands-ai openhands
+uvx --python 3.12 openhands
```
### Troubleshooting CLI Issues