Merged

18 commits
8f8b70f
Update package name from openhands-ai to openhands
openhands-agent Oct 7, 2025
0b3aae0
Apply suggestion from @malhotra5
malhotra5 Oct 7, 2025
fa3ec1b
Make executable binary the primary CLI installation method
openhands-agent Oct 7, 2025
4a12b92
Update CLI page title to 'CLI (V1)' and add migration note
openhands-agent Oct 7, 2025
837045b
Remove /init command documentation
openhands-agent Oct 7, 2025
b586d60
Update configuration file references to use JSON format
openhands-agent Oct 7, 2025
c1cd182
Update MCP command documentation to reflect read-only functionality
openhands-agent Oct 7, 2025
64dddb8
Add MCP migration note for pre-1.X.X users
openhands-agent Oct 7, 2025
33b2bb9
Remove Poetry installation note
openhands-agent Oct 7, 2025
c6171f9
Remove outdated migration note for version 0.44
openhands-agent Oct 7, 2025
0ca3032
Update conversation storage location in Getting Started section
openhands-agent Oct 7, 2025
1bc7a10
Update Docker CLI command to use separate CLI installation
openhands-agent Oct 7, 2025
873af0c
Add configuration setup requirement for Docker CLI
openhands-agent Oct 7, 2025
7ed9f9a
Apply suggestion from @malhotra5
malhotra5 Oct 7, 2025
55f98d4
Remove pip install option from CLI mode docs
openhands-agent Oct 7, 2025
dc61d3d
Merge branch 'main' into update-package-name-to-openhands
malhotra5 Oct 10, 2025
96d0733
Apply suggestion from @malhotra5
malhotra5 Oct 10, 2025
b7bfe8d
Merge branch 'main' into update-package-name-to-openhands
malhotra5 Oct 10, 2025
119 changes: 40 additions & 79 deletions openhands/usage/run-openhands/cli-mode.mdx
@@ -1,12 +1,16 @@
---
title: CLI
title: CLI (V1)
description: The Command-Line Interface (CLI) provides a powerful interface that lets you engage with OpenHands
directly from your terminal.
---

This mode is different from the [headless mode](/openhands/usage/run-openhands/headless-mode), which is non-interactive and better
for scripting.

<Note>
If you're upgrading from a CLI version earlier than release 1.X.X, you'll need to reconfigure your settings, since the configuration format has changed.
</Note>

<iframe
className="w-full aspect-video"
src="https://www.youtube.com/embed/PfvIx4y8h7w"
@@ -18,36 +22,30 @@ for scripting.

## Getting Started

### Running with Python

**Note** - OpenHands requires Python version 3.12 or higher (Python 3.14 is not currently supported) and `uv` for the default `fetch` MCP server (more details below).
### Using the Executable Binary (Recommended)

#### Recommended: Using uv
1. **Download the executable binary** from the [OpenHands release page](https://github.com/All-Hands-AI/OpenHands/releases/tag/1.0.0-cli)

We recommend using [uv](https://docs.astral.sh/uv/) for the best OpenHands experience. uv provides better isolation from your current project's virtual environment and is required for OpenHands' default MCP servers.

1. **Install uv** (if you haven't already):

See the [uv installation guide](https://docs.astral.sh/uv/getting-started/installation/) for the latest installation instructions for your platform.
2. **Make it executable**:
```bash
chmod +x ./openhands
```

2. **Launch OpenHands CLI**:
3. **Run the executable**:
```bash
uvx --python 3.12 --from openhands-ai openhands
./openhands
```

<AccordionGroup>

<Accordion title="Alternative: Traditional pip installation">
<Accordion title="Alternative: Using uv">

If you prefer to use pip:
Requires Python 3.12+ and `uv` to be installed.

```bash
# Install OpenHands
pip install openhands-ai
uvx --python 3.12 openhands
```

Note that you'll still need `uv` installed for the default MCP servers to work properly.

</Accordion>

<Accordion title="Create shell aliases for easy access across environments">
@@ -56,8 +54,8 @@ Add the following to your shell configuration file (`.bashrc`, `.zshrc`, etc.):

```bash
# Add OpenHands aliases (recommended)
alias openhands="uvx --python 3.12 --from openhands-ai openhands"
alias oh="uvx --python 3.12 --from openhands-ai openhands"
alias openhands="uvx --python 3.12 openhands"
alias oh="uvx --python 3.12 openhands"
```

After adding these lines, reload your shell configuration with `source ~/.bashrc` or `source ~/.zshrc` (depending on your shell).
@@ -74,7 +72,7 @@ cd ~
uv venv .openhands-venv --python 3.12

# Install OpenHands in the virtual environment
uv pip install -t ~/.openhands-venv/lib/python3.12/site-packages openhands-ai
uv pip install -t ~/.openhands-venv/lib/python3.12/site-packages openhands

# Add the bin directory to your PATH in your shell configuration file
echo 'export PATH="$PATH:$HOME/.openhands-venv/bin"' >> ~/.bashrc # or ~/.zshrc
@@ -87,50 +85,39 @@ source ~/.bashrc # or source ~/.zshrc

</AccordionGroup>

<Note>
If you have cloned the repository, you can also run the CLI directly using Poetry:

poetry run openhands
</Note>

3. Set your model, API key, and other preferences using the UI (or alternatively environment variables, below).

This command opens an interactive prompt where you can type tasks or commands and get responses from OpenHands.
The first time you run the CLI, it will take you through configuring the required LLM
settings. These will be saved for future sessions.

The conversation history will be saved in `~/.openhands/sessions`.
The conversation history will be saved in `~/.openhands/conversations`.

### Running with Docker

1. Set the following environment variables in your terminal:
1. Set the following environment variable in your terminal:
- `SANDBOX_VOLUMES` to specify the directory you want OpenHands to access ([See using SANDBOX_VOLUMES for more info](/openhands/usage/runtimes/docker#using-sandbox_volumes))
- `LLM_MODEL` - the LLM model to use (e.g. `export LLM_MODEL="anthropic/claude-sonnet-4-20250514"` or `export LLM_MODEL="anthropic/claude-sonnet-4-5-20250929"`)
- `LLM_API_KEY` - your API key (e.g. `export LLM_API_KEY="sk_test_12345"`)

2. Run the following command:
2. Ensure you have configured your settings before starting:
- Set up `~/.openhands/settings.json` with your LLM configuration
- Optionally configure `~/.openhands/mcp.json` for MCP servers

3. Run the following command:

```bash
docker run -it \
--pull=always \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.59-nikolaik \
-e SANDBOX_USER_ID=$(id -u) \
-e SANDBOX_VOLUMES=$SANDBOX_VOLUMES \
-e LLM_API_KEY=$LLM_API_KEY \
-e LLM_MODEL=$LLM_MODEL \
-v /var/run/docker.sock:/var/run/docker.sock \
-v ~/.openhands:/.openhands \
-v ~/.openhands:/root/.openhands \
--add-host host.docker.internal:host-gateway \
--name openhands-app-$(date +%Y%m%d%H%M%S) \
docker.all-hands.dev/all-hands-ai/openhands:0.59 \
python -m openhands.cli.entry --override-cli-mode true
--name openhands-cli-$(date +%Y%m%d%H%M%S) \
python:3.12-slim \
bash -c "pip install uv && uvx --python 3.12 openhands"
```

<Note>
If you used OpenHands before version 0.44, you may want to run `mv ~/.openhands-state ~/.openhands` to migrate your
conversation history to the new location.
</Note>

This launches the CLI in Docker, allowing you to interact with OpenHands.

The `-e SANDBOX_USER_ID=$(id -u)` is passed to the Docker command to ensure the sandbox user matches the host user’s
@@ -158,12 +145,11 @@ You can use the following commands whenever the prompt (`>`) is displayed:
|--------------|----------------------------------------------------------------|
| `/help` | Show all available interactive commands and their descriptions |
| `/exit` | Exit the application |
| `/init` | Initialize a new repository for agent exploration |
| `/status` | Show conversation details and usage metrics |
| `/new` | Start a new conversation |
| `/settings` | View and modify current LLM/agent settings |
| `/resume` | Resume the agent if paused |
| `/mcp` | Manage MCP server configuration and view connection errors |
| `/mcp` | View active MCP servers and pending configuration changes |

#### Settings and Configuration

@@ -173,12 +159,7 @@ follow the prompts:
- **Basic settings**: Choose a model/provider and enter your API key.
- **Advanced settings**: Set custom endpoints, enable or disable confirmation mode, and configure memory condensation.

Settings can also be managed via the `config.toml` file in the current directory or `~/.openhands/config.toml`.

#### Repository Initialization

The `/init` command helps the agent understand your project by creating a `.openhands/microagents/repo.md` file with
project details and structure. Use this when onboarding the agent to a new codebase.
Settings can also be managed via the `~/.openhands/settings.json` file.
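
For orientation only, a `~/.openhands/settings.json` might resemble the sketch below. The key names here are assumptions rather than a documented schema (the file is normally written by the `/settings` flow), and the values reuse the examples from the Docker section above; prefer editing settings through `/settings` rather than by hand.

```json
{
  "llm_model": "anthropic/claude-sonnet-4-5-20250929",
  "llm_api_key": "sk_test_12345"
}
```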

#### Agent Pause/Resume Feature

@@ -187,38 +168,18 @@ type `/resume` at the prompt.

#### MCP Server Management

To configure Model Context Protocol (MCP) servers, you can refer to the documentation on [MCP servers](/openhands/usage/settings/mcp-settings) and use the `/mcp` command in the CLI. This command provides an interactive interface for managing Model Context Protocol (MCP) servers:

- **List configured servers**: View all currently configured MCP servers (SSE, Stdio, and SHTTP)
- **Add new server**: Interactively add a new MCP server with guided prompts
- **Remove server**: Remove an existing MCP server from your configuration
- **View errors**: Display any connection errors that occurred during MCP server startup

This command modifies your `~/.openhands/config.toml` file and will prompt you to restart OpenHands for changes to take effect.

By default, the [Fetch MCP server](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch) will be automatically configured for OpenHands. You can also [enable search engine](/openhands/usage/advanced/search-engine-setup) via the [Tavily MCP server](https://github.com/tavily-ai/tavily-mcp) by setting the `search_api_key` under the `[core]` section in the `~/.openhands/config.toml` file.

##### Example of the `config.toml` file with MCP server configuration:
<Note>
If you're upgrading from a version before release 1.X.X, you'll need to redo your MCP server configuration as the format has changed from TOML to JSON.
</Note>

```toml
[core]
search_api_key = "tvly-your-api-key-here"
To configure Model Context Protocol (MCP) servers, create the `~/.openhands/mcp.json` file by hand, following the configuration format outlined at [https://gofastmcp.com/clients/client#configuration-format](https://gofastmcp.com/clients/client#configuration-format); an illustrative sketch appears at the end of this section.

[mcp]
stdio_servers = [
{name="fetch", command="uvx", args=["mcp-server-fetch"]},
]
The `/mcp` command in the CLI provides a read-only view of MCP server status:

sse_servers = [
# Basic SSE server with just a URL
"http://example.com:8080/sse",
]
- **View active servers**: Shows which MCP servers are currently active in the conversation
- **View pending changes**: If the `mcp.json` file has been modified, shows which servers will be mounted when the conversation is restarted

shttp_servers = [
# Streamable HTTP server with API key authentication
{url="https://secure-example.com/mcp", api_key="your-api-key"}
]
```
By default, the [Fetch MCP server](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch) will be automatically configured for OpenHands.
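
As a rough illustration, a minimal `~/.openhands/mcp.json` might look like the sketch below. It assumes the standard `mcpServers` layout from the gofastmcp configuration format linked above, with the default Fetch server plus a placeholder remote server; verify the exact field names against that reference before relying on them.

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "docs": {
      "url": "https://example.com/mcp",
      "transport": "http"
    }
  }
}
```

After editing the file, use `/mcp` in the CLI to see which servers are currently active and which changes will take effect once the conversation is restarted.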

## Tips and Troubleshooting

2 changes: 1 addition & 1 deletion openhands/usage/run-openhands/gui-mode.mdx
@@ -15,7 +15,7 @@ You can launch the OpenHands GUI server directly from the command line using the

<Info>
**Prerequisites**: You need to have the [OpenHands CLI installed](/usage/run-openhands/cli-mode) first, OR have `uv`
installed and run `uvx --python 3.12 --from openhands-ai openhands serve`. Otherwise, you'll need to use Docker
installed and run `uvx --python 3.12 openhands serve`. Otherwise, you'll need to use Docker
directly (see the [Docker section](#using-docker-directly) below).
</Info>

8 changes: 4 additions & 4 deletions openhands/usage/run-openhands/local-setup.mdx
@@ -84,13 +84,13 @@ See the [uv installation guide](https://docs.astral.sh/uv/getting-started/instal
**Launch OpenHands**:
```bash
# Launch the GUI server
uvx --python 3.12 --from openhands-ai openhands serve
uvx --python 3.12 openhands serve

# Or with GPU support (requires nvidia-docker)
uvx --python 3.12 --from openhands-ai openhands serve --gpu
uvx --python 3.12 openhands serve --gpu

# Or with current directory mounted
uvx --python 3.12 --from openhands-ai openhands serve --mount-cwd
uvx --python 3.12 openhands serve --mount-cwd
```

This will automatically handle Docker requirements checking, image pulling, and launching the GUI server. The `--gpu` flag enables GPU support via nvidia-docker, and `--mount-cwd` mounts your current directory into the container.
Expand All @@ -101,7 +101,7 @@ If you prefer to use pip and have Python 3.12+ installed:

```bash
# Install OpenHands
pip install openhands-ai
pip install openhands

# Launch the GUI server
openhands serve
2 changes: 1 addition & 1 deletion openhands/usage/windows-without-wsl.mdx
@@ -162,7 +162,7 @@ After installation, restart your PowerShell session to ensure the environment va
After installing the prerequisites, you can install and run OpenHands with:

```powershell
uvx --python 3.12 --from openhands-ai openhands
uvx --python 3.12 openhands
```

### Troubleshooting CLI Issues