
Conversation

@vbs-0

@vbs-0 vbs-0 commented May 14, 2025

Description

This PR improves the Azure OpenAI integration documentation to address issue #8412. The documentation was unclear about environment variables and authentication methods, especially for Azure AD token authentication in headless mode.

Changes Made

  • Added clear separation between API Key and Azure AD Token authentication methods
  • Documented all relevant environment variables with examples
  • Added detailed examples for both authentication methods
  • Improved UI configuration instructions
  • Added a troubleshooting section
  • Fixed formatting and organization

Changelog Message

Added comprehensive documentation for Azure OpenAI integration, including both API key and AD token authentication methods, with clear examples and troubleshooting guide.

Type of Change

  • Documentation update
  • Bug fix
  • New feature
  • Breaking change
  • Other (please describe):

Testing Done

  • Verified all environment variables mentioned are correct
  • Confirmed documentation aligns with LiteLLM implementation
  • Checked formatting in both light and dark modes
  • Validated all links work correctly

Related Issues

Fixes #8412

Additional Notes

The documentation now includes both authentication methods with clear examples, which should help users avoid confusion about which environment variables to use.
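
For quick reference, a condensed sketch of the two variable sets described above (all values are placeholders):

```bash
# API Key authentication (placeholder values)
LLM_API_VERSION="2024-02-15-preview"                    # Azure OpenAI API version
LLM_BASE_URL="https://your-resource.openai.azure.com/"  # Azure endpoint
LLM_API_KEY="your-api-key"                              # Azure API key
LLM_MODEL="azure/your-deployment-name"                  # azure/<deployment-name>

# Azure AD Token authentication (placeholder values)
LLM_API_VERSION="2024-02-15-preview"
LLM_BASE_URL="https://your-resource.openai.azure.com/"
AZURE_AD_TOKEN="your-ad-token"                          # Azure AD access token
LLM_MODEL="azure/your-deployment-name"
```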

vbs-0 added 4 commits May 13, 2025 08:47
- Add clear instructions for both API key and AD token authentication
- Clarify required environment variables
- Add troubleshooting section
- Fix formatting and organization
- Addresses issue OpenHands#8412
@vbs-0 vbs-0 requested a review from mamoodi as a code owner May 14, 2025 03:25

When running OpenHands, you'll need to set the following environment variable using `-e` in the
[docker run command](../installation#running-openhands):
OpenHands supports two authentication methods for Azure OpenAI:
Collaborator

Suggested change
OpenHands supports two authentication methods for Azure OpenAI:
OpenHands supports the following authentication methods for Azure OpenAI:

[docker run command](../installation#running-openhands):
OpenHands supports two authentication methods for Azure OpenAI:

### 1. API Key Authentication
Collaborator

Suggested change
### 1. API Key Authentication
### API Key Authentication

```

Example:
### 2. Azure AD Token Authentication
Collaborator

Suggested change
### 2. Azure AD Token Authentication
### Azure AD Token Authentication

- `Custom Model` to azure/<deployment-name>
- `Base URL` to your Azure API Base URL (e.g. `https://example-endpoint.openai.azure.com`)
- `API Key` to your Azure API key
- `LLM Provider` to `Azure`
Collaborator

Suggested change
- `LLM Provider` to `Azure`

```bash
# Required
LLM_API_VERSION="<api-version>" # e.g. "2024-02-15-preview"
LLM_BASE_URL="<azure-endpoint>" # e.g. "https://your-resource.openai.azure.com/"
Collaborator

If using the UI, you don't need to set:
LLM_BASE_URL
LLM_API_KEY
LLM_MODEL

because what you set in the settings in the UI will overwrite these.

@pcuci

why does the UI force a choice when these variables are set?

Collaborator

That's a good question. It's been like that for as long as I can remember, and there's a big GitHub issue on that.

Collaborator

Ah I see enyst already linked it below.

-e LLM_BASE_URL="https://your-resource.openai.azure.com/" \
-e LLM_API_KEY="your-api-key" \
-e LLM_MODEL="azure/your-deployment-name" \
docker.all-hands.dev/all-hands-ai/openhands:latest
Collaborator

Same comment as above. LLM_BASE_URL, LLM_API_KEY, LLM_MODEL should not be needed.

Just replace docker.all-hands.dev/all-hands-ai/openhands:latest with ... to note that the rest of the docker command is the same as the original docker command to run OpenHands.
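
As a sketch of what that trimmed snippet could look like (with `...` standing in for the rest of the standard docker run command, and only the API version still passed via `-e`, since the model, base URL, and key come from the UI settings):

```bash
-e LLM_API_VERSION="2024-02-15-preview" \
...
```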

@pcuci

what should happen instead is the UI should not force you to re-enter the config if these are set

another bug?

Collaborator

Yes, that is this bug:

What happened here is that, a long time ago, the UI read these only from local storage, not the backend; we later refactored that to read and write via the web server, but it still doesn't pick them up from the application configuration.

I think it might be better to document running with the UI without these vars (since the UI overrides them), and only document these vars for CLI or headless runs, which actually read and honor them.
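
A sketch of what the CLI/headless-oriented docs could show instead (the `export` form is illustrative; the actual launch command stays whatever the CLI/headless docs already describe):

```bash
# CLI and headless runs read LLM settings from the environment; the UI does not.
export LLM_API_VERSION="2024-02-15-preview"
export LLM_BASE_URL="https://your-resource.openai.azure.com/"
export LLM_API_KEY="your-api-key"             # or AZURE_AD_TOKEN="..." for AD auth
export LLM_MODEL="azure/your-deployment-name"
# ...then start OpenHands in CLI or headless mode as usual.
```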

LLM_API_VERSION="<api-version>" # e.g. "2024-02-15-preview"
LLM_BASE_URL="<azure-endpoint>" # e.g. "https://your-resource.openai.azure.com/"
AZURE_AD_TOKEN="<your-ad-token>" # Your Azure AD access token
LLM_MODEL="azure/<deployment-name>" # e.g. "azure/gpt-4" where gpt-4 is your deployment name
Collaborator

Same as above. LLM_BASE_URL and LLM_MODEL are not needed since they will be set in the UI

-e LLM_BASE_URL="https://your-resource.openai.azure.com/" \
-e AZURE_AD_TOKEN="your-ad-token" \
-e LLM_MODEL="azure/your-deployment-name" \
docker.all-hands.dev/all-hands-ai/openhands:latest
Collaborator

Same comments as the above command
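
So, as a sketch, the AD token variant would keep only the variables the UI does not manage, again with `...` standing in for the rest of the standard docker run command:

```bash
-e LLM_API_VERSION="2024-02-15-preview" \
-e AZURE_AD_TOKEN="your-ad-token" \
...
```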

- `Base URL` to your Azure API Base URL (e.g. `https://example-endpoint.openai.azure.com`)
- `API Key` to your Azure API key
- `LLM Provider` to `Azure`
- `Advanced` options:
Collaborator

Suggested change
- `Advanced` options:


@pcuci pcuci left a comment


not clear how to disable the UI modal when the LLM_* variables are set

would that be another variable somewhere to force env var pickup?


@github-actions
Contributor

This PR is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@github-actions github-actions bot added the Stale (Inactive for 40 days) label Jun 16, 2025
# Required
LLM_API_VERSION="<api-version>" # e.g. "2024-02-15-preview"
LLM_BASE_URL="<azure-endpoint>" # e.g. "https://your-resource.openai.azure.com/"
AZURE_AD_TOKEN="<your-ad-token>" # Your Azure AD access token
Collaborator

Documentation for the AD token use is welcome! I have some doubt here, though: does this really work like this? I thought this token is refreshed periodically, in which case it may work when the user starts, but then stop working.

I think litellm has AD token support. I'm just not sure that it works for us this way. Please feel free to correct me if I'm wrong.
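
For context on the refresh concern (illustrative, not part of this PR): an Azure AD access token is typically obtained with the Azure CLI and is short-lived, usually on the order of an hour, so a value passed once via `-e AZURE_AD_TOKEN=...` would expire while the container keeps running unless something refreshes it:

```bash
# Typical way to obtain an AD access token for Azure OpenAI via the Azure CLI;
# the returned token expires after a limited time (roughly an hour by default).
az account get-access-token \
  --resource https://cognitiveservices.azure.com \
  --query accessToken -o tsv
```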

@github-actions github-actions bot removed the Stale (Inactive for 40 days) label Jun 17, 2025
@mamoodi
Collaborator

mamoodi commented Jul 18, 2025

Hi there. This PR has been open for a while without addressing any of the comments. I'm going to close it. But feel free to let me know if I'm mistaken! And feel free to open a new PR once ready! Thank you.

@mamoodi mamoodi closed this Jul 18, 2025


Development

Successfully merging this pull request may close these issues.

[Bug]: Documentation for Azure OpenAi is incomplete?
