FIX (openai): respect custom API endpoint when creating AsyncOpenAI clients #192


Open · wants to merge 7 commits into main

Conversation

@toloco toloco commented Jun 10, 2025

Respect user-defined OpenAI API endpoints for both Embedding and LLM providers.

The config files define the endpoints here:

Calls now use the endpoint configured in CONFIG (or OPENAI_ENDPOINT) instead of the hard-coded https://api.openai.com.

  • NLWeb already exposes an endpoint field for embedding_provider("openai") and llm_endpoints["openai"], but the value was never forwarded to the AsyncOpenAI client.
  • Users relying on Azure OpenAI, local proxies, or gateway URLs had to monkey-patch client creation.
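The fix can be sketched as a config-first endpoint lookup whose result is forwarded to the client. This is an illustrative sketch, not the PR's actual code; `get_api_endpoint` and `provider_config` follow the names used in the review, but the exact shape here is an assumption:

```python
import os

def get_api_endpoint(provider_config=None):
    """Resolve the OpenAI API endpoint: configuration first, then the
    OPENAI_ENDPOINT environment variable, instead of a hard-coded URL."""
    endpoint = getattr(provider_config, "endpoint", None) if provider_config else None
    if not endpoint:
        # Fallback to environment variable only when the config has no value
        endpoint = os.getenv("OPENAI_ENDPOINT")
    if not endpoint:
        raise ValueError("OpenAI API endpoint not found in configuration or environment")
    return endpoint

# The resolved value would then be forwarded to the client, e.g.:
# client = AsyncOpenAI(api_key=key, base_url=get_api_endpoint(cfg))
# (the openai Python client accepts base_url for Azure OpenAI, proxies, or gateways)
```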


toloco commented Jun 10, 2025

@toloco please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.

@microsoft-github-policy-service agree [company="{your company}"]

Options:

  • (default - no company specified) I have sole ownership of intellectual property rights to my Submissions and I am not making Submissions in the course of work for my employer.
    @microsoft-github-policy-service agree
  • (when company given) I am making Submissions in the course of work for my employer (or my employer has intellectual property rights in my Submissions by contract or applicable law). I have permission from my employer to make Submissions and enter into this Agreement on behalf of my employer. By signing below, the defined term “You” includes me and my employer.
    @microsoft-github-policy-service agree company="Microsoft"

Contributor License Agreement

@microsoft-github-policy-service agree

@linjieli222 linjieli222 self-assigned this Jun 17, 2025

@Copilot Copilot AI left a comment


Pull Request Overview

This PR ensures that custom OpenAI API endpoints from configuration or environment variables are properly utilized instead of a hard-coded endpoint.

  • Added a new method in the llm module to retrieve the API endpoint.
  • Updated the AsyncOpenAI client initialization in both llm and embedding modules to use the custom endpoint.
  • Adjusted the environment template to align with the expected API endpoint format.

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.

| File | Description |
| --- | --- |
| code/llm/openai.py | Added get_api_endpoint and updated client initialization to respect custom endpoints. |
| code/embedding/openai_embedding.py | Added get_openai_api_endpoint and updated client initialization accordingly. |
| code/.env.template | Updated the OPENAI_ENDPOINT value to a more general base URL. |
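The .env.template change implies an entry along these lines; the exact value is hypothetical, since the template file itself is not shown in this excerpt:

```ini
# Base URL of the OpenAI-compatible API (hypothetical example value)
OPENAI_ENDPOINT="https://api.openai.com/v1"
```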
Comments suppressed due to low confidence (1)

code/embedding/openai_embedding.py:49

  • The docstring incorrectly mentions 'API key' instead of 'API endpoint'. Update the docstring to accurately reflect that this function retrieves the API endpoint.
    Retrieve the OpenAI API key from configuration.

Comment on lines +66 to +71

```python
# Fallback to environment variable
endpoint = os.getenv("OPENAI_ENDPOINT")
if not endpoint:
    error_msg = "OpenAI API endpoint not found in configuration or environment"
    logger.error(error_msg)
    raise ValueError(error_msg)
```

Copilot AI Jun 17, 2025


The configured endpoint from provider_config is overwritten unconditionally by the environment variable. Consider checking whether provider_config.endpoint is set before falling back to os.getenv.

Suggested change

```diff
-# Fallback to environment variable
-endpoint = os.getenv("OPENAI_ENDPOINT")
-if not endpoint:
-    error_msg = "OpenAI API endpoint not found in configuration or environment"
-    logger.error(error_msg)
-    raise ValueError(error_msg)
+# Fallback to environment variable if provider_config.endpoint is not set
+if not endpoint:
+    endpoint = os.getenv("OPENAI_ENDPOINT")
+if not endpoint:
+    error_msg = "OpenAI API endpoint not found in configuration or environment"
+    logger.error(error_msg)
+    raise ValueError(error_msg)
```
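The two orderings only differ when both a configured endpoint and the environment variable are present. A minimal, self-contained sketch of that difference (function names are illustrative, not from the PR):

```python
import os

def resolve_overwriting(config_endpoint):
    # Mirrors the reviewed code: the env var replaces any configured value.
    endpoint = config_endpoint
    endpoint = os.getenv("OPENAI_ENDPOINT")  # unconditional overwrite
    if not endpoint:
        raise ValueError("OpenAI API endpoint not found in configuration or environment")
    return endpoint

def resolve_with_fallback(config_endpoint):
    # Mirrors the suggested change: the env var is only a fallback.
    endpoint = config_endpoint
    if not endpoint:
        endpoint = os.getenv("OPENAI_ENDPOINT")
    if not endpoint:
        raise ValueError("OpenAI API endpoint not found in configuration or environment")
    return endpoint

os.environ["OPENAI_ENDPOINT"] = "https://proxy.example/v1"
# With both values present, only the second variant honors the configuration.
```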

