
When I set the value of AZURE_OPENAI_BASE_PATH, a 404 error is reported. #3573

Closed
lileltp opened this issue Dec 7, 2023 · 3 comments
Labels
auto:bug (Related to a bug, vulnerability, unexpected error with an existing feature)

Comments


lileltp commented Dec 7, 2023

Issue Description

I encountered the following issue when using the AZURE_OPENAI_BASE_PATH parameter in langchain.js:
[screenshot of the 404 error attached]

Steps to Reproduce

azureOpenAIBasePath: "https://oai.hconeai.com",

Attachments

[llm/error] [1:llm:ChatOpenAI] [1.66s] LLM run errored with error: "404 Resource not found"

/Users/henry/Projects/sxrhea/new-lerna-workspace/node_modules/openai/src/error.ts:78
return new NotFoundError(status, error, message, headers);

Thank you for your assistance!


dosubot bot commented Dec 7, 2023

🤖

Hello @lileltp,

I'm Dosu, a friendly bot here to assist you with your bugs, answer questions, and guide you in becoming a contributor to this project while we wait for a human maintainer. Nice to meet you! Please feel free to reach out if you need any help.

Based on the information you've provided, it seems like the 404 error you're encountering is likely due to the URL you're using for the AZURE_OPENAI_BASE_PATH parameter not being correct or the endpoint not being available.

In the langchainjs framework, the AZURE_OPENAI_BASE_PATH is used to construct the endpoint for the OpenAI API. If the base path is incorrect or the endpoint is not available, you will receive a 404 error.

The getEndpoint function in the openai.ts file is responsible for constructing the endpoint:

const openAIEndpointConfig: OpenAIEndpointConfig = {
  azureOpenAIApiDeploymentName: this.azureOpenAIApiDeploymentName,
  azureOpenAIApiInstanceName: this.azureOpenAIApiInstanceName,
  azureOpenAIApiKey: this.azureOpenAIApiKey,
  azureOpenAIBasePath: this.azureOpenAIBasePath,
  baseURL: this.clientConfig.baseURL,
};

const endpoint = getEndpoint(openAIEndpointConfig);

Please ensure that the URL "https://oai.hconeai.com" is correct and the endpoint is available. If the issue persists, you may need to check your Azure OpenAI API instance name and deployment name, as these are also used to construct the endpoint. You can find more details in the openai.ts file.
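To make that concrete, here is a minimal sketch of the construction (an assumption based on the snippet above and on how langchainjs joins these values; the real getEndpoint in openai.ts may differ between versions, and getEndpointSketch is a hypothetical name):

// Hypothetical sketch, not the actual langchainjs source.
// When a base path is set, it is simply joined with the deployment name;
// otherwise the standard Azure URL is assembled from the instance name.
function getEndpointSketch(config: {
  azureOpenAIBasePath?: string;
  azureOpenAIApiInstanceName?: string;
  azureOpenAIApiDeploymentName?: string;
  baseURL?: string;
}): string | undefined {
  const {
    azureOpenAIBasePath,
    azureOpenAIApiInstanceName,
    azureOpenAIApiDeploymentName,
    baseURL,
  } = config;
  if (azureOpenAIBasePath && azureOpenAIApiDeploymentName) {
    // e.g. "https://oai.hconeai.com" + "/" + "openai/deployments/gpt-35-turbo"
    return `${azureOpenAIBasePath}/${azureOpenAIApiDeploymentName}`;
  }
  if (azureOpenAIApiInstanceName && azureOpenAIApiDeploymentName) {
    return `https://${azureOpenAIApiInstanceName}.openai.azure.com/openai/deployments/${azureOpenAIApiDeploymentName}`;
  }
  return baseURL;
}

Under that assumption, whatever is passed as azureOpenAIApiDeploymentName is appended verbatim to azureOpenAIBasePath, so the deployment name has to carry the openai/deployments/ prefix for the final URL to hit a valid Azure route.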

Additionally, I found a similar solved issue in the langchainjs repository: "Feature Request: Add ability to set openAIApiBase on the OpenAI call". The suggested solution is to point the JavaScript version of LangChain at a private OpenAI-compatible server as follows:

import { OpenAI } from "langchain/llms/openai";

const model = new OpenAI(
  {
    temperature: 0,
    openAIApiKey: "EMPTY",
    modelName: "EMPTY",
  },
  {
    basePath: "http://localhost:8000/v1",
  }
);

I hope this helps! If you have any other questions or need further clarification, feel free to ask.

Best,
Dosu


dosubot added the auto:bug label on Dec 7, 2023

lileltp commented Dec 7, 2023

I am using the example provided in the documentation at https://docs.helicone.ai/getting-started/integration-method/azure

const model = new ChatOpenAI({
  azureOpenAIApiKey: "[AZURE_OPENAI_API_KEY]",
  azureOpenAIApiDeploymentName: "openai/deployments/gpt-35-turbo",
  azureOpenAIApiVersion: "2023-03-15-preview",
  azureOpenAIBasePath: "https://oai.hconeai.com",
  configuration: {
    organization: "[organization]",
    baseOptions: {
      headers: {
        "Helicone-Auth": `Bearer ${heliconeApiKey}`,
        "Helicone-OpenAI-Api-Base":
          "https://[YOUR_AZURE_DOMAIN].openai.azure.com",
      },
    },
  },
});

If I don't set azureOpenAIBasePath, the program runs correctly, but no request logs are sent to Helicone.
When I do configure azureOpenAIBasePath, the call fails with a 404 error, even though the requests do show up in Helicone. Setting the azureOpenAIBasePath value alone is enough to break the program.


h4r5h4 commented Dec 9, 2023

Your screenshot indicates you are passing flamingo-ai as azureOpenAIApiDeploymentName.
Make sure it's openai/deployments/flamingo-ai.
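
For example, a sketch adapted from the config posted above (assuming the same heliconeApiKey variable and bracketed placeholders as in that snippet, and the import path used by the langchain version in this thread; illustrative only, not verified against Helicone):

import { ChatOpenAI } from "langchain/chat_models/openai";

const model = new ChatOpenAI({
  azureOpenAIApiKey: "[AZURE_OPENAI_API_KEY]",
  // include the "openai/deployments/" prefix so the base path resolves to a valid Azure route
  azureOpenAIApiDeploymentName: "openai/deployments/flamingo-ai",
  azureOpenAIApiVersion: "2023-03-15-preview",
  azureOpenAIBasePath: "https://oai.hconeai.com",
  configuration: {
    baseOptions: {
      headers: {
        "Helicone-Auth": `Bearer ${heliconeApiKey}`,
        "Helicone-OpenAI-Api-Base": "https://[YOUR_AZURE_DOMAIN].openai.azure.com",
      },
    },
  },
});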
