chore: add support for OPENAI_BASE_URL envar #717

Merged 1 commit on Apr 28, 2024
6 changes: 3 additions & 3 deletions site/docs/providers/openai.md
@@ -92,7 +92,7 @@ Supported parameters include:
| `apiKey` | Your OpenAI API key, equivalent to `OPENAI_API_KEY` environment variable |
| `apiKeyEnvar` | An environment variable that contains the API key |
| `apiHost` | The hostname of the OpenAI API, please also read `OPENAI_API_HOST` below. |
-| `apiBaseUrl` | The base URL of the OpenAI API, please also read `OPENAI_API_BASE_URL` below. |
+| `apiBaseUrl` | The base URL of the OpenAI API, please also read `OPENAI_BASE_URL` below. |
| `organization` | Your OpenAI organization key. |

Here are the type declarations of `config` parameters:
@@ -572,8 +572,8 @@ These OpenAI-related environment variables are supported:
| -------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------ |
| `OPENAI_TEMPERATURE` | Temperature model parameter, defaults to 0. |
| `OPENAI_MAX_TOKENS` | Max_tokens model parameter, defaults to 1024. |
-| `OPENAI_API_HOST` | The hostname to use (useful if you're using an API proxy). Takes priority over `OPENAI_API_BASE_URL`. |
-| `OPENAI_API_BASE_URL` | The base URL (protocol + hostname + port) to use, this is a more general option than `OPENAI_API_HOST`. |
+| `OPENAI_API_HOST` | The hostname to use (useful if you're using an API proxy). Takes priority over `OPENAI_BASE_URL`. |
+| `OPENAI_BASE_URL` | The base URL (protocol + hostname + port) to use, this is a more general option than `OPENAI_API_HOST`. |
| `OPENAI_API_KEY` | OpenAI API key. |
| `OPENAI_ORGANIZATION` | The OpenAI organization key to use. |
| `PROMPTFOO_REQUIRE_JSON_PROMPTS` | By default the chat completion provider will wrap non-JSON messages in a single user message. Setting this envar to true disables that behavior. |
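
To make the relationship between the two variables concrete, here is a minimal sketch (not promptfoo source; the `https` protocol and the default URL are assumptions) of the precedence described in the table, where a bare hostname in `OPENAI_API_HOST` wins over a full base URL in `OPENAI_BASE_URL`:

```typescript
// Illustrative only: OPENAI_API_HOST supplies just a hostname (handy behind a
// proxy) and takes priority over OPENAI_BASE_URL (protocol + hostname + port).
function resolveBaseUrlFromEnv(env: NodeJS.ProcessEnv = process.env): string {
  if (env.OPENAI_API_HOST) {
    return `https://${env.OPENAI_API_HOST}`; // protocol is assumed here
  }
  return env.OPENAI_BASE_URL || 'https://api.openai.com'; // default URL is an assumption
}
```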
2 changes: 1 addition & 1 deletion site/docs/providers/openllm.md
@@ -6,7 +6,7 @@ To use [OpenLLM](https://github.com/bentoml/OpenLLM) with promptfoo, we take adv

2. Set environment variables:

-- Set `OPENAI_API_BASE_URL` to `http://localhost:8001/v1`
+- Set `OPENAI_BASE_URL` to `http://localhost:8001/v1`
- Set `OPENAI_API_KEY` to a dummy value `foo`.

3. Depending on your use case, use the `chat` or `completion` model types.
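
A sketch of step 2 in Node.js terms (values come from the doc above; setting them in code rather than in the shell is purely for illustration):

```typescript
// Point the OpenAI-compatible provider at a local OpenLLM server.
process.env.OPENAI_BASE_URL ??= 'http://localhost:8001/v1';
process.env.OPENAI_API_KEY ??= 'foo'; // dummy value, per step 2 above
```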
2 changes: 1 addition & 1 deletion site/docs/providers/text-generation-webui.md
@@ -27,4 +27,4 @@ providers:
instruction_template: LLama-v2
```

-If desired, you can instead use the `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` environment variables instead of the `apiBaseUrl` and `apiKey` configs.
+If desired, you can instead use the `OPENAI_BASE_URL` and `OPENAI_API_KEY` environment variables instead of the `apiBaseUrl` and `apiKey` configs.
2 changes: 1 addition & 1 deletion site/docs/providers/togetherai.md
@@ -14,6 +14,6 @@ providers:
apiKeyEnvar: TOGETHER_API_KEY
```

-If desired, you can instead use the `OPENAI_API_BASE_URL` environment variables instead of the `apiBaseUrl` config property.
+If desired, you can instead use the `OPENAI_BASE_URL` environment variables instead of the `apiBaseUrl` config property.

In this example, you'd also have to set the `TOGETHER_API_KEY` environment variable (you can also enter it directly in the config using the `apiKey` property).
2 changes: 1 addition & 1 deletion site/docs/providers/vllm.md
@@ -13,4 +13,4 @@ providers:
apiBaseUrl: http://localhost:8080/v1
```

-If desired, you can instead use the `OPENAI_API_BASE_URL` environment variable instead of the `apiBaseUrl` config.
+If desired, you can instead use the `OPENAI_BASE_URL` environment variable instead of the `apiBaseUrl` config.
13 changes: 11 additions & 2 deletions src/providers/azureopenai.ts
@@ -89,7 +89,11 @@ class AzureOpenAiGenericProvider implements ApiProvider {
this.apiHost =
config?.apiHost || env?.AZURE_OPENAI_API_HOST || process.env.AZURE_OPENAI_API_HOST;
this.apiBaseUrl =
-config?.apiBaseUrl || env?.AZURE_OPENAI_API_BASE_URL || process.env.AZURE_OPENAI_API_BASE_URL;
+config?.apiBaseUrl ||
+env?.AZURE_OPENAI_API_BASE_URL ||
+env?.AZURE_OPENAI_BASE_URL ||
+process.env.AZURE_OPENAI_API_BASE_URL ||
+process.env.AZURE_OPENAI_BASE_URL;

this.config = config || {};
this.id = id ? () => id : this.id;
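
The fallback chain added above can be read as a standalone rule: an explicit `apiBaseUrl` in the config wins, then either spelling of the Azure base-URL variable, with per-call env overrides checked before `process.env`. A restatement as a helper (illustrative only; the helper name and signature are invented, not part of this PR):

```typescript
// Restates the lookup order from the constructor above.
function resolveAzureBaseUrl(
  configBaseUrl?: string,
  env: Record<string, string | undefined> = {},
): string | undefined {
  return (
    configBaseUrl ||
    env.AZURE_OPENAI_API_BASE_URL ||
    env.AZURE_OPENAI_BASE_URL ||
    process.env.AZURE_OPENAI_API_BASE_URL ||
    process.env.AZURE_OPENAI_BASE_URL
  );
}
```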
@@ -412,7 +416,12 @@ export class AzureOpenAiChatCompletionProvider extends AzureOpenAiGenericProvider
choice.message.role === 'assistant',
)?.message
: data.choices[0].message;
-const output = message.content == null ? message.tool_calls == null ? message.function_call : message.tool_calls : message.content;
+const output =
+  message.content == null
+    ? message.tool_calls == null
+      ? message.function_call
+      : message.tool_calls
+    : message.content;
const logProbs = data.choices[0].logprobs?.content?.map(
(logProbObj: { token: string; logprob: number }) => logProbObj.logprob,
);
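
The reformatted ternary selects the assistant output in the order `content`, then `tool_calls`, then `function_call`. An equivalent if/else sketch, for readability only (the message shape is a minimal stand-in, not promptfoo's types):

```typescript
// Same selection logic as the ternary above, spelled out.
type ChatMessage = {
  content?: string | null;
  tool_calls?: unknown[] | null;
  function_call?: unknown | null;
};

function extractOutput(message: ChatMessage): unknown {
  if (message.content != null) {
    return message.content; // plain text reply
  }
  if (message.tool_calls != null) {
    return message.tool_calls; // tool-call array
  }
  return message.function_call; // legacy single function call
}
```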
2 changes: 2 additions & 0 deletions src/providers/openai.ts
@@ -110,7 +110,9 @@ export class OpenAiGenericProvider implements ApiProvider {
return (
this.config.apiBaseUrl ||
this.env?.OPENAI_API_BASE_URL ||
+this.env?.OPENAI_BASE_URL ||
process.env.OPENAI_API_BASE_URL ||
+process.env.OPENAI_BASE_URL ||
this.getApiUrlDefault()
);
}
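
With the two added lines, either variable name now reaches the OpenAI provider, and the legacy `OPENAI_API_BASE_URL` still takes precedence when both are set. A quick runnable illustration of that precedence (values invented):

```typescript
// The legacy name is checked first, so it wins when both are present.
process.env.OPENAI_API_BASE_URL = 'https://legacy.example.com/v1';
process.env.OPENAI_BASE_URL = 'https://new.example.com/v1';

const resolved = process.env.OPENAI_API_BASE_URL || process.env.OPENAI_BASE_URL;
console.log(resolved); // https://legacy.example.com/v1
```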
2 changes: 2 additions & 0 deletions src/types.ts
@@ -41,11 +41,13 @@ export interface EnvOverrides {
AZURE_OPENAI_API_HOST?: string;
AZURE_OPENAI_API_KEY?: string;
AZURE_OPENAI_API_BASE_URL?: string;
+AZURE_OPENAI_BASE_URL?: string;
AWS_BEDROCK_REGION?: string;
COHERE_API_KEY?: string;
OPENAI_API_KEY?: string;
OPENAI_API_HOST?: string;
OPENAI_API_BASE_URL?: string;
+OPENAI_BASE_URL?: string;
OPENAI_ORGANIZATION?: string;
REPLICATE_API_KEY?: string;
REPLICATE_API_TOKEN?: string;
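
These keys also make the new variable usable through promptfoo's `EnvOverrides` object rather than only the process environment. A hypothetical override object (values invented; how it is passed to a provider is outside this diff):

```typescript
import type { EnvOverrides } from './types'; // import path assumed from this diff

const envOverrides: EnvOverrides = {
  OPENAI_BASE_URL: 'http://localhost:8080/v1',
  OPENAI_API_KEY: 'sk-local-dummy',
  AZURE_OPENAI_BASE_URL: 'https://my-resource.openai.azure.com',
};
```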