
feat(provider/openai): allow configuring base URL via OPENAI_BASE_URL#8990

Merged
gr2m merged 6 commits into vercel:main from Diluka:main
Sep 30, 2025

Conversation

@Diluka
Contributor

@Diluka Diluka commented Sep 29, 2025

Background

Some deployments need to target an OpenAI-compatible endpoint (e.g. self-hosted proxy, regional gateway). The OpenAI provider currently hardcodes the API base URL, which makes this inconvenient.

Summary

  • Add support for configuring the API base URL via the OPENAI_BASE_URL environment variable and the baseURL option on the provider.
  • Continue to default to https://api.openai.com/v1 when not provided.
  • Applies consistently across chat, completion, responses, embeddings, image, transcription, and speech models.
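The precedence described above can be sketched as a small helper. This is illustrative only: the function name `resolveBaseURL` and the trailing-slash trimming are assumptions, not the provider's actual internals:

```typescript
// Hypothetical sketch of the resolution order: explicit `baseURL`
// option > OPENAI_BASE_URL environment variable > default endpoint.
function resolveBaseURL(
  options: { baseURL?: string },
  env: Record<string, string | undefined>,
): string {
  const raw = options.baseURL ?? env["OPENAI_BASE_URL"];
  // Trim trailing slashes so later path joining stays predictable
  // (assumed behavior, mirroring common URL-normalization helpers).
  const trimmed = raw === undefined ? undefined : raw.replace(/\/+$/, "");
  return trimmed ?? "https://api.openai.com/v1";
}
```

With no option and no env var set, this yields the default `https://api.openai.com/v1`, matching the second bullet above.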

Manual Verification

  • Set OPENAI_BASE_URL to a custom endpoint (e.g. https://api.openai-proxy.example/v1).
  • Initialize via createOpenAI() and call openai.responses(/* modelId */); requests target the configured host.
  • Unset the env var and omit baseURL; requests fall back to https://api.openai.com/v1.
  • Also verified that an explicit baseURL option takes precedence over the env var.
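The manual checks above can be condensed into a short script. Here `env` is a plain record standing in for `process.env`, and `resolve` is a hypothetical stand-in for the provider's internal lookup, not an AI SDK API:

```typescript
// Simulated environment (stands in for process.env).
const env: Record<string, string | undefined> = {};

// Hypothetical stand-in for the provider's base-URL lookup.
const resolve = (baseURL?: string): string =>
  baseURL ?? env["OPENAI_BASE_URL"] ?? "https://api.openai.com/v1";

// 1. Env var set, no explicit option: requests target the configured host.
env["OPENAI_BASE_URL"] = "https://api.openai-proxy.example/v1";
console.log(resolve()); // https://api.openai-proxy.example/v1

// 2. Env var unset and option omitted: falls back to the default.
delete env["OPENAI_BASE_URL"];
console.log(resolve()); // https://api.openai.com/v1

// 3. An explicit baseURL option takes precedence over the env var.
env["OPENAI_BASE_URL"] = "https://gateway.example/v1";
console.log(resolve("https://explicit.example/v1")); // https://explicit.example/v1
```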

Checklist

  • Tests have been added / updated (for bug fixes / features)
  • Documentation has been added / updated (for bug fixes / features)
  • A patch changeset for relevant packages has been added (for bug fixes / features - run pnpm changeset in the project root)
  • Formatting issues have been fixed (run pnpm prettier-fix in the project root)
  • I have reviewed this pull request (self-review)

Future Work

  • Consider documenting OPENAI_BASE_URL alongside other provider configuration in the docs.

Related Issues

Fixes #8564

@gr2m
Collaborator

gr2m commented Sep 29, 2025

I confirmed that the official OpenAI SDK supports OPENAI_BASE_URL
https://github.com/search?q=repo%3Aopenai%2Fopenai-node%20OPENAI_BASE_URL&type=code

But it's not documented on their website:
https://www.google.com/search?q=site%3Ahttps%3A%2F%2Fplatform.openai.com%2Fdocs+%22OPENAI_BASE_URL%22

We will add support for it but not document it for now, following OpenAI's example.

@gr2m gr2m added the backport label (Admins only: add this label to a pull request in order to backport it to the prior version) Sep 29, 2025
Collaborator

@gr2m gr2m left a comment


Can you please follow the checklist of the pull request template? We can move ahead with the change.

@gr2m gr2m added the feature (New feature or request) and provider/openai (Issues related to the @ai-sdk/openai provider) labels Sep 29, 2025
@Diluka Diluka requested a review from gr2m September 30, 2025 01:38
Collaborator

@gr2m gr2m left a comment


Thanks @Diluka, congratulations for your first contribution to the AI SDK 💐

@gr2m gr2m merged commit 9a51b92 into vercel:main Sep 30, 2025
10 of 11 checks passed
vercel-ai-sdk bot pushed a commit that referenced this pull request Sep 30, 2025
…#8990)

@vercel-ai-sdk
Contributor

vercel-ai-sdk bot commented Sep 30, 2025

✅ Backport PR created: #9052

@vercel-ai-sdk vercel-ai-sdk bot removed the backport label (Admins only: add this label to a pull request in order to backport it to the prior version) Sep 30, 2025
gr2m added a commit that referenced this pull request Sep 30, 2025
…I_BASE_URL (#9052)

This is an automated backport of #8990 to the release-v5.0 branch.

Co-authored-by: Diluka <pmars42@gmail.com>
Co-authored-by: Gregor Martynus <39992+gr2m@users.noreply.github.com>
gr2m pushed a commit that referenced this pull request Oct 21, 2025
vercel-ai-sdk bot pushed a commit that referenced this pull request Oct 21, 2025
@gr2m gr2m added the ai/provider label (related to a provider package. Must be assigned together with at least one `provider/*` label) Oct 28, 2025

Labels

  • ai/provider: related to a provider package. Must be assigned together with at least one `provider/*` label
  • feature: New feature or request
  • provider/openai: Issues related to the @ai-sdk/openai provider

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Support for OPENAI_BASE_URL

2 participants