Azure OpenAI provider #1675
Comments
I am also having trouble switching to the ai/rsc SDK (specifically, was: … now: …)
Azure OpenAI is not fully compatible with OpenAI and will need a separate provider.
I was looking into this a little bit yesterday. I set the base URL to my Azure deployment, added the …
Looked into this some more, and there is some funkiness in the Azure API. Sending a single message with the … But once you add multiple messages, you get a misleading 500 error. I was able to get chat streaming working by modifying the OpenAI provider to accept some Azure-specific values and format user messages as …
I spent hours trying to get it to work with Azure OpenAI, but unfortunately, I couldn't make it happen. I'll just have to wait for the official Azure OpenAI provider now.
I'm hoping to get something up later this week, and will update this thread when I do. Pretty sure I have everything working for text inputs with … The other change I've needed so far is adjusting the …
we also need this because our company is Azure only 🥲
I'll have this up today for chat models and text input. Just trying to figure out some GitHub Actions stuff.
https://github.com/patrickrmoore/azure-openai-provider (this has had minimal testing and is very much use-at-your-own-risk at the moment)
@patrick-moore heads up that we will probably integrate azure-openai as first party in the repo
Awesome, I think that makes the most sense. The changes were pretty minimal and my current package is not set up to make integrating upstream changes easy. Adding a few extension points to the existing OpenAI provider would solve that. I'll hold off on putting in a PR to list this as a community provider. Let me know if y'all would like any help on this!
@patrick-moore thanks so much for your work! Unblocked me today.
@lgrammel is there a timeline for the azure-openai first-party provider? Thanks
+1 on the timeline! @lgrammel :) Goes without saying, but stellar work on this SDK!!
I'm blocked (I do not have Azure OpenAI access), so I cannot provide a timeline.
I tried it immediately, but it seems there's a problem.
I've already upgraded to https://github.com/vercel/ai/releases/tag/%40ai-sdk/azure%400.0.2
New error: …
@wong2 what inputs / model are you using?
@lgrammel GPT-4o, the input is just …
@wong2 can you give https://github.com/vercel/ai/releases/tag/%40ai-sdk/azure%400.0.3 a try when you get a chance? I was not able to reproduce the issue, but this should add more robustness to the stream chunk validation.
It works now. Thanks. |
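For context on what "more robustness to the stream chunk validation" could look like, here is a minimal hypothetical sketch (not the SDK's actual implementation): malformed chunks are skipped instead of aborting the whole stream. The `Chunk` shape and `extractDeltas` helper are assumptions modeled on OpenAI-style streaming deltas.

```typescript
// Hypothetical sketch of tolerant stream-chunk validation: chunks that
// fail to parse or lack the expected shape are skipped, not thrown on.
type Chunk = { choices?: { delta?: { content?: string } }[] };

function extractDeltas(rawChunks: string[]): string[] {
  const out: string[] = [];
  for (const raw of rawChunks) {
    try {
      const parsed = JSON.parse(raw) as Chunk;
      const text = parsed.choices?.[0]?.delta?.content;
      if (typeof text === 'string') out.push(text);
    } catch {
      // Malformed JSON: skip this chunk rather than abort the stream.
    }
  }
  return out;
}

// Example: one valid chunk, one garbage chunk.
const deltas = extractDeltas([
  '{"choices":[{"delta":{"content":"Hello"}}]}',
  'not json at all',
]);
// deltas → ["Hello"]
```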
@lgrammel awesome, giving this a spin now for:

```tsx
import { streamUI } from 'ai/rsc';
import { createAzure } from '@ai-sdk/azure';

const azure = createAzure({
  apiKey: process.env.AZURE_OPEN_AI_KEY,
  resourceName: process.env.AZURE_OPEN_AI_DEPLOYMENT
})(process.env.AZURE_OPEN_AI_API_VERSION)

async function submitUserMessage(content: string) {
  'use server'
  const ui = await streamUI({
    model: azure,
    initial: <SpinnerMessage />,
    ...
  })
  ...
}
```

It looks like the interface here is only … But I need to also set a …

Example:

```ts
// ?
createAzure({
  apiKey: process.env.AZURE_OPEN_AI_KEY,
  resourceName: process.env.AZURE_OPEN_AI_DEPLOYMENT
})(process.env.AZURE_OPEN_AI_API_VERSION)

// working
createOpenAI({
  apiKey: process.env.OPEN_AI_KEY,
  baseURL: process.env.OPEN_AI_BASE_URL,
  organization: 'org-xxxxx',
  headers: { ... }
})('gpt-3.5-turbo')
```

Let me know how I can use this!
You can use:

```ts
import { createAzure } from '@ai-sdk/azure';

const azure = createAzure({
  resourceName: 'your-resource-name', // Azure resource name
  apiKey: 'your-api-key',
});
```

and then create a model:

```ts
const model = azure('your-deployment-name');
```

Here are the docs: https://sdk.vercel.ai/providers/ai-sdk-providers/azure

And if it helps, here is how the URL is assembled internally: https://github.com/vercel/ai/blob/main/packages/azure/src/azure-openai-provider.ts#L64
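As a rough sketch of the URL assembly in that linked source (an assumption based on Azure OpenAI REST conventions, not copied from the SDK; the `assembleAzureURL` helper and its arguments are hypothetical):

```typescript
// Hypothetical reconstruction of the Azure OpenAI endpoint URL layout:
// https://{resource}.openai.azure.com/openai/deployments/{deployment}{path}?api-version={version}
// The api-version value below is a placeholder, not the SDK's actual default.
function assembleAzureURL(
  resourceName: string,
  deploymentId: string,
  path: string,
  apiVersion: string,
): string {
  return (
    `https://${resourceName}.openai.azure.com/openai/deployments/` +
    `${deploymentId}${path}?api-version=${apiVersion}`
  );
}

const url = assembleAzureURL(
  'your-resource-name',
  'your-deployment-name',
  '/chat/completions',
  '2024-02-01',
);
// → https://your-resource-name.openai.azure.com/openai/deployments/your-deployment-name/chat/completions?api-version=2024-02-01
```

This is why `createAzure` only needs the resource name and API key: the deployment name you pass when creating the model fills the deployment segment of the path.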
@lgrammel I got it, yea that looks right and populates the endpoint in https://github.com/vercel/ai/blob/main/packages/azure/src/azure-openai-provider.ts#L64.
Works great!
I'm noticing that streaming with Azure is clunkier than with OpenAI, but maybe this is just an Azure problem 🤔
@miguelvictor yes, that's a known Azure issue
I'm also noticing that the same tool is being called twice (at the same time, resulting in …)
@miguelvictor do you have a test case that I could use to reproduce this?
Sorry, it also happened with the openai provider. Is it possible to only process one tool per streamUI call?
@miguelvictor not yet, but I plan to implement #1893, which should help
Cool! Thank you so much!
@miguelvictor it is automatically released when the openai package changes: https://github.com/vercel/ai/releases/tag/%40ai-sdk%2Fazure%400.0.4 (should include that setting)
Sorry, Bun, for some reason, didn't pick up the semver update 🥲 I'm using it like this:

```ts
import { createAzure } from "@ai-sdk/azure"

export const azure = createAzure({
  resourceName: env.AZURE_OPENAI_RESOURCE_NAME,
  apiKey: env.AZURE_OPENAI_KEY,
})(env.AZURE_OPENAI_DEPLOYMENT_CHAT, { parallelToolCalls: false })
```
@miguelvictor That means the Azure backend does not support it yet, I think.
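For illustration, a camelCase SDK setting like `parallelToolCalls` typically maps to the snake_case `parallel_tool_calls` field of the OpenAI-compatible request body. The sketch below is an assumption about that mapping (the `ChatSettings` type and `toRequestBody` helper are hypothetical, not the provider's actual code):

```typescript
// Hypothetical mapping from camelCase SDK settings to the snake_case
// OpenAI-compatible request body; unset options are omitted entirely,
// leaving the backend's own default in effect.
interface ChatSettings {
  parallelToolCalls?: boolean;
}

function toRequestBody(deployment: string, settings: ChatSettings) {
  return {
    model: deployment,
    ...(settings.parallelToolCalls !== undefined && {
      parallel_tool_calls: settings.parallelToolCalls,
    }),
  };
}

const body = toRequestBody('my-chat-deployment', { parallelToolCalls: false });
// body.parallel_tool_calls → false
```

If the Azure deployment's API version predates tool-call support, the backend may ignore or reject the field regardless of what the client sends, which would match the behavior described above.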
Feature Description
I just saw the power of the new ai/rsc SDK and am quite impressed with it. However, Azure OpenAI is not supported, and there is no official documentation for integrating it. Support would be a huge help, since with Azure OpenAI users have more control over their data.
Use Case
Implementing a RAGBot with Azure OpenAI.
Additional context
No response