
First class support for @azure/openai library #618

Closed
cahaseler opened this issue Oct 3, 2023 · 5 comments · Fixed by #912

Comments


cahaseler commented Oct 3, 2023

Feature Description

The openai-node docs specifically call out the Azure OpenAI API as having subtly different behavior compared to the OpenAI API, and recommend using the @azure/openai library instead. See: https://github.com/openai/openai-node#azure-openai

Azure OpenAI
An example of using this library with Azure OpenAI can be found here.

Please note there are subtle differences in API shape & behavior between the Azure OpenAI API and the OpenAI API, so using this library with Azure OpenAI may result in incorrect types, which can lead to bugs.

See @azure/openai for an Azure-specific SDK provided by Microsoft.

The @azure/openai SDK returns an AsyncIterable<ChatCompletions> object, which can't be used to create an OpenAIStream.

Here's how I use it:

const events = client.listChatCompletions(
      deploymentId,
      tunedMessagePrompt,
      { maxTokens: 128, stream: true, temperature: 0.3 }
    )
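For context, iterating that AsyncIterable looks roughly like this. This is a minimal sketch: `fakeEvents` is a hypothetical stand-in for the real client (not part of @azure/openai), using the `choices[].delta.content` shape the streamed ChatCompletions carry:

```typescript
// Sketch of consuming an AsyncIterable<ChatCompletions> like the one
// listChatCompletions returns with stream: true.
interface ChatCompletions {
  choices: { delta?: { content?: string } }[]
}

// Hypothetical stand-in for client.listChatCompletions(...) — illustrative only.
async function* fakeEvents(): AsyncIterable<ChatCompletions> {
  yield { choices: [{ delta: { content: 'Hello' } }] }
  yield { choices: [{ delta: { content: ', world' } }] }
}

// Concatenate every streamed delta into one string.
async function collectDeltas(
  events: AsyncIterable<ChatCompletions>
): Promise<string> {
  let text = ''
  for await (const event of events) {
    for (const choice of event.choices) {
      if (choice.delta?.content !== undefined) {
        text += choice.delta.content
      }
    }
  }
  return text
}
```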

Use Case

While technically you can use the openai package, it's not recommended, and given that the APIs are likely to continue to diverge, I'd rather not be troubleshooting the difficult kind of bugs this is likely to introduce.

Additional context

No response

@leolorenzoluis

@cahaseler Why is it not recommended? Under the hood, Azure just uses the OpenAI API. I do agree the divergence is annoying, and handling content filters with Vercel's AI SDK is not on par with Azure's SDK and is not currently supported.

@cahaseler (Author)

All I have is the quote above from the openai-node devs encouraging people to use the MS-provided one. They didn't go into exactly which bugs; I just tend to listen to package devs when they warn me I'm heading into "subtle type inconsistency" bug territory.

I'm using the node package without issue right now, but I won't say I'm not nervous about it.


Dagmawi-Beyene commented Oct 24, 2023

I am using this function instead of the available one.

export async function OpenAIStream(events: any) {
  const encoder = new TextEncoder()

  const stream = new ReadableStream({
    async start(controller) {
      // Re-emit each streamed delta as encoded text
      for await (const event of events) {
        for (const choice of event.choices) {
          const delta = choice.delta?.content
          if (delta !== undefined) {
            controller.enqueue(encoder.encode(delta))
          }
        }
      }
      controller.close()
    }
  })

  return stream
}
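A quick way to sanity-check a function like the one above is to build the stream from a mock event iterable and decode it back. In this sketch, `buildStream` mirrors the replacement above so the snippet runs standalone, and `fakeEvents` is an illustrative stand-in for the Azure SDK's event stream, not part of @azure/openai:

```typescript
// Illustrative stand-in for the Azure SDK's streamed events.
async function* fakeEvents(): AsyncIterable<{
  choices: { delta?: { content?: string } }[]
}> {
  yield { choices: [{ delta: { content: 'Hello' } }] }
  yield { choices: [{ delta: { content: ' Azure' } }] }
}

// Same logic as the OpenAIStream replacement above, repeated so this
// snippet is self-contained.
function buildStream(
  events: AsyncIterable<{ choices: { delta?: { content?: string } }[] }>
): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder()
  return new ReadableStream({
    async start(controller) {
      for await (const event of events) {
        for (const choice of event.choices) {
          const delta = choice.delta?.content
          if (delta !== undefined) {
            controller.enqueue(encoder.encode(delta))
          }
        }
      }
      controller.close()
    }
  })
}

// Drain the stream and decode it back into a string.
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const decoder = new TextDecoder()
  const reader = stream.getReader()
  let out = ''
  for (;;) {
    const { done, value } = await reader.read()
    if (done) break
    out += decoder.decode(value, { stream: true })
  }
  return out
}
```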


PeterAronZentai commented Dec 8, 2023

Request for the maintainers - please consider supporting Azure OpenAI as a first-class citizen. Anyone who wants to seriously use GPT-3 or -4 for production purposes does it in Azure, as it is far more reliable than openai.com, to the point that I would consider openai.com not fit for production purposes. Any time they announce a new feature, their system gets "DDoS"-ed :)


mharrvic commented Jan 4, 2024

Was able to make it work with the @azure/openai library.

Steps:

  1. Install @azure/core-auth and @azure/openai packages
  2. Add these environment variables: AZURE_OPEN_API_BASE, AZURE_OPENAI_API_KEY, AZURE_OPENAI_API_DEPLOYMENT_NAME
  3. Update the code (reference: https://github.com/vercel/ai-chatbot/blob/c368a0967cc210cb11c54f1d0ed4511456c0fa3a/app/api/chat/route.ts#L1-L65) to:
    import { OpenAIStream, StreamingTextResponse } from 'ai'
    import OpenAI from 'openai'
    import { auth } from '@/auth'
    import { AzureKeyCredential } from '@azure/core-auth'
    import { OpenAIClient } from '@azure/openai'
    import { Stream } from 'openai/streaming'
    
    export const runtime = 'edge'
    
    export async function POST(req: Request) {
      const endpoint = process.env.AZURE_OPEN_API_BASE || ''
      const credentials = new AzureKeyCredential(
        process.env.AZURE_OPENAI_API_KEY || ''
      )
      const openaiClient = new OpenAIClient(endpoint, credentials)
      const deploymentName = process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME || ''
    
      const json = await req.json()
      const { messages } = json
      const userId = (await auth())?.user.id
    
      if (!userId) {
        return new Response('Unauthorized', {
          status: 401
        })
      }
    
      const azure = await openaiClient.streamChatCompletions(
        deploymentName,
        messages
      )
    
      const stream = OpenAIStream(
        azure as unknown as Stream<OpenAI.Chat.Completions.ChatCompletionChunk>, {
          async onCompletion(completion) {
            // your code here
          }
        }
      )
    
      return new StreamingTextResponse(stream)
    }
    As you can see, we have to manually cast Azure's EventStream<ChatCompletions> to Stream<OpenAI.Chat.Completions.ChatCompletionChunk> to make it compatible with OpenAIStream (force casting; let me know how to improve this).
  4. Then everything should work as expected, per the reference https://github.com/vercel/ai-chatbot
