
Commit

streams/openai-stream: allow returning strings in function callback (#…
MaxLeiter committed Jul 13, 2023
1 parent 8ca6e39 commit e361114
Showing 4 changed files with 12 additions and 3 deletions.
5 changes: 5 additions & 0 deletions .changeset/sharp-hairs-itch.md
@@ -0,0 +1,5 @@
+---
+'ai': patch
+---
+
+OpenAI functions: allow returning string in callback
1 change: 1 addition & 0 deletions docs/pages/docs/guides/functions.mdx
@@ -66,6 +66,7 @@ You can then choose how you want to handle each function call: on the server or

 On the server, you can pass an `experimental_onFunctionCall` callback to the `OpenAIStream`, which will be called when the model calls a function.
 In order to support recursively calling functions and to construct the message context in the nested OpenAI calls, you can use `createFunctionCallMessages` to get the "assistant" and "function" messages.
+You can also return a string which will be sent to the client as the "assistant" message (or returned back to the model as a response to a recursive function call).
 
 ```ts {2,3,4,15,17,21,22}
 const stream = OpenAIStream(response, {
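The added docs line describes the new string-return path. A minimal sketch of such a callback, under stated assumptions: `get_weather` and the `getWeather` helper are hypothetical, and the payload shape mirrors `FunctionCallPayload` from `ai-stream.ts`:

```typescript
// Shape mirrors FunctionCallPayload in packages/core/streams/ai-stream.ts.
interface FunctionCallPayload {
  name: string
  arguments: Record<string, unknown>
}

// Hypothetical helper -- a real implementation would call a weather API.
async function getWeather(city: string): Promise<string> {
  return `Sunny in ${city}`
}

// A handler suitable for experimental_onFunctionCall: returning a string
// sends it to the client as the "assistant" message; returning undefined
// leaves the raw function call to be forwarded as before.
async function onFunctionCall(
  payload: FunctionCallPayload
): Promise<string | undefined> {
  if (payload.name === 'get_weather') {
    return getWeather(String(payload.arguments.city))
  }
  return undefined
}
```

Passed as `experimental_onFunctionCall` in the `OpenAIStream` options, a handler like this short-circuits the recursive `createChatCompletion` call shown in the docs example.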
1 change: 0 additions & 1 deletion packages/core/streams/ai-stream.ts
@@ -4,7 +4,6 @@ import {
   type ParsedEvent,
   type ReconnectInterval
 } from 'eventsource-parser'
-import { CreateMessage } from '../shared/types'
 
 export interface FunctionCallPayload {
   name: string
8 changes: 6 additions & 2 deletions packages/core/streams/openai-stream.ts
@@ -31,7 +31,7 @@ export type OpenAIStreamCallbacks = AIStreamCallbacks & {
  * // ... run your custom logic here
  * const result = await myFunction(functionCallPayload)
  *
- * // Ask for another completion
+ * // Ask for another completion, or return a string to send to the client as an assistant message.
  * return await openai.createChatCompletion({
  *   model: 'gpt-3.5-turbo-0613',
  *   stream: true,
@@ -48,7 +48,7 @@ export type OpenAIStreamCallbacks = AIStreamCallbacks & {
     createFunctionCallMessages: (
       functionCallResult: JSONValue
     ) => CreateMessage[]
-  ) => Promise<Response | undefined>
+  ) => Promise<Response | undefined | void | string>
 }

/**
@@ -270,6 +270,10 @@ function createFunctionCallTransformer(
         // so we just return the function call as a message
         controller.enqueue(textEncoder.encode(aggregatedResponse))
         return
+      } else if (typeof functionResponse === 'string') {
+        // The user returned a string, so we just return it as a message
+        controller.enqueue(textEncoder.encode(functionResponse))
+        return
       }
 
       // Recursively
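The new branch above can be isolated into a standalone sketch (`handleStringResponse` is a hypothetical name, not part of the SDK): a string returned from the callback is simply UTF-8 encoded and enqueued, exactly like a plain assistant message.

```typescript
const textEncoder = new TextEncoder()

// Sketch of the string branch added to createFunctionCallTransformer:
// returns true when the response was handled (no recursive completion needed).
function handleStringResponse(
  functionResponse: unknown,
  controller: { enqueue: (chunk: Uint8Array) => void }
): boolean {
  if (typeof functionResponse === 'string') {
    // The user returned a string, so we just return it as a message.
    controller.enqueue(textEncoder.encode(functionResponse))
    return true
  }
  // Fall through to the Response / recursive-call handling.
  return false
}
```

Keeping the check to `typeof functionResponse === 'string'` means a returned `Response` (the recursive-completion path) and `undefined`/`void` keep their previous behavior.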
