
Commit

Resolve function stream parsing error when provided a specific function (#334)

Fixes #329 

The reasoning for why this works is similar to why `trimStartOfStreamHelper` works: `parseOpenAIStream` maintains temporary local state for each complete streamed message, from the start chunk to the end chunk.
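For illustration, here is a minimal sketch of that closure pattern, condensed from the diff below rather than copied verbatim: the returned parser function closes over `isFunctionStreamingIn`, so the flag survives from the chunk that names the function to the chunk that finishes it.

```ts
// Minimal sketch, not the SDK's actual source: the returned parser closes
// over `isFunctionStreamingIn`, so the flag persists across every chunk of
// one streamed message.
function makeParser(): (data: string) => string | void {
  let isFunctionStreamingIn = false
  return data => {
    const json = JSON.parse(data)
    if (json.choices[0]?.delta?.function_call?.name) {
      isFunctionStreamingIn = true // a function call started streaming in
    } else if (
      (json.choices[0]?.finish_reason === 'function_call' ||
        json.choices[0]?.finish_reason === 'stop') &&
      isFunctionStreamingIn
    ) {
      isFunctionStreamingIn = false // the same message ended; reset for the next
      return '"}}'
    }
  }
}
```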


Does this need a changeset?
shametim committed Jul 15, 2023
1 parent 31ff8c1 commit 561a49a
Showing 4 changed files with 46 additions and 7 deletions.
5 changes: 5 additions & 0 deletions .changeset/empty-bees-switch.md
@@ -0,0 +1,5 @@
+---
+'ai': patch
+---
+
+Providing a function to the `function_call` request parameter of the OpenAI Chat Completions API no longer breaks OpenAI function stream parsing.
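For context, a request of the following shape is what previously broke parsing; the function name and schema here are hypothetical placeholders, not part of this commit:

```ts
// Hypothetical request; `my_function` and its parameters are placeholders.
// Forcing a specific function via `function_call` makes the final streamed
// chunk arrive with finish_reason "stop" instead of "function_call".
const response = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`
  },
  body: JSON.stringify({
    model: 'gpt-3.5-turbo-0613',
    stream: true,
    messages: [{ role: 'user', content: 'What is the weather in Boston?' }],
    functions: [
      { name: 'my_function', parameters: { type: 'object', properties: {} } }
    ],
    function_call: { name: 'my_function' }
  })
})
```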
25 changes: 21 additions & 4 deletions packages/core/streams/openai-stream.ts
@@ -60,9 +60,8 @@ export type OpenAIStreamCallbacks = AIStreamCallbacks & {
*/
function parseOpenAIStream(): (data: string) => string | void {
const trimStartOfStream = trimStartOfStreamHelper()
+  let isFunctionStreamingIn: boolean
   return data => {
-    const json = JSON.parse(data)

/*
If the response is a function call, the first streaming chunk from OpenAI returns the name of the function like so
@@ -115,7 +114,7 @@ function parseOpenAIStream(): (data: string) => string | void {
...
-  Finally, the last chunk has a `finish_reason` of `function_call`:
+  Finally, the last chunk has a `finish_reason` of either `function_call`:
{
...
@@ -126,6 +125,17 @@ function parseOpenAIStream(): (data: string) => string | void {
}]
}
+  or `stop`, when the `function_call` request parameter
+  is specified with a particular function via `{\"name\": \"my_function\"}`
+  {
+    ...
+    "choices": [{
+      "index": 0,
+      "delta": {},
+      "finish_reason": "stop"
+    }]
+  }
With the implementation below, the client will end up getting a
response like the one below streamed to them whenever a function call
@@ -138,7 +148,9 @@ function parseOpenAIStream(): (data: string) => string | void {
}
}
*/
+    const json = JSON.parse(data)
     if (json.choices[0]?.delta?.function_call?.name) {
+      isFunctionStreamingIn = true
return `{"function_call": {"name": "${json.choices[0]?.delta?.function_call.name}", "arguments": "`
} else if (json.choices[0]?.delta?.function_call?.arguments) {
const argumentChunk: string =
@@ -154,7 +166,12 @@ function parseOpenAIStream(): (data: string) => string | void {
.replace(/\f/g, '\\f') // Escape form feeds

return `${escapedPartialJson}`
-    } else if (json.choices[0]?.finish_reason === 'function_call') {
+    } else if (
+      (json.choices[0]?.finish_reason === 'function_call' ||
+        json.choices[0]?.finish_reason === 'stop') &&
+      isFunctionStreamingIn
+    ) {
+      isFunctionStreamingIn = false // Reset the flag
return '"}}'
}

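Putting the hunks above together, here is a hedged walk-through of what the parser now emits for a forced function call; the chunk values are hypothetical, not real API output:

```ts
// Hypothetical chunk sequence for a forced function call, annotated with the
// piece the parser returns for each chunk.
const streamedChunks = [
  // 1) name chunk -> parser emits: {"function_call": {"name": "my_function", "arguments": "
  { choices: [{ index: 0, delta: { function_call: { name: 'my_function', arguments: '' } }, finish_reason: null }] },
  // 2) argument chunk -> parser emits the escaped fragment: {\"location\": \"Boston\"}
  { choices: [{ index: 0, delta: { function_call: { arguments: '{"location": "Boston"}' } }, finish_reason: null }] },
  // 3) end chunk -> finish_reason is "stop" because a specific function was
  //    requested; with this fix the parser still emits the closing "}}
  { choices: [{ index: 0, delta: {}, finish_reason: 'stop' }] }
]
// Concatenated, the client receives one well-formed JSON string:
// {"function_call": {"name": "my_function", "arguments": "{\"location\": \"Boston\"}"}}
```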
11 changes: 11 additions & 0 deletions packages/core/tests/snapshots/openai-chat.ts
@@ -334,3 +334,14 @@ export const chatCompletionChunksWithFunctionCall = [
choices: [{ index: 0, delta: {}, finish_reason: 'function_call' }]
}
]

+export const chatCompletionChunksWithSpecifiedFunctionCall = [
+  ...chatCompletionChunksWithFunctionCall,
+  {
+    id: 'chatcmpl-7WBy19k4tnzMa0svAIAqkqeIaKZh8',
+    object: 'chat.completion.chunk',
+    created: 1687906853,
+    model: 'gpt-3.5-turbo-0613',
+    choices: [{ index: 0, delta: {}, finish_reason: 'stop' }] // finish_reason is 'stop' whenever you provide a function to the function_call parameter
+  }
+]
12 changes: 9 additions & 3 deletions packages/core/tests/utils/mock-service.ts
@@ -2,7 +2,8 @@ import { ServerResponse, createServer } from 'node:http'

import {
chatCompletionChunks,
-  chatCompletionChunksWithFunctionCall
+  chatCompletionChunksWithFunctionCall,
+  chatCompletionChunksWithSpecifiedFunctionCall
} from '../snapshots/openai-chat'

async function flushDataToResponse(
@@ -43,7 +44,8 @@ export const setup = () => {
const type = req.headers['x-mock-type'] || 'chat' || 'func_call'

switch (type) {
-      case 'func_call': // new case
+      case 'func_call':
+      case 'func_call_with_specified_function':
switch (service) {
case 'openai':
res.writeHead(200, {
@@ -53,9 +55,13 @@
})
res.flushHeaders()
recentFlushed = []
+          const mock =
+            type === 'func_call_with_specified_function'
+              ? chatCompletionChunksWithSpecifiedFunctionCall
+              : chatCompletionChunksWithFunctionCall
           flushDataToResponse(
             res,
-            chatCompletionChunksWithFunctionCall.map(
+            mock.map(
value =>
new Proxy(
{ value },
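A test could exercise the new mock case along these lines; this is a sketch under assumptions (the port and the service header name are guesses), not the repository's actual test code:

```ts
// Hypothetical test snippet; the mock server port and the 'x-mock-service'
// header name are assumptions, not taken from this diff.
const res = await fetch('http://localhost:3030', {
  headers: {
    'x-mock-service': 'openai',
    'x-mock-type': 'func_call_with_specified_function'
  }
})
// The mock streams chatCompletionChunksWithSpecifiedFunctionCall as
// server-sent events, ending with finish_reason "stop" rather than
// "function_call" -- exactly the case the parser fix covers.
```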
