Implement streaming completion #16
Conversation
Here's a working example: https://github.com/dglazkov/openai-fetch-fun

Thanks @dglazkov — this is awesome! I will work on reviewing it ASAP.
transitive-bullshit left a comment
this looks really awesome @dglazkov 🔥
My main question: how does this work from Vercel edge functions (and edge functions in general), where you want to stream the response from OpenAI but don't want to process it within the edge function handler itself?
I don't think the current solution handles that, and it's an important use case for a lot of the apps where openai-fetch comes into play. Other apps may use openai or the REST API directly, but openai-fetch's sweet spot is serverless and edge use cases (at least for now). So I just want to make sure the streaming implementation works well there.
Really great work on this, though. We may want to just merge as is @rileytomasek, since imho it's already a huge improvement.
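For context, the edge pattern being asked about is usually handled by piping the upstream body straight through as the handler's `Response`, without parsing it in the function itself. A minimal sketch, assuming a Vercel-style edge runtime; the route shape, model name, and env-var name are illustrative, not from this PR:

```typescript
// Hypothetical edge handler that proxies the OpenAI SSE stream untouched;
// the client does the parsing, the edge function only forwards bytes.
export const config = { runtime: 'edge' };

export default async function handler(req: Request): Promise<Response> {
  const { prompt } = await req.json();

  const upstream = await fetch('https://api.openai.com/v1/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({ model: 'text-davinci-003', prompt, stream: true }),
  });

  // Pass the ReadableStream through without consuming it in the handler.
  return new Response(upstream.body, {
    headers: { 'Content-Type': 'text/event-stream' },
  });
}
```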
src/streaming.ts
Outdated
```ts
  this.onend?.();
  return;
}
const parsed = JSON.parse(content);
```
def add a try / catch around JSON.parse
I've seen examples in the wild with https://github.com/transitive-bullshit/chatgpt-api where this unexpectedly fails
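The usual failure mode is a non-JSON `data:` payload (for example the `[DONE]` sentinel, or a truncated chunk) reaching `JSON.parse`. A minimal sketch of the suggested guard; the helper name is illustrative, not from the PR:

```typescript
// Illustrative guard: parse one SSE `data:` payload, tolerating the
// `[DONE]` sentinel and malformed/partial JSON instead of throwing.
function tryParseChunk(content: string): unknown {
  if (content === '[DONE]') return null;
  try {
    return JSON.parse(content);
  } catch {
    // Malformed or truncated chunk; skip it rather than crash the stream.
    return null;
  }
}
```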
Done!
src/streaming.ts
Outdated
```ts
}
const parsed = JSON.parse(content);
this.onchunk?.({
  completion: parsed.choices[0].text || '',
```
I think you should just pass the parsed content directly instead of adding additional metadata
`this.onchunk?.(parsed)`
I wanted to make sure this stays in spirit with the rest of the API, so I kept the shape of the object the same as the response in non-streaming completion.
Yeah, I think it's best to keep it consistent with the rest of the API.
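The point of the consistency argument is that each streamed chunk mirrors the shape of a non-streaming completion response, so callers can use the same field access in both modes. A rough illustration; the interface is a subset of the OpenAI completion response shape, and the helper name is hypothetical:

```typescript
// Subset of the shape shared by streamed chunks and non-streaming responses.
interface CompletionChunk {
  id: string;
  object: string;
  choices: { text: string; index: number; finish_reason: string | null }[];
}

// Because chunks keep the response shape, the same accessor works for both
// a streamed chunk and a full non-streaming completion.
function extractText(chunk: CompletionChunk): string {
  return chunk.choices[0]?.text ?? '';
}
```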
@transitive-bullshit this code will work in Node 18+, the Web main thread, and Web Workers. The latter includes most edge workers that I know of, and Deno.
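The portability claim rests on using only WHATWG-standard APIs (`fetch`, `ReadableStream`, `TextDecoder`), which all of those runtimes provide. A generic sketch of reading a streamed body with those APIs, not the PR's exact code:

```typescript
// Standards-only stream reader: runs anywhere ReadableStream and TextDecoder
// exist (Node 18+, browsers, web/edge workers, Deno).
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // `stream: true` keeps multi-byte characters split across chunks intact.
    text += decoder.decode(value, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered trailing bytes
}
```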
The Vercel edge functions and app directory have a lot of bugs with standards-compliant implementations because they modify fetch for caching and other things. Did you (or can you) create a simple test to make sure the streaming works with them? As @transitive-bullshit said, it's not the end of the world if not, because this is a huge improvement as is, and Vercel support can be added later; the API is unlikely to change since that should just be implementation changes.
```diff
 const DEFAULT_BASE_URL = 'https://api.openai.com/v1';

-export interface FetchOptions extends Options {
+export interface FetchOptions extends Omit<Options, 'credentials'> {
```
Thanks for catching this
I'll add the
Excellent work @dglazkov 🔥
Added streaming chat completion support.
when merge?
🥳 Thank you very much @dglazkov! I'm going to release a new version with this now.
It has been released as v1.2.0: https://github.com/rileytomasek/openai-fetch/releases/tag/v1.2.0 cc @ralyodio
Here's a streaming completion implementation. I tried to stick to the spirit of the codebase.
PTAL.
Fixes #7.
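Based on the `this.onchunk?.()` / `this.onend?.()` calls visible in the diff, consumption looks callback-shaped. A hedged sketch of how a caller might accumulate a streamed completion; the handler names follow the diff, but the surrounding helper is invented for illustration and may not match the released API:

```typescript
// Illustrative consumer for a callback-style stream: collects each chunk's
// `completion` text and exposes the concatenated result when the stream ends.
function makeAccumulator() {
  let full = '';
  return {
    onchunk: (chunk: { completion: string }) => {
      full += chunk.completion;
    },
    onend: () => {
      // Stream finished; `full` now holds the entire completion.
    },
    result: () => full,
  };
}
```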