This repository was archived by the owner on Nov 27, 2025. It is now read-only.

Conversation

@dglazkov
Contributor

Here's a streaming completion implementation. I tried to stick to the spirit of the codebase.

PTAL.

Fixes #7.

@dglazkov
Contributor Author

Here's a working example: https://github.com/dglazkov/openai-fetch-fun

@rileytomasek
Collaborator

Thanks @dglazkov — this is awesome! I will work on reviewing it ASAP.

Contributor

@transitive-bullshit left a comment

this looks really awesome @dglazkov 🔥

My main question would be: how does this work from Vercel edge functions (and edge functions in general), where you want to stream the response from openai but don't want to process it within the edge function handler itself?

I don't think the current solution handles that, and it's an important use case for a lot of apps where openai-fetch comes into play. Other apps may use openai or the REST API directly, but openai-fetch's sweet spot is serverless and edge use cases (at least for now). So I just wanna make sure that the streaming implementation really works well there.

really really great work on this, though. and we may want to just merge as is @rileytomasek since imho it's already a huge improvement.
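For context on the edge-function case: the usual pattern is to hand the upstream streaming body straight back to the client without consuming it inside the handler. A minimal sketch of that pattern (the endpoint, header values, and function names here are illustrative, not part of this PR):

```typescript
// Wrap an upstream streaming response so it can be returned to the client
// without being read inside the handler. Uses only the standard Fetch API.
function passthrough(upstream: Response): Response {
  return new Response(upstream.body, {
    status: upstream.status,
    headers: { 'Content-Type': 'text/event-stream' },
  });
}

// Hypothetical edge-style handler using it (defined here for illustration only):
async function handler(req: Request): Promise<Response> {
  const upstream = await fetch('https://api.openai.com/v1/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: 'Bearer <key>', // placeholder
    },
    body: await req.text(),
  });
  return passthrough(upstream);
}
```

Because `Response` accepts a `ReadableStream` body, the bytes flow through the handler without ever being buffered or parsed there.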

src/streaming.ts Outdated
this.onend?.();
return;
}
const parsed = JSON.parse(content);
Contributor

def add a try / catch around JSON.parse

I've seen examples in the wild with https://github.com/transitive-bullshit/chatgpt-api where this unexpectedly fails
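A sketch of the defensive parse being requested (assuming SSE `data:` payloads where `[DONE]` terminates the stream; the function name is illustrative, not the PR's code):

```typescript
// Parse one SSE data payload; return undefined for the terminator or for
// malformed JSON instead of throwing mid-stream.
function safeParseChunk(content: string): unknown {
  if (content === '[DONE]') return undefined;
  try {
    return JSON.parse(content);
  } catch {
    // Seen in the wild: truncated or otherwise malformed payloads.
    // Skipping the chunk keeps the stream alive.
    return undefined;
  }
}
```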

Contributor Author

Done!

src/streaming.ts Outdated
}
const parsed = JSON.parse(content);
this.onchunk?.({
completion: parsed.choices[0].text || '',
Contributor

I think you should just pass the parsed content directly instead of adding additional metadata

this.onchunk?.(parsed)

Contributor Author

I wanted to make sure this stays in spirit with the rest of the API, so I kept the shape of the object the same as the response in non-streaming completion.

Collaborator

Yeah, I think it's best to keep it consistent with the rest of the API.
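To illustrate the shape being agreed on (the field name follows the diff above; the OpenAI payload type is simplified to just the field used):

```typescript
// Simplified OpenAI streaming payload: only the field consumed here.
type ParsedPayload = { choices: { text?: string }[] };

// Map a parsed streaming payload to the same shape the non-streaming
// completion returns, rather than passing the raw payload through.
function toChunk(parsed: ParsedPayload): { completion: string } {
  return { completion: parsed.choices[0]?.text || '' };
}
```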

@dglazkov
Contributor Author

@transitive-bullshit this code will work in Node 18+, the Web main thread, and Web Workers. The latter includes most edge workers that I know of, as well as Deno.
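That portability claim rests on sticking to the standard Fetch and WHATWG Streams APIs. A sketch of the kind of read loop involved (names are illustrative; this is not the PR's exact code):

```typescript
// Read a streaming fetch body line by line using only WHATWG streams and
// TextDecoder, which is why it runs on Node 18+, browsers, workers, and Deno.
async function streamLines(
  res: Response,
  onLine: (line: string) => void,
): Promise<void> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop()!; // keep the trailing partial line
    for (const line of lines) if (line) onLine(line);
  }
  if (buffer) onLine(buffer); // flush any final unterminated line
}
```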

@rileytomasek
Collaborator

rileytomasek commented Apr 18, 2023

Do you mind adding streamChatCompletion() for completeness? It may be confusing for one to support streaming but not the other.

Vercel edge functions and the app directory have a lot of bugs with standards-compliant implementations because they modify fetch for caching and other things. Did you (or can you) create a simple test to make sure the streaming works with them? As @transitive-bullshit said, it's not the end of the world if not: this is a huge improvement as is, and Vercel support can be added later, since it should only require implementation changes and the API is unlikely to change.

const DEFAULT_BASE_URL = 'https://api.openai.com/v1';

export interface FetchOptions extends Options {
export interface FetchOptions extends Omit<Options, 'credentials'> {
Collaborator

Thanks for catching this
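For readers who haven't seen `Omit` used this way, a minimal illustration (this `Options` is a stand-in, not ky's real type; the point is only that the conflicting key is removed at the type level):

```typescript
// Stand-in for an upstream Options type with a field we must not expose.
interface Options {
  credentials?: 'include' | 'omit';
  headers?: Record<string, string>;
}

// Omit strips 'credentials', so callers can no longer pass it through.
type FetchOptions = Omit<Options, 'credentials'> & { apiKey: string };

const ok: FetchOptions = { apiKey: 'sk-xxx', headers: {} };
// const bad: FetchOptions = { apiKey: 'sk-xxx', credentials: 'include' };
// ^ rejected by the compiler: 'credentials' does not exist on FetchOptions
```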

@dglazkov
Contributor Author

I'll add the streamChatCompletion. I do want to try it with the edge functions, but I haven't gotten to that yet.

@transitive-bullshit
Contributor

transitive-bullshit commented Apr 19, 2023

Excellent work @dglazkov 🔥

@dglazkov
Contributor Author

Added streaming chat completion support.

@ralyodio

when merge?

@rileytomasek
Collaborator

🥳 Thank you very much @dglazkov! I'm going to release a new version with this now.

@rileytomasek rileytomasek merged commit ffa66ac into dexaai:master Apr 26, 2023
@rileytomasek
Collaborator

It has been released as v1.2.0: https://github.com/rileytomasek/openai-fetch/releases/tag/v1.2.0

cc @ralyodio
