
[0.4] Adds stream support #84

Merged
merged 13 commits into main from add-stream-support
Mar 24, 2023

Conversation

gehrisandro
Collaborator

This PR adds stream support.

This resolves #19, #21 and #80.

Since the library is now HTTP client agnostic, this was a bit more complicated than expected. See the explanations below.

Here are the key changes:

Examples

Completion Create

(copy from README)

$stream = $client->completions()->createStreamed([
    'model' => 'text-davinci-003',
    'prompt' => 'Hi',
    'max_tokens' => 10,
]);

foreach ($stream->read() as $response) {
    $response->choices[0]->text;
}
// 1. iteration => 'I'
// 2. iteration => ' am'
// 3. iteration => ' very'
// 4. iteration => ' excited'
// ...

Chat Create

(copy from README)

$stream = $client->chat()->createStreamed([
    'model' => 'gpt-4',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello!'],
    ],
]);

foreach ($stream->read() as $response) {
    $response->choices[0]->toArray();
}
// 1. iteration => ['index' => 0, 'delta' => ['role' => 'assistant'], 'finish_reason' => null]
// 2. iteration => ['index' => 0, 'delta' => ['content' => 'Hello'], 'finish_reason' => null]
// 3. iteration => ['index' => 0, 'delta' => ['content' => '!'], 'finish_reason' => null]
// ...

FineTunes Events List

(copy from README)

$stream = $client->fineTunes()->listEventsStreamed('ft-y3OpNlc8B5qBVGCCVsLZsDST');

foreach ($stream->read() as $response) {
    $response->message;
}
// 1. iteration => 'Created fine-tune: ft-y3OpNlc8B5qBVGCCVsLZsDST'
// 2. iteration => 'Fine-tune costs $0.00'
// ...
// xx. iteration => 'Uploaded result file: file-ajLKUCMsFPrT633zqwr0eI4l'
// xx. iteration => 'Fine-tune succeeded'

Explanations

  • I finally decided to go with dedicated methods for the streamed requests: create() -> createStreamed(), etc.
  • All streams return a StreamResponse object, which has a read() method that returns a Generator instance yielding the incoming responses.
  • As far as possible, I reused the existing response objects. For Chat Create this was not possible, as the streamed response structure is very different from the non-streaming response.
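Under the hood, OpenAI streams its responses as server-sent events. As a rough sketch of the kind of work read() has to do (readSseEvents() is a hypothetical helper for illustration, not code from this PR), assuming the HTTP response body is available as a PHP stream resource:

```php
// Hypothetical sketch: yield decoded payloads from server-sent event lines.
// Not this PR's actual implementation.
function readSseEvents($stream): \Generator
{
    while (($line = fgets($stream)) !== false) {
        $line = trim($line);

        // SSE payload lines are prefixed with "data: "; skip everything else.
        if (!str_starts_with($line, 'data: ')) {
            continue;
        }

        $data = substr($line, strlen('data: '));

        // OpenAI terminates a stream with a literal "[DONE]" sentinel.
        if ($data === '[DONE]') {
            break;
        }

        yield json_decode($data, true);
    }
}
```

Because this is a generator, each chunk is handed to the caller as soon as its line arrives, which is what makes the per-iteration outputs in the examples above possible.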

Async requests and PSR-18

PSR-18 sadly does not cover asynchronous requests, so it is not possible to write generic code that works with every HTTP client.
To work around this, I have extended OpenAI::factory() with a new method, withAsyncRequest(), which accepts a closure to be used for async requests.
If GuzzleHttp is used, an implementation is available out of the box, so it is not necessary to provide one manually.

To be honest, this is not very developer-friendly, but so far I have had no better idea. The only thing we could do is provide implementations for more HTTP clients.

Example factory usage (with GuzzleHttp, which is redundant here because it is already baked in):

$client = OpenAI::factory()
    ->withApiKey($yourApiKey)
    ->withHttpClient($client = new \GuzzleHttp\Client([]))
    ->withAsyncRequest(fn (RequestInterface $request): ResponseInterface => $client->send($request, ['stream' => true]))
    ->make();

Thanks

Many thanks to @slavarazum for giving me a starting point here #21

@gehrisandro
Collaborator Author

cc @GromNaN

Can you please have a look at whether this would work with the Symfony client?

@slavarazum

slavarazum commented Mar 23, 2023

Great job, looks like we are close!

Do we need the extra call to the read() method?
Maybe it makes sense to return a Generator to stay closer to native types, or to make the StreamResponse object iterable.

I'm a little confused about the name of the factory method withAsyncRequest for streaming requests.
For example, GuzzleHttp\Client::send() is synchronous but can be streamed. Async != stream. Perhaps it would be better to call it withStreamRequest.
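The iterable variant suggested above could be sketched as follows (IterableStreamResponse is a hypothetical class for illustration, not code from this PR). Implementing IteratorAggregate lets consumers foreach over the response directly, with no extra read() call:

```php
// Hypothetical sketch of an iterable stream response, so that
// `foreach ($stream as $response)` works without calling read().
final class IterableStreamResponse implements \IteratorAggregate
{
    /** @param \Closure(): \Generator $responses factory for the underlying generator */
    public function __construct(private \Closure $responses)
    {
    }

    public function getIterator(): \Generator
    {
        yield from ($this->responses)();
    }
}
```

With this shape, foreach ($stream as $response) { ... } works directly, which matches the stay-closer-to-native-types argument.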

@jhull

jhull commented Mar 23, 2023

Just a heads up with the streaming: the key off of choices is delta, not text (https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb)

@nunomaduro changed the title from add stream support for "Completion Create", "Chat Create" and "FineTunes Event List" to [0.4] Adds stream support on Mar 24, 2023
@nunomaduro
Contributor

The testing pull request needs to be updated with these changes.

@gehrisandro
Collaborator Author

> Great job, looks like we are close!
>
> Do we need an extra call to the read method? Maybe it makes sense to return a Generator to stay closer to native types or make the StreamResponse object iterable.
>
> I'm a little confused about the name of the factory method withAsyncRequest for streaming requests. For example GuzzleHttp\Client::send method is synchronous but can be streamed. Async != stream. Perhaps it would be better to call it withStreamRequest.

Thanks @slavarazum. @nunomaduro already updated the parts you mentioned.

@gehrisandro
Collaborator Author

> Just a heads up with the streaming: the key off of choices is delta, not text (https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb)

@jhull Thanks for the hint, but the key in completion responses is still text. It only changes for chat completions, from message to delta.

@vigstudio

When will you release this PR?

@gehrisandro
Collaborator Author

> When will you release this PR?

Later today.

@nunomaduro merged commit 85a4753 into main on Mar 24, 2023
@nunomaduro deleted the add-stream-support branch on March 24, 2023 at 11:39
Successfully merging this pull request may close these issues.

completions create: support stream?