Any way to support streaming of response via SSE? #336
Unanswered
MatthewLoffredo asked this question in Q&A
Replies: 1 comment · 1 reply
-
Hey @MatthewLoffredo, so I've had a quick look at the openai-php/client library, which I know uses Guzzle and has this streaming capability, and it looks like they're just using Guzzle's `stream` option, which is really cool! You can do this really easily in Saloon. In your connector/request, override `defaultConfig()`:

```php
protected function defaultConfig(): array
{
    return [
        'stream' => true,
    ];
}
```

This instructs Guzzle to return the response as soon as the headers arrive, while leaving the body available for you to stream. After that, in Saloon you can access the underlying stream using:

```php
$stream = $response->stream();
```

You can then read X bytes at a time, outputting the text similar to how ChatGPT works. Let me know if this works!
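Putting the pieces together, here is a minimal sketch of a streaming request and a chunk-by-chunk read loop. It assumes Saloon v2's request API; the `ChatRequest` class, its endpoint, and the `$connector` variable are hypothetical placeholders, not part of Saloon itself:

```php
<?php

use Saloon\Enums\Method;
use Saloon\Http\Request;

// Hypothetical request with Guzzle streaming enabled via defaultConfig().
class ChatRequest extends Request
{
    protected Method $method = Method::POST;

    public function resolveEndpoint(): string
    {
        // Placeholder endpoint for illustration only.
        return '/v1/chat/completions';
    }

    protected function defaultConfig(): array
    {
        // Guzzle's 'stream' option: don't buffer the whole body in memory.
        return ['stream' => true];
    }
}

// $connector is assumed to be a configured Saloon connector.
$response = $connector->send(new ChatRequest);

// stream() exposes the underlying PSR-7 StreamInterface.
$stream = $response->stream();

while (! $stream->eof()) {
    // Read up to 1 KB at a time and echo it as it arrives.
    echo $stream->read(1024);
    flush();
}
```

The read loop is where you would print tokens incrementally, ChatGPT-style, instead of waiting for the full body.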
-
Hey all, I'm trying to do something and a bit confused. How would I go about streaming the response of a request (similar to how many AI providers are doing it) using SSE, or executing a function callback on each stream event? I know it has something to do with `CURLOPT_WRITEFUNCTION`, but I'm not sure how to synthesize this with Saloon. Any advice would be appreciated!
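For the per-event-callback part of this question, one approach is to parse the SSE framing (`data:` lines separated by blank lines) out of the response stream yourself. A minimal sketch, assuming a PSR-7 stream such as the one Saloon's `$response->stream()` returns, and an OpenAI-style `data: {...}` event payload (both assumptions, not Saloon APIs beyond `stream()`):

```php
<?php

use Psr\Http\Message\StreamInterface;

// Invoke $onEvent once per SSE "data:" payload read from the stream.
function foreachSseEvent(StreamInterface $stream, callable $onEvent): void
{
    $buffer = '';

    while (! $stream->eof()) {
        $buffer .= $stream->read(1024);

        // SSE events are delimited by a blank line.
        while (($pos = strpos($buffer, "\n\n")) !== false) {
            $rawEvent = substr($buffer, 0, $pos);
            $buffer   = substr($buffer, $pos + 2);

            foreach (explode("\n", $rawEvent) as $line) {
                if (str_starts_with($line, 'data: ')) {
                    // Pass the payload after "data: " to the callback.
                    $onEvent(substr($line, 6));
                }
            }
        }
    }
}

// Usage sketch: print each event's payload as it arrives.
// foreachSseEvent($response->stream(), fn (string $data) => print($data));
```

This sidesteps `CURLOPT_WRITEFUNCTION` entirely by consuming the stream after the fact, which fits Saloon's response model more naturally than wiring a raw cURL callback.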