
Is response streaming supported? #10

Open
sundar7D0 opened this issue May 13, 2023 · 1 comment

Comments

@sundar7D0

Since the overall delay in returning a response from ChatGPT along with its context can be large, does the Slack bot support response streaming, where tokens are relayed to the user as soon as they arrive from the ChatGPT API via the `stream=True` argument?

@gabrielkoo
Owner

> Since the overall delay in returning a response from ChatGPT along with its context can be large, does the Slack bot support response streaming, where tokens are relayed to the user as soon as they arrive from the ChatGPT API via the `stream=True` argument?

Yes. You can use `chat.update`:

  1. Send an initial response and record the `ts` value of the message.
  2. Set `stream=True` in the `openai.Completion.create` call.
  3. Loop through the response stream, calling `chat.update` on each iteration with the latest partial text from OpenAI.

That’s it! Please feel free to open a pull request if you implement this; it will help others too!
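The three steps above could be sketched roughly as follows. This is not the repo's actual code; the helper names are hypothetical, and the Slack client and OpenAI stream are replaced with stand-ins so the flow runs without network access (in real use you would substitute `slack_sdk.WebClient` and an `openai` call with `stream=True`):

```python
class FakeSlackClient:
    """Stand-in for slack_sdk.WebClient; records chat_update calls."""

    def __init__(self):
        self.updates = []

    def chat_postMessage(self, channel, text):
        # Slack's real API returns the message timestamp as "ts".
        return {"ts": "1234.5678"}

    def chat_update(self, channel, ts, text):
        self.updates.append(text)


def fake_openai_stream():
    """Mimics the chunk shape yielded by the openai SDK when stream=True."""
    for token in ["Hel", "lo ", "wor", "ld!"]:
        yield {"choices": [{"delta": {"content": token}}]}


def stream_to_slack(client, channel, stream, update_every=2):
    # Step 1: send an initial placeholder and keep its `ts`.
    ts = client.chat_postMessage(channel=channel, text="…")["ts"]
    text = ""
    # Step 3: accumulate tokens and update the same message periodically
    # (batching avoids hitting Slack's rate limits on chat.update).
    for i, chunk in enumerate(stream, start=1):
        text += chunk["choices"][0]["delta"].get("content", "")
        if i % update_every == 0:
            client.chat_update(channel=channel, ts=ts, text=text)
    # Final flush so the message ends with the complete response.
    client.chat_update(channel=channel, ts=ts, text=text)
    return text


client = FakeSlackClient()
result = stream_to_slack(client, "#general", fake_openai_stream())
```

Updating every few chunks rather than on every token is a deliberate choice here: Slack rate-limits `chat.update`, so some batching is usually needed in practice.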
