Use streaming to improve user experience #5
Labels
enhancement
New feature or request
Comments
stoerr added commits that referenced this issue on Jun 21, Jun 22, Jun 23, Jun 26, and Jun 27, 2023.
Unfortunately, ChatGPT is quite slow: the user may have to wait seconds, or even tens of seconds, for the response to appear. It is therefore a must that we use the streaming API with Server-Sent Events (SSE)
https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#event_stream_format
so that the user can already see parts of the response while it is being generated.
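As a minimal sketch of the event-stream wire format described in the MDN link above (the function name and the idea of forwarding each ChatGPT token as its own event are assumptions for illustration, not part of this project's code), a server relaying a streamed response might format each chunk like this:

```python
def sse_event(data: str, event: str = "") -> str:
    """Format one Server-Sent Events message per the event-stream format.

    Each line of the payload becomes its own ``data:`` line, and a blank
    line terminates the event, as required by the SSE specification.
    """
    lines = []
    if event:
        # Optional named event type, e.g. "token" for each streamed chunk.
        lines.append(f"event: {event}")
    for part in data.split("\n"):
        lines.append(f"data: {part}")
    # An empty line marks the end of the event.
    return "\n".join(lines) + "\n\n"
```

The browser side can then consume these events with the standard `EventSource` API (or a fetch-based reader) and append each chunk to the visible response as it arrives, instead of waiting for the full completion.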