
Use streaming to improve user experience #5

Closed
stoerr opened this issue Jun 21, 2023 · 0 comments · Fixed by #7
Assignees: stoerr
Labels: enhancement (New feature or request)

Comments


stoerr commented Jun 21, 2023

Unfortunately, ChatGPT is quite slow, so the user has to wait seconds or even tens of seconds for the response to appear. Therefore it is a must that we use the streaming API with Server-Sent Events (SSE), so that the user can already see parts of the response while it is being generated. See:
https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#event_stream_format
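As a rough illustration of what consuming such a stream looks like (a minimal sketch, not this project's implementation), the following Python snippet reads the OpenAI chat completion endpoint as an SSE stream with the plain requests library and prints partial responses as they arrive. The model name, prompt, and OPENAI_API_KEY handling are placeholder assumptions for the example.

```python
# Minimal sketch (not this project's code): consume the OpenAI chat
# completion endpoint as a Server-Sent Events stream and print the
# partial response as it is generated.
import json
import os

import requests

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-3.5-turbo",  # placeholder model
        "messages": [{"role": "user", "content": "Say hello, slowly."}],
        "stream": True,  # ask the API for SSE chunks instead of one JSON body
    },
    stream=True,  # let requests hand us the response incrementally
)
resp.raise_for_status()

for raw in resp.iter_lines():
    if not raw:
        continue  # SSE events are separated by blank lines
    line = raw.decode("utf-8")
    if not line.startswith("data: "):
        continue
    data = line[len("data: "):]
    if data == "[DONE]":  # sentinel marking the end of the stream
        break
    delta = json.loads(data)["choices"][0]["delta"].get("content")
    if delta:
        print(delta, end="", flush=True)  # show partial output immediately
print()
```

In the actual application the same idea applies end to end: the server forwards these chunks to the browser as SSE so the UI can render text as it arrives instead of waiting for the full completion.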

@stoerr stoerr self-assigned this Jun 21, 2023
@stoerr stoerr transferred this issue from ist-dresden/composum-launch Jun 21, 2023
@stoerr stoerr added the enhancement New feature or request label Jun 21, 2023
stoerr added a commit that referenced this issue Jun 22, 2023
stoerr added a commit that referenced this issue Jun 23, 2023
stoerr added a commit that referenced this issue Jun 23, 2023
@stoerr stoerr linked a pull request Jun 23, 2023 that will close this issue
stoerr added a commit that referenced this issue Jun 26, 2023
@stoerr stoerr closed this as completed in #7 Jun 27, 2023