Add streaming support to text converse endpoint #101
Merged
Frontend change: ohcnetwork/ayushma_fe#7
This pull request updates the backend to support real-time streaming of Ayushma (AI) responses, so that tokens are displayed to the user as soon as they are received from the AI, making the interaction more engaging and dynamic.
- `ChatOpenAI` is initialized with `streaming=True` and a `callback_manager` that receives each new token as it arrives from OpenAI.
- `StreamingHttpResponse` is initialized with `content_type="text/event-stream"`, and the LLM's output is streamed to the client in the format below.
- `start_blocking_portal` is used to run the `lang_chain_helper.get_response` coroutine, which retrieves the AI response in real time and populates the `token_queue` through the `StreamingQueueCallbackHandler`; the queue is then drained to produce a streaming JSON output.
- Each streamed payload has the shape `{"chat": chat_id, "delta": delta, "message": message, "stop": stop}`, where `delta` is the new token received and `message` is the entire message streamed so far.
- Other changes include `split_text` and refactoring the code to use `ChatMessageType` instead of hardcoded numbers.

These backend enhancements work in tandem with the frontend updates to provide a seamless, real-time streaming experience for Ayushma (AI) responses.