How do I do a Streaming Response? #6

Closed
DiogenesBR opened this issue Jun 12, 2023 · 2 comments

Comments

DiogenesBR commented Jun 12, 2023

Hi,

I'm trying to implement a streaming response using

from streamlit_chat import message

Like in this blog post
https://medium.com/@avra42/how-to-stream-output-in-chatgpt-style-while-using-openai-completion-method-b90331c15e85

But I don't know if it's possible.

In Gradio's Chatbot you need to keep updating the same message.
https://gradio.app/creating-a-chatbot/#add-streaming-to-your-chatbot

@Ajaypawar02

I also had the same query. Is there any update regarding this?

@sfc-gh-jcarroll
Contributor

Yes, it's possible. You can use the native chat UI instead of installing the streamlit_chat custom component. Here's an example:

import openai
import streamlit as st

openai.api_key = st.secrets.OPENAI_API_KEY

# Initialize the chat messages history
if "messages" not in st.session_state:
    st.session_state.messages = [{"role": "assistant", "content": "How can I help you?"}]

# Prompt for user input and save
if prompt := st.chat_input():
    st.session_state.messages.append({"role": "user", "content": prompt})

# Display the existing chat messages
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.write(message["content"])

# If last message is not from assistant, we need to generate a new response
if st.session_state.messages[-1]["role"] != "assistant":
    with st.chat_message("assistant"):
        response = ""
        resp_container = st.empty()  # Placeholder that is rewritten as chunks arrive
        # Stream the completion (pre-1.0 openai API) and update the placeholder
        for delta in openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=st.session_state.messages,
            stream=True,
        ):
            response += delta.choices[0].delta.get("content", "")
            resp_container.markdown(response)

        st.session_state.messages.append({"role": "assistant", "content": response})

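The snippet above targets the pre-1.0 openai package. If you're on the 1.x client and a Streamlit version that has st.write_stream, the streaming portion can be written roughly like this (a sketch, not a drop-in replacement; adjust to your installed versions):

from openai import OpenAI
import streamlit as st

client = OpenAI(api_key=st.secrets.OPENAI_API_KEY)

if st.session_state.messages[-1]["role"] != "assistant":
    with st.chat_message("assistant"):
        stream = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=st.session_state.messages,
            stream=True,
        )
        # st.write_stream renders chunks as they arrive and returns the
        # full concatenated text once the stream finishes.
        response = st.write_stream(
            chunk.choices[0].delta.content or "" for chunk in stream
        )
        st.session_state.messages.append({"role": "assistant", "content": response})
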
Hope it helps!
