Replies: 3 comments 4 replies
-
Answer generated by a 🤖

Hello,

Based on the context provided, it seems you want to return the streaming data from the LLM through Flask-SocketIO. In the LangChain codebase, the `stream` method of an LLM yields the output chunk by chunk as it is generated. To use this with Flask-SocketIO's `emit`, you could do something like this:

```
from flask import Flask, Response
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

@app.route('/stream')
def stream_data():
    def generate():
        llm = BaseLLM()                # Initialize your LLM here
        input = LanguageModelInput()   # Provide your input here
        config = RunnableConfig()      # Provide your config here
        for chunk in llm.stream(input, config):
            yield chunk
    return Response(generate(), mimetype='text/plain')

@socketio.on('start_stream')
def handle_start_stream():
    # A Flask Response is iterable, so this re-emits each generated chunk
    for chunk in stream_data():
        emit('stream_data', {'data': chunk})
```

In this example, the `/stream` route streams the model output as plain text, and the `start_stream` socket handler forwards each chunk to the connected client.

Please note that you'll need to replace the placeholders in the code (the LLM, the input, and the config) with your actual objects.

Also, there was a similar issue titled Support for streaming when using LLMchain? which was solved by enabling streaming on the model.

I hope this helps! If you have any other questions, feel free to ask.

Sources

This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research.
-
I followed your suggestion and implemented the following, using your socketio code as a basis and enabling streaming in LangChain. Streaming is successful in the console, but it fails to connect with socketio; see below.
However, I got the following error:
Any ideas?
-
Hope this helps. I had to implement a custom callback handler with a queue to store tokens, plus a thread in Flask. Also posted my response on Stack Overflow: https://stackoverflow.com/questions/76284412/stream-a-response-from-langchains-openai-with-pyton-flask-api/78411799#78411799
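The approach described above (a custom handler pushing tokens onto a queue from a worker thread, drained by the Flask side) can be sketched with the standard library alone. This is a hedged illustration, not the Stack Overflow answer itself: `QueueCallbackHandler` and `fake_llm_generate` are stand-ins for a LangChain callback handler and a streaming LLM call.

```python
# Hedged sketch of the queue-plus-thread pattern described above.
# A callback object pushes tokens onto a queue from a worker thread,
# while the caller drains the queue and forwards each token
# (in a real app, via socketio.emit or a streaming Response).
import queue
import threading

class QueueCallbackHandler:
    """Stand-in for a LangChain callback handler that queues tokens."""
    def __init__(self):
        self.tokens = queue.Queue()

    def on_llm_new_token(self, token):
        self.tokens.put(token)

    def on_llm_end(self):
        self.tokens.put(None)  # sentinel: generation finished

def fake_llm_generate(handler):
    # Stand-in for a streaming LLM call that fires the callback per token
    for token in ["Hello", " ", "world"]:
        handler.on_llm_new_token(token)
    handler.on_llm_end()

def stream_tokens():
    handler = QueueCallbackHandler()
    worker = threading.Thread(target=fake_llm_generate, args=(handler,))
    worker.start()
    while True:
        token = handler.tokens.get()  # blocks until a token arrives
        if token is None:
            break
        yield token  # in Flask: emit('response', token) or yield into a Response
    worker.join()

print("".join(stream_tokens()))  # -> Hello world
```

The sentinel value (`None`) is what lets the consumer know the generation thread is done without polling or timeouts.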
-
Hi everyone, hope you are all doing well.
I am using
LLMChain.run("hi i am Max")
for streaming. When streaming starts, it just prints in the Colab or .py module terminal, but I want it like this:
I am returning the streaming data in Flask-SocketIO:
```
emit('response', chunk, room=id)
```
As I can do with ChatGPT, but here I am stuck: the above code shows the streamed text twice.
If I remove the for loop, it still prints output, but I want to return the stream data the way I do with the ChatGPT API.
Any help would be appreciated.
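One possible explanation for the doubled text (a guess, since the surrounding code is not shown): if the LLM is configured with a stdout streaming callback and the loop also emits each chunk, every token is delivered through both paths. The sketch below is illustrative only; the callbacks are stand-ins, not LangChain's actual `StreamingStdOutCallbackHandler` or Flask-SocketIO's `emit`.

```python
# Hedged sketch: two registered callbacks each receive every token,
# so the same stream appears twice (once in the console, once over the socket).
console = []
socket = []

def stdout_callback(token):   # stands in for a stdout streaming callback
    console.append(token)

def emit_callback(token):     # stands in for emit('response', chunk, room=id)
    socket.append(token)

def run_llm(callbacks):
    # Stand-in for a streaming chain run: every registered callback
    # sees every generated token.
    for token in ["hi ", "i ", "am ", "Max"]:
        for cb in callbacks:
            cb(token)

run_llm([stdout_callback, emit_callback])
print(len(console) + len(socket))  # -> 8: four tokens, each delivered twice
```

Registering only the socket-emitting callback (and dropping the stdout one) would deliver each token exactly once.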