
Handling Closed Connections #76

Closed
awtkns opened this issue May 31, 2023 · 8 comments
Labels: feature (New feature or request)

Comments

@awtkns (Contributor)

awtkns commented May 31, 2023

Recently we merged some changes in reworkd/AgentGPT#631 which enabled streaming on AgentGPT using lanarky!

After merging, I noticed an increased error rate, due to what I assume is lanarky not closing the connection to OpenAI when the client (React, in our case) closes the SSE stream early. This happens, for example, when the user closes the page in the middle of a chat completion being streamed.

I was wondering if there is a way / best practice to close the connection to OpenAI in the case mentioned above? Error details below for context.

Sample Implementation

from typing import List

from fastapi.responses import StreamingResponse as FastAPIStreamingResponse
from lanarky.responses import StreamingResponse
from langchain import LLMChain

from reworkd_platform.web.api.agent.model_settings import ModelSettings, create_model

# summarize_prompt is a prompt template defined elsewhere in the project (not shown)

def summarize(
    model_settings: ModelSettings, goal: str, query: str, snippets: List[str]
) -> FastAPIStreamingResponse:

    chain = LLMChain(
        llm=create_model(model_settings, streaming=True), prompt=summarize_prompt
    )

    return StreamingResponse.from_chain(
        chain,
        {"goal": goal, "query": query, "snippets": snippets},
        media_type="text/event-stream",
    )
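One common pattern for the scenario described above (a sketch with hypothetical stand-in classes, not lanarky's actual API): wrap the token generator so a try/finally closes the upstream session even when the consumer stops iterating early — Starlette closes a streaming response's generator when the client disconnects, which triggers the finally block.

```python
import asyncio


class FakeUpstream:
    """Hypothetical stand-in for the OpenAI/aiohttp streaming session."""

    def __init__(self):
        self.closed = False

    async def stream(self):
        for token in ["Hello", " ", "world"]:
            yield token

    async def aclose(self):
        self.closed = True


async def guarded_stream(upstream):
    # try/finally runs even when the consumer (e.g. Starlette's
    # StreamingResponse) stops iterating early because the client
    # disconnected, so the upstream connection is always released.
    try:
        async for token in upstream.stream():
            yield token
    finally:
        await upstream.aclose()


async def main():
    upstream = FakeUpstream()
    agen = guarded_stream(upstream)
    first = await agen.__anext__()  # client receives one token...
    await agen.aclose()             # ...then disconnects mid-stream
    return first, upstream.closed


first, closed = asyncio.run(main())
print(first, closed)  # Hello True
```

The key point is that closing the async generator throws GeneratorExit at the suspended yield, so the cleanup runs regardless of whether the stream finished normally.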

Error

Unclosed connection
client_connection: Connection<ConnectionKey(host='api.openai.com', port=443, is_ssl=True, ssl=None, proxy=None, proxy_auth=None, proxy_headers_hash=None)>
[screenshot]

Thanks for the amazing work! You should set up GitHub Sponsors 😉

@awtkns awtkns added the feature New feature or request label May 31, 2023
@ajndkr (Owner)

ajndkr commented May 31, 2023

Hey @awtkns! Really glad to see this library being used in a bigger project.

I don't have a solution right now but I will start investigating. I believe this is a critical feature for all users.

P.S. I'll look into github sponsors as well 😄

@ajndkr (Owner)

ajndkr commented May 31, 2023

@awtkns I've published a new version with a possible fix:

pip install "lanarky[openai]"

let me know if the error rate reduces. I will keep this issue open until then.

@awtkns (Contributor, Author)

awtkns commented Jun 1, 2023

[screenshot]

Unfortunately, after upgrading to 0.7.5, the issue still persists. I did not install with lanarky[openai] as we already had the openai dependency. Could that be the issue?

@ajndkr thanks for all your help :)

@ajndkr (Owner)

ajndkr commented Jun 1, 2023

I don't think missing dependencies are the issue here. Manually setting the session object and then closing it at the end should have worked.

I'll keep investigating for a better solution. In the meantime, it would be good to know if the error rate has at least decreased with the new version, if not resolved completely.
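The "manually set the session, close it at the end" approach can be sketched like this (stand-in names, not lanarky's actual code — though openai-python 0.x did expose a similar context variable, openai.aiosession, for supplying its async HTTP session):

```python
import asyncio
from contextvars import ContextVar


class ClientSession:
    """Hypothetical stand-in for aiohttp.ClientSession."""

    def __init__(self):
        self.closed = False

    async def close(self):
        self.closed = True


# Context variable the library reads to pick up the caller's session.
aiosession: ContextVar = ContextVar("aiosession", default=None)


async def run_completion():
    session = ClientSession()
    aiosession.set(session)  # hand the session to the library
    try:
        pass  # ... stream the chat completion here ...
    finally:
        # Always release the connection, even if streaming is aborted.
        await aiosession.get().close()
    return session


session = asyncio.run(run_completion())
print(session.closed)  # True
```

The finally block guarantees the session is closed whether the request completes or raises, which is the behavior the fix relied on.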

@ajndkr ajndkr added help wanted Extra attention is needed and removed awaiting-response labels Jun 1, 2023
@awtkns (Contributor, Author)

awtkns commented Jun 1, 2023

[screenshot]

Unfortunately, the error rate does not seem to have gone down. 😢

This is how we are receiving the stream on the front end. We are using fetch here instead of EventSource, as the streaming response didn't seem to work with the JS EventSource API.

@ajndkr (Owner)

ajndkr commented Jun 1, 2023

Thanks for the update! It appears we are dealing with nested awaitable functions which are unaffected when the session is closed.
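One way to read "nested awaitables unaffected when the session is closed" (my interpretation, not lanarky's actual code): closing a resource does not cancel tasks already running beneath it — each task must be cancelled explicitly. A minimal asyncio sketch:

```python
import asyncio


async def nested_call(events):
    # Stand-in for a nested awaitable that keeps the connection busy.
    try:
        await asyncio.sleep(0.05)
        events.append("finished")
    except asyncio.CancelledError:
        events.append("cancelled")
        raise


async def main():
    events = []
    task = asyncio.create_task(nested_call(events))
    await asyncio.sleep(0)  # let the nested call start running
    # Closing the outer session would not interrupt this task;
    # it has to be cancelled explicitly:
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        pass
    return events


events = asyncio.run(main())
print(events)  # ['cancelled']
```

Without the explicit cancel, the nested task would keep running (and keep the connection open) until it finished on its own.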

I hope to find a solution soon 🤞

@ajndkr (Owner)

ajndkr commented Jun 1, 2023

@awtkns published a new version: pip install lanarky==0.7.6

I haven't been able to reproduce the "unclosed connection" error locally with the latest changes. So I'm really hoping we've arrived at a solution.

@ajndkr ajndkr added awaiting-response and removed help wanted Extra attention is needed labels Jun 1, 2023
@awtkns (Contributor, Author)

awtkns commented Jun 4, 2023

I can confirm it looks like connections are being closed properly now! Thanks for your help!
