
fix(openai_dart): Fetch requests with big payloads dropping connection #226

Merged · davidmigloz merged 1 commit into main from fetch-bug on Nov 17, 2023

Conversation

davidmigloz (Owner) commented:

The fetch_client package that openai_dart uses when targeting the web maps the fetch keepalive flag to BaseRequest.persistentConnection, which is true by default.

The Fetch spec says that the maximum request size with the keepalive flag is 64 KiB:

4.5. HTTP-network-or-cache fetch

8.10.5: If the sum of contentLength and inflightKeepaliveBytes is greater than 64 kibibytes, then return a network error.

Source: Fetch. Living Standard — Last Updated 19 June 2023

Therefore, if the request is larger than 64 KiB (the limit also covers other in-flight data, so the effective recommended threshold is 60 KiB), we must explicitly set BaseRequest.persistentConnection to false; otherwise the request will fail.
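
For illustration, here is a minimal Dart sketch of the idea. The class name and the wrapper approach are assumptions for this example, not necessarily how the merged commit implements the fix: a `BaseClient` wrapper that checks the payload size and turns `persistentConnection` off above the 60 KiB threshold.

```dart
import 'package:http/http.dart' as http;

/// Sketch only (hypothetical class, not the actual openai_dart patch):
/// disables persistentConnection for large payloads so that fetch_client
/// does not set the fetch keepalive flag on them when running on the web.
class LargePayloadSafeClient extends http.BaseClient {
  LargePayloadSafeClient(this._inner);

  final http.Client _inner;

  /// 60 KiB threshold, leaving headroom below the 64 KiB keepalive limit
  /// from the Fetch spec (other in-flight keepalive bytes also count).
  static const int _maxKeepAliveBytes = 60 * 1024;

  @override
  Future<http.StreamedResponse> send(http.BaseRequest request) {
    final contentLength = request.contentLength;
    if (contentLength != null && contentLength > _maxKeepAliveBytes) {
      // fetch_client maps persistentConnection to keepalive; requests over
      // the limit would be rejected with a network error, so disable it.
      request.persistentConnection = false;
    }
    return _inner.send(request);
  }

  @override
  void close() => _inner.close();
}
```

A wrapper like this could be passed anywhere a regular `http.Client` is expected; requests below the threshold keep the default persistent connection behavior.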

@davidmigloz davidmigloz added t:bug Something isn't working p:openai_dart openai_dart package. labels Nov 17, 2023
@davidmigloz davidmigloz added this to the v0.1.0 milestone Nov 17, 2023
@davidmigloz davidmigloz self-assigned this Nov 17, 2023
@davidmigloz davidmigloz merged commit 1e77109 into main Nov 17, 2023
1 check passed
@davidmigloz davidmigloz deleted the fetch-bug branch November 17, 2023 21:28
KennethKnudsen97 pushed a commit to KennethKnudsen97/langchain_dart that referenced this pull request Apr 22, 2024