File uploads time out after 5 minutes (Jetstream2) #232
At the time the upload fails (i.e., approximately 5 minutes after starting the upload), I see these lines in the logs:
Highlights (from those lines):
The timestamp (
I also see this in the
I do see one timeout mentioned in the Caddy documentation, whose default duration is 5 minutes. It's the
I don't see any 5-minute (by default) timeouts in the documentation of the
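If the 5-minute Caddy timeout mentioned above turns out to be the culprit, one possible mitigation would be to raise Caddy's HTTP server timeouts. The original comment's reference to the specific timeout directive is not captured in this extract, so the sketch below is illustrative only; the directive names come from Caddy's `timeouts` global option, and the values are placeholders, not a confirmed fix for this issue.

```
# Illustrative Caddyfile sketch only — not a confirmed fix for this issue.
# Caddy's HTTP server timeouts can be raised (or disabled with 0) in the
# global options block at the top of the Caddyfile.
{
	servers {
		timeouts {
			read_body 30m   # time allowed for reading slow request bodies (e.g., large uploads)
			idle      30m   # keep-alive idle timeout (documented default: 5m)
		}
	}
}
```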
@eecavanna @mflynn-lanl is this slated for this sprint? Can this be slotted for the next sprint?
I think @mflynn-lanl and I will discuss this at the squad meeting on Tuesday (I missed last week's meeting) and then update this ticket based on that discussion (with one potential update being to move it to next sprint).
History
In #218, @mflynn-lanl reported that users could not upload large files. The root cause was that Cloudflare — which, at the time, was configured in "Proxied" mode — was imposing its 500 MiB (Mebibyte) request size limit. Uploads of files smaller than 500 MiB would succeed, while uploads of files larger than 500 MiB would fail. To fix that, we reconfigured Cloudflare to be in "DNS Only" mode instead of "Proxied" mode. As a result, we were able to upload files larger than 500 MiB.
Problem
Although we can now upload files larger than 500 MiB, if an upload takes longer than 5 minutes, it fails and the client receives an HTTP 408 Request Timeout response.
Task 📋
Appendix
You can convert a number of Mebibytes into Bytes at: https://www.dr-lex.be/info-stuff/bytecalc.html
You can generate "dummy" files (of a given size in Bytes) by running this command:
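The command itself isn't captured in this extract. As an illustration only, here is one common way to do both of the above from a shell; the file name and sizes below are arbitrary examples, not values from the original issue.

```sh
# Arbitrary example values — not the command from the original issue.
# 600 MiB expressed in bytes: 600 * 1024 * 1024
echo $(( 600 * 1024 * 1024 ))    # prints 629145600

# Create a 600 MiB "dummy" file filled with zero bytes
# (large enough to exceed the old 500 MiB Cloudflare limit):
dd if=/dev/zero of=dummy_600MiB.bin bs=1M count=600
```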