curl throws away partial downloads on retry, even if told to resume #1084
As that output suggests, this is not a bug but a missing feature. I would welcome having such functionality added.
bagder added a commit that referenced this issue on Oct 24, 2016
This is now mentioned as a suggestion in the TODO. Closing this for now.
agners added a commit to agners/curl that referenced this issue on Apr 3, 2018
If continue/resume is enabled, try to also resume when retrying due to transient connection problems. Closes curl#1084
agners added a commit to agners/curl that referenced this issue on Apr 6, 2018
If http(s) is used and retry is enabled, try resuming the download in case of transient connection problems. If the server reports that ranges are not supported, fall back to the old mode, where the file is truncated and the download is retried from the start. Closes curl#1084
agners added a commit to agners/curl that referenced this issue on Apr 6, 2018
If http(s) is used and retry is enabled, try resuming the download in case of transient connection problems. If the server reports that ranges are not supported, fall back to the old mode, where the file is truncated and the download is retried from the start. Closes curl#1084
agners added a commit to agners/curl that referenced this issue on Nov 7, 2018
If http(s) is used and retry is enabled, try resuming the download in case of transient connection problems. If the server reports that ranges are not supported, fall back to the old mode, where the file is truncated and the download is retried from the start. Closes curl#1084
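To make the change these commits describe concrete, this is the kind of invocation it affects: retries combined with resumption of the partial file. The URL, timeout, and retry count below are placeholders, and, as the commit message notes, an actual resume still depends on the server supporting range requests; otherwise the download restarts from scratch.

```sh
# Retry transient failures; with the proposed change, a retried HTTP(S)
# download resumes from the partial file instead of truncating it.
curl --retry 5 -m 300 -C - -O https://example.com/large-file.bin
```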
Say I want to download a large file, 1 GiB perhaps. I specify -O.
My network connection isn't very reliable, so I specify -m and --retry so it doesn't hang.
The problem is that on every retry the output file is truncated, effectively throwing away all the work done so far. This is wasteful, and it makes --retry quite useless.
At the very least, if -C - is specified, it's pretty clear the user wants curl to always resume automatically.
The workaround is either to not use --retry and instead shell-wrap curl in a for loop (a sketch follows below), or to use wget.
Relevant source, for anyone interested.
curl 7.50.3
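A minimal sketch of the for-loop workaround mentioned in the report, assuming a POSIX shell; the URL, attempt count, and sleep interval are arbitrary placeholders:

```sh
#!/bin/sh
# Manual retry loop: each attempt resumes the partial file with -C -
# instead of relying on --retry, which truncates it (curl 7.50.3).
url="https://example.com/large-file.bin"
for attempt in 1 2 3 4 5; do
    curl -f -m 300 -C - -O "$url" && break
    sleep 10
done
```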