Document lack of method to stop uncompressed downloads #7516
On Thu, Jul 29, 2021 at 07:03:55PM -0700, 積丹尼 Dan Jacobson wrote:
I.e., the man page says:
If this option is used and the server sends an unsupported encoding, curl will report an error.
Add: But no encoding is not an error.
It says "unsupported" encoding. No encoding (i.e. identity) *is* supported and is not an error.
I.e., better have a bigger disk ready, as curl gives you no way of stopping uncompressed downloads.
If you use --compressed, then you are asking curl to give you the uncompressed results.
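For context, a minimal sketch of the libcurl counterpart of --compressed, with a placeholder URL. Passing an empty string to CURLOPT_ACCEPT_ENCODING asks for any content encoding libcurl supports, and the body is decompressed transparently before it reaches the (default, stdout) write callback, so what you store is the full, uncompressed data.

#include <curl/curl.h>

/* Minimal sketch of the libcurl equivalent of `curl --compressed`.
 * The URL is a placeholder. */
int main(void)
{
  CURLcode rc = CURLE_FAILED_INIT;
  CURL *curl;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/big.json");
    curl_easy_setopt(curl, CURLOPT_ACCEPT_ENCODING, ""); /* like --compressed */
    rc = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return rc != CURLE_OK;
}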
Referenced commit: Clarified (Reported-by: Dan Jacobson, Fixes #7516)
If you want to store the data compressed then this is not the option for you.
All I want to do is keep the transfer small. But my bug stands! There is no way to tell curl "bomb out if the server does not agree to compress the file when sending over the wire." I.e., with curl, and wget, you had better have bigger disks ready, because compression is just seen as a luxury that is OK to skip if not available today.
One probably needs a complicated two-step shell script that first starts the download with all the flags it can to encourage compression.
I just want to make sure I only have 2MB instead of 46MB go over the wire.
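For what it is worth, here is a hedged libcurl sketch of that two-step idea (the helper name and the approach are made up for illustration): probe the server with a HEAD request that advertises Accept-Encoding, and only start the real download if the answer carries a Content-Encoding header. This is only a heuristic, since nothing obliges the server to answer a later GET the same way it answered the HEAD.

#include <stdbool.h>
#include <string.h>
#include <strings.h>
#include <curl/curl.h>

/* Header callback for the probe: remember whether the response carried
   a Content-Encoding header. */
static size_t probe_header(char *buf, size_t size, size_t nitems, void *userp)
{
  size_t len = size * nitems;
  if(len >= 17 && !strncasecmp(buf, "Content-Encoding:", 17))
    *(bool *)userp = true;
  return len;
}

/* Hypothetical helper (not part of curl): returns true if a HEAD probe of
   the URL came back with a Content-Encoding header. The caller would only
   start the actual download when this returns true. */
static bool server_offers_compression(const char *url)
{
  bool compressed = false;
  CURL *curl = curl_easy_init();
  if(!curl)
    return false;
  curl_easy_setopt(curl, CURLOPT_URL, url);
  curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);          /* HEAD request */
  curl_easy_setopt(curl, CURLOPT_ACCEPT_ENCODING, ""); /* like --compressed */
  curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, probe_header);
  curl_easy_setopt(curl, CURLOPT_HEADERDATA, &compressed);
  curl_easy_perform(curl);
  curl_easy_cleanup(curl);
  return compressed;
}

On the command line, the rough equivalent would be a scripted curl -I / --head probe followed by a check for a Content-Encoding header before launching the real transfer.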
That’s up to the server to comply with; as the client you can only ask for compression.
But my bug stands! There is no way to tell curl "bomb out if the server does not agree to compress the file when sending over the wire."
That’s not a bug, that’s how HTTP compression and headers work. If you would like to be able to do this then that would be a new feature, and you’re welcome to supply a PR for that for us to consider.
I.e., with curl, and wget, you better have bigger disks ready, because compression is just seen as a luxury that is OK to skip if not available today.
Again, HTTP compression doesn’t affect the size of what you end up storing on disk, as you’ve already said yourself.
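That said, an application built on libcurl can already approximate the requested "bomb out" behaviour in a single request: ask for compression and abort from the header callback once the header block ends without any Content-Encoding. The sketch below is illustrative (placeholder URL, simplified header handling); returning 0 from the header callback makes libcurl abort the transfer, which surfaces as CURLE_WRITE_ERROR.

#include <stdio.h>
#include <string.h>
#include <strings.h>
#include <curl/curl.h>

struct enc_state {
  int saw_content_encoding;
};

/* Called once per header line. When the blank line that ends the header
   block arrives without a Content-Encoding having been seen, return 0,
   which makes libcurl abort the transfer. For simplicity this ignores
   1xx responses and redirects, which produce extra header blocks. */
static size_t check_encoding(char *buf, size_t size, size_t nitems, void *userp)
{
  struct enc_state *st = userp;
  size_t len = size * nitems;

  if(len >= 17 && !strncasecmp(buf, "Content-Encoding:", 17))
    st->saw_content_encoding = 1;
  else if((len == 2 && !memcmp(buf, "\r\n", 2)) ||
          (len == 1 && buf[0] == '\n')) {
    if(!st->saw_content_encoding)
      return 0; /* abort: server is going to send the body uncompressed */
  }
  return len;
}

int main(void)
{
  struct enc_state st = {0};
  CURL *curl = curl_easy_init();
  CURLcode rc;

  if(!curl)
    return 1;
  /* Placeholder URL; CURLOPT_ACCEPT_ENCODING "" is the libcurl
     counterpart of --compressed. The body goes to stdout by default. */
  curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/big.json");
  curl_easy_setopt(curl, CURLOPT_ACCEPT_ENCODING, "");
  curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, check_encoding);
  curl_easy_setopt(curl, CURLOPT_HEADERDATA, &st);

  rc = curl_easy_perform(curl);
  if(rc == CURLE_WRITE_ERROR && !st.saw_content_encoding)
    fprintf(stderr, "aborted: server did not compress the response\n");
  curl_easy_cleanup(curl);
  return rc != CURLE_OK;
}

A built-in option for this would essentially have to move the same check inside curl itself, which is what the patch sketch further down gestures at.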
If you would like to be able to do this then that would be a new feature, and you’re welcome to supply a PR for that for us to consider.
OK, I hereby submit a FR (Feature Request), not a PR.
Noted. However, just to not get your hopes up, we don’t have a concept of feature requests. We’re all volunteers and everyone works on whatever they want.
I don't think we should add such a feature. This seems like something so obscure it's not going to be of use to anyone else, and we'd take on a maintenance cost (as we do with anything like this). The reporter could, in their own fork, modify Curl_build_unencoding_stack to flag whether an encoding the server sent will be decompressed, like this:

diff --git a/lib/content_encoding.c b/lib/content_encoding.c
index a84ff54..52f090d 100644
--- a/lib/content_encoding.c
+++ b/lib/content_encoding.c
@@ -1065,6 +1065,9 @@ CURLcode Curl_build_unencoding_stack(struct Curl_easy *data,
   if(!encoding)
     encoding = &error_encoding; /* Defer error at stack use. */
 
+  if(encoding != &identity_encoding && encoding != &error_encoding)
+    k->writer_will_decompress = true;
+
   /* Stack the unencoding stage. */
   writer = new_unencoding_writer(data, encoding, k->writer_stack);
   if(!writer)
diff --git a/lib/urldata.h b/lib/urldata.h
index 1d99112..2195003 100644
--- a/lib/urldata.h
+++ b/lib/urldata.h
@@ -668,6 +668,8 @@ struct SingleRequest {
   /* Content unencoding stack. See sec 3.5, RFC2616. */
   struct contenc_writer *writer_stack;
 
+  bool writer_will_decompress;
+
   time_t timeofdoc;
   long bodywrites;
   char *location; /* This points to an allocated version of the Location:

.... and then what? I don't know. Maybe error in Curl_client_write if !data->req.writer_will_decompress and data->set.str[STRING_ENCODING] was set to "" or parse the list and check if it was set to request compression. Seems incomplete.
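To make the "and then what?" concrete, here is a purely hypothetical continuation of that sketch, not actual curl code: after the unencoding stack has been built, fail the transfer when an Accept-Encoding was requested (STRING_ENCODING set, e.g. to "" by --compressed) but nothing on the stack will decompress. The placement, the use of failf(), and the choice of CURLE_BAD_CONTENT_ENCODING are all assumptions.

/* Hypothetical follow-up to the diff above (not actual curl code).
   Somewhere after Curl_build_unencoding_stack() has finished, an internal
   check along these lines could refuse to continue when compression was
   asked for but the server sent nothing that will be decompressed. */
if(data->set.str[STRING_ENCODING] &&     /* --compressed / CURLOPT_ACCEPT_ENCODING used */
   !data->req.writer_will_decompress) {  /* flag introduced by the diff above */
  failf(data, "compression was requested but the server sent no compressed encoding");
  return CURLE_BAD_CONTENT_ENCODING;     /* error code choice is an assumption */
}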
Maybe the blame lies on the HTTP designers / RFC authors. It is as if I said Grandma can only eat cooked meat, yet I can only require "meat" and advise "cooked", and have no defense if it arrives uncooked.
The original issue report:
The man page says that if this option (--compressed) is used and the server sends an unsupported encoding, curl will report an error.
Well, I did use --compressed, and it got ignored, but no error was reported.
So maybe the man page should spell this out, like it does elsewhere; i.e., add: "But no encoding is not an error."
I.e., better have a bigger disk ready, as curl gives you no way of stopping uncompressed downloads.
curl 7.74.0 (x86_64-pc-linux-gnu)