Curl does not limit output data size #11810
The Unix way is to pipe the output of curl into head or dd or some other
program to limit the maximum amount of data if that is an issue in your
application. --max-filesize will prevent a too-long download from even
starting, which is an optimization that a pipe can't (easily) do, but it's easy
to limit the data transferred without adding the 258th command-line option to
curl.
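The pipe approach described above can be sketched as follows (the URL, output file, and 64 KiB limit are arbitrary examples):

```shell
# Stop reading after 64 KiB: once head exits, curl's next write to the
# pipe fails and curl terminates with exit code 23 (write error).
curl -s https://example.com | head -c 65536 > out.bin

# dd variant: copy at most 64 blocks of 1 KiB each.
curl -s https://example.com | dd bs=1024 count=64 of=out.bin 2>/dev/null
```

Either way the downstream reader, not curl, enforces the cap, which is why this works without any new curl option.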
For Unix systems, I've tried piping into head. Moreover, I think this security feature should be doable in curl without requiring a shell. The above doesn't cover Windows, of course.
Maybe we should make --max-filesize enforce the limit while the transfer is in progress?
diff --git a/src/tool_cb_wrt.c b/src/tool_cb_wrt.c
index 2f8c6ac..e662a35 100644
--- a/src/tool_cb_wrt.c
+++ b/src/tool_cb_wrt.c
@@ -231,6 +231,10 @@ size_t tool_write_cb(char *buffer, size_t sz, size_t nmemb, void *userdata)
     }
   }

+  if(config->max_filesize && (outs->bytes + bytes > config->max_filesize)) {
+    return CURLE_FILESIZE_EXCEEDED;
+  }
+
 #ifdef WIN32
   fhnd = _get_osfhandle(fileno(outs->stream));
   /* if windows console then UTF-8 must be converted to UTF-16 */

It's documented to do nothing if the file size is not known, but I think there's flexibility to remove that because of the way that option is used.
I think so too. Will you make a PR out of this and we can give it a go?
- Return CURLE_FILESIZE_EXCEEDED when the received data exceeds the maximum file size. This is to handle those cases where the file size is not known in advance (e.g. chunked encoding) but the bytes received exceed the max file size. Prior to this change --max-filesize had no effect when the file size was not provided by the server.
Reported-by: Elliot Killick
Fixes curl#11810
Closes #xxxx
Previously it would only stop transfers from getting started if the size was known to be too big. Update the libcurl and curl docs accordingly.
Fixes #11810
Reported-by: Elliot Killick
Thank you curl team for the fast response and turnaround on this issue! I find the description of this issue coincides with CWE-400. Do you think it's worth advising users with a minor CVE? Due to curl's popularity, there are probably cases (e.g. on an embedded device with few resources and no GNU coreutils/Bash) where an automated script could run into this weakness with no easily portable way to fix it on their side until now. Thanks so much again.
It's working as documented, so I wouldn't consider this a security problem.
There are ways to avoid filling up a storage partition without this feature, so
if this could have been an issue people could have used other means.
Previously it would only stop transfers from getting started if the size was known to be too big. Update the libcurl and curl docs accordingly.
Fixes curl#11810
Reported-by: Elliot Killick
Assisted-by: Jay Satiro
Closes curl#11820
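With the change described in the commit above, --max-filesize aborts a transfer once the received data exceeds the limit, even when the server sends no Content-Length. A minimal sketch of how a script could use it (the URL and 1 MiB limit are placeholder assumptions; 63 is curl's documented CURLE_FILESIZE_EXCEEDED exit code):

```shell
# Abort once more than 1 MiB has been received, even for chunked
# responses that carry no Content-Length header.
curl --max-filesize 1048576 -o out.bin https://example.com/big
case $? in
  0)  echo "download complete" ;;
  63) echo "aborted: file size limit exceeded" ;;  # CURLE_FILESIZE_EXCEEDED
  *)  echo "other curl error" ;;
esac
```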
Would you consider it a security problem in the application then, due to it using curl incorrectly? Sure, there are always going to be other means in the form of some hack. However, the fact is that most people aren't going to do that and as a result will write insecure applications.
There may be security problems in applications using curl, no question. curl
gives people 1001 ways to shoot themselves in the foot if they so desire.
That's why we have a libcurl-security document to try to help application
authors to avoid doing so. This exact issue is mentioned there, in fact (see
the Denial of Service section).
Given that curl is working exactly as advertised in this case, and it's
advertising just what we expect it to, I don't see what we'd be raising a
security vulnerability against.
From the libcurl-security document:
This document only provides a solution for libcurl, not the curl tool. My point is that there was an unreasonably large expectation on developers using the curl tool.
Several curl team members have repeated the same message many times: this is curl working as advertised. If this causes a security problem, then that problem is in the user's backyard and you should file a CVE for that product/service. It is not a security problem for curl itself. Filing a CVE for curl working as intended and documented is not helpful to anyone.
I believe there's been a misunderstanding. Furthermore, there is no robust (widely implemented POSIX) solution to the problem should a developer want to include this security control in their script. It's like a known vulnerability with no patch or mitigation available for all platforms. Prior to this issue, if I were to raise this problem for another product/service as you recommended, then I may not have had any patch that I could give the dev team to fix this issue.
You can continue to repeat your stance, and we can continue to repeat ours. It's not very productive though.
- Patch DoS vulnerability where Microsoft servers could send us data forever until OOM or disk space fills up. This issue could not feasibly be patched until now: curl/curl#11810
- Improve handle_curl_error
- Update win11x64 checksum
- Organize assets into their own folder
- Add attribution and copyright in download functions
- Update copyright year
I did this
Consider the following:

test="$(curl https://example.com)"

example.com could cause curl to download a very large amount of data, which could lead to an out-of-memory scenario when put into the shell variable test. A similar DoS could happen when writing to disk, by filling up the disk space in its entirety.
Both of these could completely lock up a system and cause other applications to abort (or crash if they're written poorly). For memory exhaustion, a system with an OOM reaper may also eventually take action.

I expected the following
Considering this is such a basic security issue, curl should have a built-in option to limit its output size, whether written to stdout or via -o/--output. Once the hard limit for output size has been reached, curl should stop reading from the network and terminate with curl error 23, or, probably better, a new error code to indicate this exact error. I propose one of --max-size (similar to --max-time), --max-output, or --max-out-size (similar to --max-filesize, which doesn't serve as a hard limit) to stay aligned with option names already existing in curl.

curl/libcurl version
curl 8.0.1

operating system
Fedora Linux 38
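For the shell-variable scenario above, the capture can be capped with a pipe even without a dedicated curl option; a sketch, where the URL and 1 MiB limit are arbitrary assumptions:

```shell
# Cap the data captured into the variable at 1 MiB. Note that head
# exiting early makes curl see a broken pipe (exit code 23), so check
# the captured size rather than relying on curl's exit status alone.
data="$(curl -s https://example.com | head -c 1048576)"
printf '%s' "$data" | wc -c   # never exceeds 1048576
```

This bounds memory use of the variable itself, though the maintainers' point stands that it requires a shell and is less portable than a built-in limit.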