
Stack exhaustion parsing a JSON file #1136

Closed
gaa-cifasis opened this issue Apr 24, 2016 · 12 comments

gaa-cifasis commented Apr 24, 2016

Hi,

A crash caused by stack exhaustion while parsing a JSON file was found. It affects at least version 1.5, as well as the latest git revision. To reproduce:

$ gdb -tty=/dev/null --args jq . qcufnzxcnp.json.4167733746247029131

...

Program received signal SIGSEGV, Segmentation fault. 
0x00007ffff47fa7c2 in _IO_new_file_overflow (f=0x7ffff4b3f400 <_IO_2_1_stdout_>, ch=-1) at fileops.c:824
824 fileops.c: No such file or directory.
(gdb) bt 10
#0  0x00007ffff47fa7c2 in _IO_new_file_overflow (f=0x7ffff4b3f400 <_IO_2_1_stdout_>, ch=-1) at fileops.c:824
#1  0x00007ffff47f96a1 in _IO_new_file_xsputn (f=0x7ffff4b3f400 <_IO_2_1_stdout_>, data=<optimized out>, n=1) at fileops.c:1332
#2  0x00007ffff47eee6d in __GI__IO_fwrite (buf=<optimized out>, size=1, count=1, fp=0x7ffff4b3f400 <_IO_2_1_stdout_>) at iofwrite.c:43
#3  0x0000000000428943 in put_buf (s=0x7fffff7ff0cc " \177", len=1, fout=0x7ffff4b3f400 <_IO_2_1_stdout_>, strout=0x0, is_tty=0) at src/jv_print.c:41
#4  0x000000000042897c in put_char (c=32 ' ', fout=0x7ffff4b3f400 <_IO_2_1_stdout_>, strout=0x0, T=0) at src/jv_print.c:47
#5  0x0000000000428ab1 in put_indent (n=29145, flags=513, fout=0x7ffff4b3f400 <_IO_2_1_stdout_>, strout=0x0, T=0) at src/jv_print.c:61
#6  0x000000000042983e in jv_dump_term (C=0x7fffffffdf60, x=..., flags=513, indent=16374, F=0x7ffff4b3f400 <_IO_2_1_stdout_>, S=0x0)
    at src/jv_print.c:204
#7  0x0000000000429977 in jv_dump_term (C=0x7fffffffdf60, x=..., flags=513, indent=16373, F=0x7ffff4b3f400 <_IO_2_1_stdout_>, S=0x0)
    at src/jv_print.c:215
#8  0x0000000000429977 in jv_dump_term (C=0x7fffffffdf60, x=..., flags=513, indent=16372, F=0x7ffff4b3f400 <_IO_2_1_stdout_>, S=0x0)
    at src/jv_print.c:215
#9  0x0000000000429977 in jv_dump_term (C=0x7fffffffdf60, x=..., flags=513, indent=16371, F=0x7ffff4b3f400 <_IO_2_1_stdout_>, S=0x0)
    at src/jv_print.c:215
(More stack frames follow...)

Attached here is a JSON file to reproduce it.

Regards,
Gustavo.

@pkoppstein (Contributor)

Actually, this does not appear to be a parsing error, which I think is good news:

$ jq length qcufnzxcnp.json
31


lhh commented May 5, 2016

This is simply running out of stack space during jv_dump_term(). I was able to raise the number of recursions by about 11,000 by dynamically allocating variables instead of using stack space, but at the end of the day you can work around this (for this test case) with:

ulimit -s 32768

Of course, a much larger (deeper) JSON file will still make it crash. Conversely, you can make it happen far sooner with:

ulimit -s 256

The crash does not seem to happen due to any buffer overflow or memory corruption; it's simply the kernel saying "you're out of stack space, please die". One simple way to avoid the crash itself is to depth-check the tree, compare the depth against some value derived from the stack size returned by getrlimit(), and produce an error if the document is too deep for the available stack.

A more complicated solution would involve removing recursion from jv_dump_term(), which looks like it would not be a trivial fix.


dolmen commented May 18, 2016

This issue is referenced as CVE-2016-4074.

wmark added a commit to wmark/jq that referenced this issue Aug 19, 2016
This addresses stedolan#1136, and mitigates a stack exhaustion when printing
a very deeply nested term.
nicowilliams pushed a commit that referenced this issue Jan 27, 2017
This addresses #1136, and mitigates a stack exhaustion when printing
a very deeply nested term.
davidfetter pushed a commit to davidfetter/jq that referenced this issue Oct 27, 2017
This addresses stedolan#1136, and mitigates a stack exhaustion when printing
a very deeply nested term.
@zelivans

If I understand correctly this was fixed by 83e2cf6. Suggest closing.

@nicowilliams (Collaborator)

Good point. Closed.

@taneishamitchell

By all indications this problem is not solved. It still appears on the Common Vulnerabilities and Exposures (CVE) list.

@pc-mrousseau

@taneishamitchell - how do you figure it's not solved? I just tested this with the JSON provided above and it didn't crash.

bash-5.0# jq . qcufnzxcnp.json.4167733746247029131
parse error: Exceeds depth limit for parsing at line 7, column 257
bash-5.0# jq --version
jq-master-v20200428-28-g864c859e9d


emcay commented Feb 16, 2021

Assuming that this IS actually fixed, it is still being flagged by some tools.

Clair (and also possibly dependent scanning tools):

quay/clair#852


thiduzz commented Aug 12, 2021

Still flagged by AWS ECR Vulnerabilities scanner...

@DrStrangepork

Confirmed, still getting flagged by AWS ECR

aws ecr describe-image-scan-findings ...
{
    "imageScanFindings": {
        "findings": [
            {
                "name": "CVE-2016-4074",
                "uri": "https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-4074",
                "severity": "HIGH",
                "attributes": [
                    {
                        "key": "package_version",
                        "value": "1.6-r1"
                    },
                    {
                        "key": "package_name",
                        "value": "jq"
                    },
                    {
                        "key": "CVSS2_VECTOR",
                        "value": "AV:N/AC:L/Au:N/C:N/I:N/A:C"
                    },
                    {
                        "key": "CVSS2_SCORE",
                        "value": "7.8"
                    }
                ]
            },

@ioricloud

I am using this version and still getting flagged in my ECR. Is this a false positive?


anden-dev commented Jun 17, 2022

I'll just put this here for others...

Subject: Amazon ECR Update to Address False Positive Findings of CVE-2020-28928

We have identified an issue with Amazon ECR basic scanning when performing scans on Alpine 3.13 through Alpine 3.16 based container images. ECR basic scanning incorrectly discovers and reports a LOW severity scan finding called CVE-2020-28928 [1], even though Amazon ECR enhanced scanning with Amazon Inspector is not affected by the issue. This false positive scan finding occurs because the open-source Clair scan engine utilized by ECR for basic scanning incorrectly parses the package versions.

We have modified ECR's basic scanning engine so that it will now interpret package versions accurately to suppress incorrect reporting of the CVE-2020-28928 vulnerability. This change will be effective starting June 22, 2022. You do not need to take any action.

False positive. Should be gone on June 22nd. Enjoy explaining that to your Security guys. ;-)

UPDATE: Sorry I have got the wrong CVE.
But the case is similar. It is a false positive.
