SIGSEGV in opj_j2k_update_image_data via pdfium_test #481

Closed
gcode-importer opened this issue Mar 15, 2015 · 9 comments
@gcode-importer

Originally reported on Google Code with ID 481

What steps will reproduce the problem?
This issue was originally raised in Chromium (https://code.google.com/p/chromium); the original
bug number is 453553.

What is the expected output? What do you see instead?
No crashes are expected. Instead, ASan reports a SEGV on an unknown address.

What version of the product are you using? On what operating system?
Reproduced on Ubuntu Linux, but not on Windows.

Please provide any additional information below.

In the function opj_j2k_update_image_data, the following code does not check whether
the product l_img_comp_dest->w * l_img_comp_dest->h overflows.

l_img_comp_dest->data = (OPJ_INT32*) opj_calloc(l_img_comp_dest->w * l_img_comp_dest->h,
                                                sizeof(OPJ_INT32));

There is no problem on Windows because opj_malloc has a different definition, shown
below: under _MSC_VER, oversized allocation requests are rejected.

#ifdef _MSC_VER
#define opj_malloc(size) ((size_t)(size) >= (size_t)-0x100 ? NULL : malloc(size))
#else
#define opj_malloc(size) malloc(size)
#endif
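
For illustration only, a minimal sketch of the kind of guard that avoids the overflow (the helper name is hypothetical and this is not the fix that was actually committed):

/* Hypothetical sketch, not the committed fix: compute the element count
   in 64 bits so the product cannot wrap, and refuse the allocation when
   it would not fit in size_t. */
#include <stdint.h>
#include <stdlib.h>

static int32_t *alloc_component_data(uint32_t w, uint32_t h)
{
    uint64_t n = (uint64_t)w * (uint64_t)h;        /* exact product, no wrap */
    if (n == 0 || n > SIZE_MAX / sizeof(int32_t))
        return NULL;                               /* reject oversized request */
    return (int32_t *)calloc((size_t)n, sizeof(int32_t));
}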



Reported by jun_fang@foxitsoftware.com on 2015-03-15 18:02:39

@gcode-importer

Could you provide a jp2 file to reproduce the bug? (We do not currently have access to
the indicated Chromium bug.)

Reported by detonin on 2015-03-25 15:53:54

  • Labels added: Priority-Critical, Restrict-View-CoreTeam
  • Labels removed: Priority-Medium

@gcode-importer

I extracted the JPX stream from a PDF file to generate the attached jp2 file.

Reported by jun_fang@foxitsoftware.com on 2015-04-20 23:42:24


- _Attachment: [repro.jp2](https://storage.googleapis.com/google-code-attachments/openjpeg/issue-481/comment-2/repro.jp2)_

@gcode-importer

I will push a fix shortly for this one.

However, I won't close the issue right now.
Computations for offsets will be done without overflow (so ASan shouldn't complain).
The image will still be decoded (33 GB in memory in my "task manager"), which might be unwanted.
FYI, Kakadu has the same behavior.

Reported by mayeut on 2015-06-03 20:53:57

  • Status changed: Started

@gcode-importer

This issue was updated by revision r3007.

Reported by mayeut on 2015-06-03 20:56:18

@mayeut

mayeut commented Jul 27, 2015

Should something be done to prevent the decoding of 33 GB images?
libpng has default user limits for width/height (which can be overridden if necessary).

@malaterre

I was not able to retrieve the attachment (leads to 404 over here), so my comments may be inaccurate. However:

  1. We should not care about PNG restrictions; that is, IMHO, outside the scope of this issue.
  2. There is a way in Kakadu to limit memory consumption (-flush_buffer), so the comment above is somewhat exaggerated.

@mayeut

mayeut commented Jul 28, 2015

@malaterre,

The attachment was not transferred because the issue was marked Restrict-View-CoreTeam.
I don't see the issue list on Google Code anymore. However, browsing to source -> changes -> r3007, I can click on the issue referenced in the commit; you can retrieve the image there.

I think I wasn't clear enough & this issue should probably be closed.
However, it raises another question which might need its own issue:

Should OpenJPEG provide (optional) mechanisms to prevent the decoding of "huge" images?

In the "normal" world, huge images are probably just corrupted images as this one where:
<xsiz>1049816</xsiz> is 1bit different from <width>1240</width>
This image gets decoded as a 10GB raw file.

  1. libpng was just mentioned as a library implementing such mechanisms (maybe not for the same reason). I am not saying that OpenJPEG shall enforce those limits.
  2. OK, I didn't know about that option. With which tool? I downloaded the latest free tools (kdu_expand v7.6) and I don't see this option.

@malaterre

@mayeut I cannot find the flush_buffer option on kdu_expand; it is present only in kdu_compress. Since kdu_expand has a lot more options to control decompression (#quality, region size, ...), I guess this is not available directly. In any case kdu_expand should flush intermediate buffers (by design of the library), so memory consumption should not go up to 30 GB as mentioned above (but again, as you explained, this is a different issue).

I can't say much about the corrupted image. OpenJPEG, as a library, will do what it's told. So at the application level (PDF rendering) someone should perhaps state: "hey, xsize=1049816 is far too big for me to render".
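
For illustration only, a rough sketch of such an application-level guard, using the public opj_image_t fields filled in by opj_read_header(); the pixel budget and helper name are made up for this example and are not part of OpenJPEG:

/* Hypothetical application-side guard, not part of OpenJPEG: after
   opj_read_header() has filled in the image grid, skip opj_decode()
   when the pixel count exceeds a caller-chosen budget. */
#include <stdint.h>
#include <openjpeg.h>   /* opj_image_t; the include path may differ per install */

#define MAX_PIXELS (64ULL * 1024 * 1024)   /* example budget: 64 Mpixel */

static int image_within_budget(const opj_image_t *image)
{
    uint64_t w = (uint64_t)(image->x1 - image->x0);
    uint64_t h = (uint64_t)(image->y1 - image->y0);
    return w != 0 && h != 0 && (w * h) <= MAX_PIXELS;
}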

2cts

@mayeut

mayeut commented Jul 29, 2015

@malaterre,

OK, that's a valid answer to the question asked: "No, the integrator shall check this on its own."
Closing the issue, since there's an answer to the question and no more trouble when decoding the image.

@mayeut mayeut closed this as completed Jul 29, 2015
@mayeut mayeut added this to the OPJ v2.1.1 milestone Aug 27, 2015