Technically, it's not an infinite loop. The actual image is 38655 pixels wide and 16384 pixels high. Your program will eventually spit out an error if you wait long enough; it only took my computer a minute or so.
There's still a bug here, in that we don't seem to notice that we're not making progress during the Huffman decode, but I don't think it's a hang. More investigation coming...
As for whether we should refuse to decode 'unreasonably large' images, issue #5050 is the place to discuss what 'unreasonable' means and how it would be expressed in the API.
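For illustration only (not the API being discussed in #5050), here is a minimal sketch of a guard a caller can apply today: read just the declared dimensions with image.DecodeConfig, reject anything over an arbitrary example budget, and only then run the full decode.

```go
package main

import (
	"bytes"
	"fmt"
	"image"
	_ "image/jpeg" // register the JPEG decoder with the image package
	"io"
	"os"
)

// maxPixels is an arbitrary example budget (~50 megapixels), not a value
// taken from this issue or from the standard library.
const maxPixels = 50 << 20

// decodeBounded reads the whole input, checks the dimensions declared in the
// header via image.DecodeConfig, and only runs the full decode if they fit
// within the budget.
func decodeBounded(r io.Reader) (image.Image, error) {
	data, err := io.ReadAll(r)
	if err != nil {
		return nil, err
	}
	cfg, _, err := image.DecodeConfig(bytes.NewReader(data))
	if err != nil {
		return nil, err
	}
	if cfg.Width <= 0 || cfg.Height <= 0 || cfg.Width*cfg.Height > maxPixels {
		return nil, fmt.Errorf("refusing to decode a %dx%d image", cfg.Width, cfg.Height)
	}
	img, _, err := image.Decode(bytes.NewReader(data))
	return img, err
}

func main() {
	f, err := os.Open(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()
	img, err := decodeBounded(f)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("decoded:", img.Bounds())
}
```

With a guard like this, the 38655x16384 image from this report would be rejected before any pixel data is decoded.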
The input is just a few bytes. Is it OK for such a small input to take so much time and memory? It looks like a zip bomb, i.e. I can force your server to spend an arbitrary amount of CPU by sending just a few bytes. Or is that acceptable for this format because of compression (e.g. encoding a huge black image)?
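One stopgap a server could apply in the meantime (a minimal sketch, not a fix for the decoder itself; decodeWithDeadline is a hypothetical helper, not a library function): give the decode a time budget and stop waiting once it is exceeded. The decode goroutine keeps running until it returns, so this bounds the caller's latency rather than the CPU or memory actually spent.

```go
package main

import (
	"errors"
	"fmt"
	"image"
	"image/jpeg"
	"io"
	"log"
	"os"
	"time"
)

// decodeWithDeadline runs jpeg.Decode in a goroutine and stops waiting for it
// after the given budget. The goroutine itself keeps running until the decode
// returns, so this only bounds how long the caller blocks.
func decodeWithDeadline(r io.Reader, budget time.Duration) (image.Image, error) {
	type result struct {
		img image.Image
		err error
	}
	ch := make(chan result, 1) // buffered so a late send never blocks
	go func() {
		img, err := jpeg.Decode(r)
		ch <- result{img, err}
	}()
	select {
	case res := <-ch:
		return res.img, res.err
	case <-time.After(budget):
		return nil, errors.New("jpeg decode exceeded time budget")
	}
}

func main() {
	f, err := os.Open(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	img, err := decodeWithDeadline(f, 5*time.Second)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("decoded:", img.Bounds())
}
```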
As I said, failing to notice that we're not making progress during the Huffman decode is still a bug, and I'm still investigating. All I'm saying is that it's not an infinite loop bug.
Run the following program on the following input:
https://drive.google.com/file/d/0B20Uwp8Hs1oCTzlVWFBmZ2lHS0k/view?usp=sharing
The program hangs (I waited for several minutes, even though the file is only 502 bytes).
Abort stack:
Perf profile:
The code seems to be looping in processSOS.
My repository is at commit 8ac129e.
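The program itself isn't shown above; the following is a minimal sketch of the kind of reproducer described, assuming it simply decodes the linked file with the standard library's image/jpeg package (the file name here is a placeholder).

```go
package main

import (
	"fmt"
	"image/jpeg"
	"log"
	"os"
)

func main() {
	// "input.jpg" is a placeholder name for the 502-byte file linked above.
	f, err := os.Open("input.jpg")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	// According to the report, this call loops inside processSOS instead of
	// returning promptly.
	img, err := jpeg.Decode(f)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("bounds:", img.Bounds())
}
```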