
Maximum chunk size assertion in videoio's AVI container implementation appears to be incorrect. #11126

Closed
anthonytw opened this issue Mar 20, 2018 · 8 comments · Fixed by #11146

Comments

@anthonytw
Contributor

anthonytw commented Mar 20, 2018

System information (version)
  • OpenCV => 3.4.1
  • Operating System / Platform => Ubuntu Linux 17.10 64-bit
  • Compiler => clang-5.0
Detailed description

The AVI container implementation in the videoio module appears, to me, to have a bug. In file modules/videoio/src/container_avi.cpp the following function is defined:

std::vector<char> AVIReadContainer::readFrame(frame_iterator it)
{
    m_file_stream->seekg(it->first);

    RiffChunk chunk;
    *(m_file_stream) >> chunk;
    CV_Assert(chunk.m_size <= 0xFFFF);   // <--- ATW: Bug?

    std::vector<char> result;

    result.reserve(chunk.m_size);
    result.resize(chunk.m_size);

    m_file_stream->read(&(result[0]), chunk.m_size); // result.data() failed with MSVS2008

    return result;
}

Notice the assertion: CV_Assert(chunk.m_size <= 0xFFFF); (i.e., chunks are limited to 64 KB). I ran into this when I was unable to open an MJPG video with the VideoCapture class after writing it with the VideoWriter class. The video itself appears to be perfectly fine, and when I remove this assertion the software works as expected.

Why is this assertion present? Has this chunk-size limitation been increased or removed in newer implementations? It seems the assertion either needs a larger maximum chunk size or should be removed altogether, but I'm not sure which. Alternatively, if this is a real limit, there may be a bug in the code that writes AVI videos, since it produces chunk sizes that trigger this assertion when the videos are read back in.

Steps to reproduce

No code. You need to get (un)lucky and generate a video that includes large chunks. I can provide a video if helpful, but this seems like more of an API / specification technicality.

@mshabunin
Contributor

@anthonytw, this assertion was added to satisfy static analysis tools, which complain about values that are read from a file and then used to allocate memory. So the limit is not real; we can probably omit the assertion. Feel free to open a pull request with a fix.

@alalek
Member

alalek commented Mar 21, 2018

Just omitting the assertion would allow a memory-exhaustion DoS (vulnerability), so a reasonable limit of 64 MB (or one configurable at runtime) should be added instead. The current 64 KB is too small.

@anthonytw
Contributor Author

Okay, I opened a pull request that sets the upper threshold to 64 MB.

@weiqiruan

I also had this problem when OpenCV was updated from 3.0.0 to 3.4.1. The running program threw an exception even though the input AVI video was smaller than 64 MB. From the VS output log I traced the failure to line 514 of opencv\sources\modules\videoio\src\container_avi.cpp: CV_Assert(chunk.m_size <= 0xFFFF);. Unfortunately, a 64.5 MB AVI file still failed... ^_^

@jonas-kolker

How do you get to the file containing this assertion?

@404hasbeenfound

I have this problem too. Should I just remove this assertion?

@labsrob

labsrob commented Aug 7, 2019

I don't think this bug was ever fixed, even in the recent OpenCV 3.4.2.
Testing with AVI files, one of them returned this error: Error: Assertion failed (chunk.m_size <= 0xFFFF) in cv::AVIReadContainer::readFrame. If you have an idea how to fix this, please post here. Many thanks.

@alalek
Member

alalek commented Aug 7, 2019

Check the output of cv::getBuildInformation() for the version you are actually running.
