Hi,
I have some OpenMAX code that tunnels video_render and egl_render directly (without a scheduler or anything else between the two). video_render successfully sets the output format, egl_render is successfully tunneled to the video_render output port, EGLImages are passed to the egl_render output port (via OMX_FillThisBuffer()), and both components are in the Executing state. No errors are reported.
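For reference, this is roughly the sequence my code follows. It is only a sketch: error handling and the waits for OMX_EventCmdComplete are omitted, the handle and function names are mine, and 220/221 are the usual Broadcom port numbers for egl_render.

```c
#include <IL/OMX_Core.h>
#include <IL/OMX_Component.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

/* Sketch of my setup; error handling and the waits for
 * OMX_EventCmdComplete are omitted. hUpstream is the component whose
 * output port feeds egl_render; 220/221 are egl_render's input/output
 * ports on the Pi. */
static OMX_BUFFERHEADERTYPE *eglBufferHeader;

static void setup_pipeline(OMX_HANDLETYPE hUpstream, OMX_U32 upstreamOutPort,
                           OMX_HANDLETYPE hEglRender, EGLImageKHR eglImage)
{
    /* Tunnel the upstream output port into egl_render's input port. */
    OMX_SetupTunnel(hUpstream, upstreamOutPort, hEglRender, 220);

    OMX_SendCommand(hEglRender, OMX_CommandPortEnable, 220, NULL);
    OMX_SendCommand(hEglRender, OMX_CommandPortEnable, 221, NULL);

    /* Hand the EGLImage to egl_render's output port. */
    OMX_UseEGLImage(hEglRender, &eglBufferHeader, 221, NULL, eglImage);

    /* Move both components to Executing... */
    OMX_SendCommand(hUpstream, OMX_CommandStateSet, OMX_StateExecuting, NULL);
    OMX_SendCommand(hEglRender, OMX_CommandStateSet, OMX_StateExecuting, NULL);

    /* ...and ask egl_render to render the next frame into the EGLImage. */
    OMX_FillThisBuffer(hEglRender, eglBufferHeader);
}
```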
The decoder also continues to consume buffers, and "vcdbg log msg" shows that the decoder generates lots of output frames and that egl_render receives them:
```
758573.609: egl_renderRIL:display image(f47afe8) state 4
758573.624: egl_renderRIL:display image, dropping input image(f47af68)
```
However, egl_render never calls FillBufferDone.
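My FillBufferDone callback is registered in the standard way via OMX_GetHandle() and would simply re-queue the buffer, which is the same pattern hello_videocube uses. A minimal sketch (the stub handlers stand in for my real ones):

```c
#include <IL/OMX_Core.h>

/* Stubs standing in for my real EventHandler/EmptyBufferDone. */
static OMX_ERRORTYPE event_handler(OMX_HANDLETYPE h, OMX_PTR app,
                                   OMX_EVENTTYPE ev, OMX_U32 d1, OMX_U32 d2,
                                   OMX_PTR evdata)
{
    return OMX_ErrorNone;
}

static OMX_ERRORTYPE empty_buffer_done(OMX_HANDLETYPE h, OMX_PTR app,
                                       OMX_BUFFERHEADERTYPE *buf)
{
    return OMX_ErrorNone;
}

/* The callback that never fires for me. hello_videocube's equivalent
 * simply hands the same EGLImage buffer straight back to egl_render. */
static OMX_ERRORTYPE fill_buffer_done(OMX_HANDLETYPE hComponent, OMX_PTR app,
                                      OMX_BUFFERHEADERTYPE *pBuffer)
{
    return OMX_FillThisBuffer(hComponent, pBuffer);
}

static OMX_CALLBACKTYPE callbacks = {
    .EventHandler    = event_handler,
    .EmptyBufferDone = empty_buffer_done,
    .FillBufferDone  = fill_buffer_done,
};

/* Registered when the component is created:
 *   OMX_GetHandle(&hEglRender, "OMX.broadcom.egl_render",
 *                 appData, &callbacks);
 */
```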
hello_videocube works on my device with the same movie, and the main difference I see in the logs is that at some point, right before the first frame is shown, OMX_IndexConfigBrcmEGLImageMemHandle is set on the egl_render output port. This never happens with my code, yet it is also not set explicitly anywhere in hello_videocube's source.
Where is this set from, and what is necessary to make it happen?
I see that in hello_videocube the texture from which the EGLImage is created is always bound to a texture unit and is drawn continuously, as fast as possible. This does not happen in my code and cannot, for various reasons (though the texture is of course still allocated). See the sketch below for what I mean.
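This is roughly the GL-side pattern in hello_videocube, simplified from the hello_pi source (display/context/surface come from the usual EGL setup; the draw call itself is elided):

```c
#define EGL_EGLEXT_PROTOTYPES
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES/gl.h>
#include <stdint.h>

/* Create a texture and wrap it in an EGLImage, which is then handed
 * to egl_render via OMX_UseEGLImage. */
static EGLImageKHR make_video_texture(EGLDisplay display, EGLContext context,
                                      int width, int height, GLuint *tex_out)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    *tex_out = tex;
    return eglCreateImageKHR(display, context, EGL_GL_TEXTURE_2D_KHR,
                             (EGLClientBuffer)(uintptr_t)tex, NULL);
}

/* The texture stays bound and is redrawn in a tight loop; my code has
 * no equivalent of this loop. */
static void render_loop(EGLDisplay display, EGLSurface surface, GLuint tex)
{
    for (;;) {
        glBindTexture(GL_TEXTURE_2D, tex);
        /* ... draw the textured cube using tex ... */
        eglSwapBuffers(display, surface);
    }
}
```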
Is OMX_IndexConfigBrcmEGLImageMemHandle somehow set from the GL side whenever something tries to access the EGLImage, and does egl_render only fill the EGLImage and emit FillBufferDone after that?
Would it be possible to change egl_render to fill the EGLImage and emit FillBufferDone whenever a frame is ready, instead?