help with extending GPUImageMovie to AVComposition #220

Open
amitm02 opened this Issue Jun 25, 2012 · 5 comments

@amitm02

amitm02 commented Jun 25, 2012

I've tried to extend GPUImageMovie to support AVComposition by using AVAssetReaderVideoCompositionOutput as the frame input.
But when it reaches:

    - (void)processMovieFrame:(CMSampleBufferRef)movieSampleBuffer {
        ...
        CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, movieFrame, NULL, GL_TEXTURE_2D, GL_RGBA, bufferWidth, bufferHeight, GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);
        ...

I get:

Failed to create IOSurface image (texture)
Movie CVOpenGLESTextureCacheCreateTextureFromImage failed (error: -6683)

which turns out to be:

kCVReturnPixelBufferNotOpenGLCompatible (-6683)
The pixel buffer is not compatible with OpenGL due to an unsupported buffer size, pixel format, or attribute.
Available in iOS 4.0 and later.

Any ideas?
Thanks
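
For anyone hitting the same -6683: a likely cause is that the composition output is vending frames in a format the texture cache can't wrap (the call above, with its GL_BGRA/GL_UNSIGNED_BYTE arguments, expects kCVPixelFormatType_32BGRA buffers). A minimal diagnostic sketch, using the variable names from the snippet above, to confirm what format is actually arriving:

    CVImageBufferRef movieFrame = CMSampleBufferGetImageBuffer(movieSampleBuffer);
    OSType pixelFormat = CVPixelBufferGetPixelFormatType(movieFrame);
    // Logs the FourCC, e.g. 'BGRA' or '420v'; a non-BGRA buffer is a
    // plausible trigger for kCVReturnPixelBufferNotOpenGLCompatible here.
    NSLog(@"Incoming pixel format: %c%c%c%c",
          (char)(pixelFormat >> 24), (char)(pixelFormat >> 16),
          (char)(pixelFormat >> 8), (char)pixelFormat);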

@BradLarson

BradLarson (Owner) commented Jun 28, 2012

I'm not sure that I follow what you're trying to do here. Could you provide a little more detail?

@amitm02

amitm02 commented Jun 28, 2012

Yes, I would like to use an AVComposition (an AVAsset subclass that mixes video and audio tracks) as a source for GPUImageMovie. The reason I would like to do so is that it solves the synchronization issue of combining two video tracks, plus pre-made transitions between videos in the same track.

I.e., instead of the regular GPUImageMovie chain:

URL -> AVAsset -> AVAssetReader -> AVAssetReaderOutput -> CMSampleBufferRef -> your magic stuff

I would like to use the following chain:

AVComposition -> AVAssetReader -> AVAssetReaderVideoCompositionOutput -> CMSampleBufferRef -> your magic stuff

But as written in my comment above, it fails in the OpenGL functions.
Thanks
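
A minimal sketch of that second chain, assuming `composition` is the AVComposition and `videoComposition` describes the transitions (both names are illustrative). Requesting 32BGRA output is the plausible fix for the -6683 above, since GPUImageMovie requests the same format from its AVAssetReaderTrackOutput:

    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:composition error:&error];

    // Ask for the same OpenGL-compatible pixel format GPUImageMovie uses.
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                               forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];

    AVAssetReaderVideoCompositionOutput *compositionOutput =
        [AVAssetReaderVideoCompositionOutput assetReaderVideoCompositionOutputWithVideoTracks:[composition tracksWithMediaType:AVMediaTypeVideo]
                                                                                videoSettings:outputSettings];
    compositionOutput.videoComposition = videoComposition;

    if ([reader canAddOutput:compositionOutput])
    {
        [reader addOutput:compositionOutput];
    }
    [reader startReading];

Each -copyNextSampleBuffer from compositionOutput can then feed processMovieFrame: exactly as the track output does today.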


@djromero

djromero commented Jul 4, 2012
I have this change working (I'll commit it eventually) and if I remember correctly you need to make a change in GPUImageMovieWriter too.

Replace:

    [assetWriterVideoInput requestMediaDataWhenReadyOnQueue:videoQueue usingBlock:videoInputReadyCallback];

With:

    [assetWriterVideoInput requestMediaDataWhenReadyOnQueue:videoQueue usingBlock:^{
        while ([assetWriterVideoInput isReadyForMoreMediaData]) {
            videoInputReadyCallback();
        }
    }];

@BradLarson

BradLarson (Owner) commented Jul 6, 2012

Out of curiosity, madmw, why do you use a while() loop instead of an if() check within your block?

@djromero

djromero commented Jul 25, 2012

I don't know the ultimate reason. When using AVComposition to create the asset, it doesn't work unless you use that while() loop. I guess AVComposition has different metadata than AVURLAsset.

According to the requestMediaDataWhenReadyOnQueue:usingBlock: documentation: "The block should append media data to the input either until the input's readyForMoreMediaData property becomes NO or until there is no more media data to supply..." and they provide an example using the while.
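
For reference, a sketch of the loop the documentation describes, with -markAsFinished once the source runs dry (assetReaderOutput here is an illustrative stand-in for whatever supplies the writer, not GPUImageMovieWriter's actual code):

    [assetWriterVideoInput requestMediaDataWhenReadyOnQueue:videoQueue usingBlock:^{
        while ([assetWriterVideoInput isReadyForMoreMediaData])
        {
            CMSampleBufferRef nextSampleBuffer = [assetReaderOutput copyNextSampleBuffer];
            if (nextSampleBuffer)
            {
                [assetWriterVideoInput appendSampleBuffer:nextSampleBuffer];
                CFRelease(nextSampleBuffer);
            }
            else
            {
                // No more media data: tell the writer input we're done so the
                // session can finish instead of waiting for more callbacks.
                [assetWriterVideoInput markAsFinished];
                break;
            }
        }
    }];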
