
How to get realtime frame data from GPUImageVideoCamera quickly without using GPUImageRawDataOutput? #2014

Closed
sjtujulian opened this issue May 26, 2015 · 9 comments

@sjtujulian

Hi Brad,
Now I want to get realtime data from the GPUImageVideoCamera. When I use the GPUImageRawDataOutput class, the app works well. However, I noticed there is a glFinish() in GPUImageRawDataOutput's method - (GLubyte *)rawBytesForImage; which is very time-consuming. I need to do some processing on each realtime frame, but the glFinish() takes about 32 ms, which makes the frame rate drop quickly. If I replace the glFinish() with glFlush() in - (GLubyte *)rawBytesForImage, I no longer get up-to-date frame data, which means I cannot process each realtime frame.

So I wonder if there is another solution for getting realtime data from GPUImageVideoCamera besides the time-consuming GPUImageRawDataOutput. What's more, I found that GPUImageVideoCamera's willOutputSampleBuffer: may be useful, from http://w3facility.org/question/capturing-gpuimagevideocamera-or-avcapturesession-frames-in-circular-or-ring-buffer-for-instant-playback/

Can you give me some ideas about this problem? Thank you.

@BradLarson
Owner

glFinish() is required in the GPUImageRawDataOutput to guarantee that rendering has completed up to the point where bytes are extracted in that output. Due to the deferred nature of the iOS GPUs, I wouldn't necessarily trust time profiling for CPU-side instructions to say that that instruction is what's taking 32 ms. That's just the point at which the processing will wait on the CPU until the GPU-side processing catches up. If you have a more complex video pipeline before that, glFinish() may only be pausing long enough for the rest of the GPU-side operations to complete.

If you just need to get raw bytes from the camera, don't bother with this framework and instead go directly to the sample buffer returned from the AV Foundation camera instance.
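A minimal sketch of that AVFoundation-only route (the class name, queue label, and method bodies here are illustrative, not part of GPUImage):

```objective-c
// Sketch: grab interleaved BGRA frames straight from AVFoundation,
// bypassing GPUImage entirely. Only the AVFoundation/CoreVideo calls
// are real API; the surrounding class is an assumption.
#import <AVFoundation/AVFoundation.h>

@interface RawFrameGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation RawFrameGrabber

- (void)start {
    self.session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [self.session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Ask for BGRA so the bytes arrive interleaved, not as planar YUV.
    output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey :
                                 @(kCVPixelFormatType_32BGRA)};
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("framequeue", DISPATCH_QUEUE_SERIAL)];
    [self.session addOutput:output];
    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef frame = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(frame, kCVPixelBufferLock_ReadOnly);
    unsigned char *bytes = (unsigned char *)CVPixelBufferGetBaseAddress(frame);
    // ... process bytes here; the row stride is CVPixelBufferGetBytesPerRow(frame) ...
    CVPixelBufferUnlockBaseAddress(frame, kCVPixelBufferLock_ReadOnly);
}

@end
```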

@sjtujulian
Author

@BradLarson Thank you for your kind reply. Actually, I just need to get raw bytes (in RGB format) from GPUImageVideoCamera.

In your FilterShowcase project, I found a "face detection" example, which uses the GPUImageVideoCameraDelegate and implements the delegate method - (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer to get the sample buffer. The sample buffer here is the one returned from the AV Foundation camera instance, right?

So I implemented the delegate method like this, following your reply at http://stackoverflow.com/questions/10865100/ios-get-pixel-by-pixel-data-from-camera

    - (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(cameraFrame, 0);
        GLubyte *rawDataBytes = (GLubyte *)CVPixelBufferGetBaseAddress(cameraFrame);
        NSLog(@"get data:%d", rawDataBytes[1]);
        CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
    }

But the result is wrong: the NSLog output is always 0. I wonder if I need to set other parameters, such as the (AVCaptureConnection *)connection, to make it work?

And is the raw data obtained this way RGB or YUV?

@BradLarson
Owner

By default, the GPUImageVideoCamera uses YUV planar data straight from the camera. I believe this makes the sample buffers you get back from it planar ones, so you might have to treat them slightly differently than standard RGB buffers. You can switch to an RGB input on the GPUImageVideoCamera by replacing the line

captureAsYUV = YES; 

with

captureAsYUV = NO;

in GPUImageVideoCamera.m. I don't have this exposed as a property at present.
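For the YUV path, a sketch of reading the planar data inside the delegate method (assuming a bi-planar 4:2:0 buffer, which is what the iOS camera typically delivers; the variable names are illustrative):

```objective-c
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(cameraFrame, 0);

    // For planar buffers, CVPixelBufferGetBaseAddress() does not point at
    // pixel data; use the per-plane accessors instead.
    // Plane 0: luma (Y), one byte per pixel.
    GLubyte *luma = (GLubyte *)CVPixelBufferGetBaseAddressOfPlane(cameraFrame, 0);
    size_t lumaWidth = CVPixelBufferGetWidthOfPlane(cameraFrame, 0);
    size_t lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(cameraFrame, 0);
    // Plane 1: interleaved chroma (CbCr) at half resolution.
    GLubyte *chroma = (GLubyte *)CVPixelBufferGetBaseAddressOfPlane(cameraFrame, 1);

    NSLog(@"luma plane: %zu px wide, %zu bytes/row, first byte %d",
          lumaWidth, lumaBytesPerRow, luma[0]);

    CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
}
```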

That said, I didn't add the sample buffer capture, so I haven't checked to see if it still works right after the months of changes I made in the framework since it was contributed.

Again, if all you need is a raw sample buffer, and no further GPU-side processing from the framework, don't bother with GPUImage. Just use AV Foundation directly.

@sjtujulian
Author

@BradLarson What I want to do is preprocess each frame of data from the video camera with my own code (like face detection), and then, based on the detection results, change the parameters of the subsequent GPUImageFilters. All of this is built on your framework, so I really do need to get the raw data from it.

In your FilterShowcase there is also a face detection example, so you also need the raw data to extract the face features. But I am not sure how your face detection works. Here is your code:

    - (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        if (!faceThinking) {
            CFAllocatorRef allocator = CFAllocatorGetDefault();
            CMSampleBufferRef sbufCopyOut;
            CMSampleBufferCreateCopy(allocator, sampleBuffer, &sbufCopyOut);
            [self performSelectorInBackground:@selector(grepFacesForSampleBuffer:) withObject:CFBridgingRelease(sbufCopyOut)];
        }
    }

    - (void)grepFacesForSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        faceThinking = TRUE;
        NSLog(@"Faces thinking");
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
        CIImage *convertedImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:(__bridge NSDictionary *)attachments];
        // ... (snippet truncated here; the method continues in FilterShowcase)

@sjtujulian
Author

@BradLarson Thank you Brad, now I can get each realtime frame's data. But the problem is the output data's byte order. I assumed the output pixel data would be ordered as in the following picture:

[screenshot: expected pixel byte order]

Actually, the order is:

[screenshot: actual pixel byte order]

I need to preprocess each frame in the first order. I have tried reordering the bytes manually, but it is a big waste of CPU resources: for each 480x640 frame, changing the order takes over a million assignments.

So do you have any suggestions for how to change the output pixel order? Thank you so much!
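If the mismatch is just BGRA vs. RGBA byte order, one possible sketch is a single in-place pass swapping the red and blue bytes (plain C, so it drops straight into an Objective-C file; the function name and assumed 4-bytes-per-pixel layout are illustrative). On iOS, Accelerate's vImagePermuteChannels_ARGB8888 can do the same reorder with SIMD:

```c
#include <stddef.h>

/* Swap the red and blue channels of a 4-bytes-per-pixel buffer in place,
 * converting BGRA <-> RGBA. One pass, no extra allocation. */
static void swap_red_blue(unsigned char *pixels, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount * 4; i += 4) {
        unsigned char tmp = pixels[i];  /* first channel (B or R) */
        pixels[i] = pixels[i + 2];      /* move third channel into first slot */
        pixels[i + 2] = tmp;            /* move saved byte into third slot */
    }
}
```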

@klop

klop commented Mar 10, 2016

@sjtujulian did you come up with a solution?

@passchaos

@BradLarson Hi, I use GPUImageRawDataOutput to get rawBytesForImage, but when I convert it to a UIImage, only some of the raw bytes come out right. Code:

+ (UIImage *)imageFromGlubyte:(GLubyte *)bytes width:(NSInteger)width height:(NSInteger)height {
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, bytes, width * height * 4, nil);
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();

    CGImageRef imageRef = CGImageCreate(width, height, 8, 32, width * 4, colorSpaceRef, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst, provider, nil, NO, kCGRenderingIntentDefault);
//    CGContextRef contextRef = CGBitmapContextCreate(bytes, width, height, 8, 4 * width, colorSpaceRef, kCGImageAlphaPremultipliedLast);
//    CGContextDrawImage(contextRef, CGRectMake(0, 0, 720, 1280), imageRef);

//    UIImage *newImage = [[UIImage alloc] initWithCGImage:CGBitmapContextCreateImage(contextRef)];
    UIImage *newImage = [[UIImage alloc] initWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);


    return newImage;
}

I use this function in the rawDataOutput's newFrameAvailableBlock. Did I miss something?

@pengbins

@BradLarson Is it OK to replace glFinish() with glFlush()?

@colman01

colman01 commented Apr 23, 2016

I set captureAsYUV = NO in the GPUImageVideoCamera.m file and ran the build script... got a freshly built framework, hooked it up to my project, and still got black-and-white images out of the willOutputSampleBuffer: method.
Looks like I will have to use AVCapture without the GPUImage framework.

I also tried setting different video options, but this caused a crash. There were a few options available for the video pixel-format key, but my attempts to select a colour type failed.
