
imageFromCurrentFramebuffer sometimes nil #1522

Open
3DTOPO opened this issue Apr 18, 2014 · 37 comments


@3DTOPO commented Apr 18, 2014

Occasionally when I attempt to capture an image from the current frame buffer, the image returned is nil. For instance:

[myFilter useNextFrameForImageCapture];
UIImage *currentFilteredImage = [myFilter imageFromCurrentFramebuffer];

The odd thing is that if I just try again (I have it in a while loop, up to 5 attempts for now), it usually works. But very rarely, even after 5 attempts, the returned image is still nil.
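The retry loop described above looks roughly like this (a sketch; variable names are illustrative):

```objectivec
// Rough sketch of the retry workaround described above.
UIImage *currentFilteredImage = nil;
NSUInteger attempts = 0;
while (currentFilteredImage == nil && attempts < 5)
{
    [myFilter useNextFrameForImageCapture];
    currentFilteredImage = [myFilter imageFromCurrentFramebuffer];
    attempts++;
}
```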

Then if I rerun the filter from scratch again, it will then just work.

If anyone could offer some insight what may be going on I certainly would appreciate it very much.

Thank you.

@BradLarson (Owner) commented Apr 18, 2014

Is this from a live video source or still image?

@3DTOPO (Author) commented Apr 18, 2014

On Apr 17, 2014, at 9:34 PM, Brad Larson notifications@github.com wrote:

Is this from a live video source or still image?

Thanks Brad you rule!

From a still source.

@3DTOPO (Author) commented Apr 18, 2014

P.S. The only thing I can think of is that sometimes the OS needs to be asked more than once for the memory or something. Note too that I mostly develop using the 7.1 Simulator, so I have not tested this code enough on devices to know whether the problem occurs on actual hardware, but I never noticed the issue before we had to call useNextFrameForImageCapture...

P.P.S. I think GPUImage is my favorite library EVER, thank you!

@3DTOPO 3DTOPO closed this Apr 18, 2014

@3DTOPO 3DTOPO reopened this Apr 18, 2014

@BradLarson (Owner) commented Apr 20, 2014

If a still image, you might need to do

[myFilter useNextFrameForImageCapture];
[whateverYouCalledThePicture processImage]; 
UIImage *currentFilteredImage = [myFilter imageFromCurrentFramebuffer];

The framebuffer ends up getting attached to the UIImage when you extract the image, so a new one needs to be regenerated if you try to pull from the filter again. Re-running -processImage will trigger that.
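Put together, repeated captures based on this explanation would look something like the sketch below (assuming whateverYouCalledThePicture is a GPUImagePicture already attached to myFilter):

```objectivec
// Each capture hands its framebuffer off to the returned UIImage,
// so both calls must be repeated before every subsequent capture.
for (NSUInteger i = 0; i < 3; i++)
{
    [myFilter useNextFrameForImageCapture];
    [whateverYouCalledThePicture processImage];
    UIImage *currentFilteredImage = [myFilter imageFromCurrentFramebuffer];
    NSLog(@"capture %lu: %@", (unsigned long)i, currentFilteredImage);
}
```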

@jjxtra (Contributor) commented Apr 23, 2014

This seems to happen with the elegance filter every time. Calling processImage multiple times has no effect. Simpler filters like swirl seem to work fine.

... Tested further: all filter groups (like GPUImageGlassSphereFilter, etc.) fail.

@jjxtra (Contributor) commented Apr 23, 2014

Reverting to the code before imageFromCurrentFramebufferWithOrientation: (going back to imageFromCurrentlyProcessedOutputWithOrientation:) makes still image capturing work perfectly for all filters, even on iOS 7.x.

@kissfro commented Apr 23, 2014

Re: Brad

It appears that once you do:

UIImage *currentFilteredImage = [myFilter imageFromCurrentFramebuffer];

Even if you call again:

[myFilter useNextFrameForImageCapture];
[whateverYouCalledThePicture processImage];

It still only generates a blank image. The only way I'm able to get it to work now is to create a new GPUImageView.

@henduck commented Apr 27, 2014

I posted this previously on Issue #1474 but that issue is already closed, and I realized this thread is actually the more appropriate place for it. Basically, my problem is that imageFromCurrentFramebufferWithOrientation is behaving a bit erratically.

My capture is from the video camera, using the following init:

videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionFront];

videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
videoCamera.horizontallyMirrorFrontFacingCamera = YES;
videoCamera.horizontallyMirrorRearFacingCamera = NO;

grayscale = [GPUImageGrayscaleFilter new];
[videoCamera addTarget:grayscale];

[videoCamera startCameraCapture];

[grayscale addTarget:imgView1];

This all seems to work fine... I can see the filtered output on the screen. And can even capture video without any problems. The issue is, when I try to capture a still frame:

[grayscale useNextFrameForImageCapture];
UIImage *img = [grayscale imageFromCurrentFramebufferWithOrientation:UIImageOrientationUp];
if(img != nil)
    NSLog(@"img was captured. Do something with it…");
else
    NSLog(@"img is nil");

The most common scenario is that the very first time it tries to capture, it hits the 3-second timeout (line 188 in GPUImageFilter.m). It then typically works fine on the second attempt (returns a proper capture). All subsequent attempts then return nil from [grayscale imageFromCurrentFramebufferWithOrientation:UIImageOrientationUp]. Of course, it doesn't happen exactly this way every time. Sometimes it will work the first time, then time out the second. Occasionally a capture will work again after several failures. But the first scenario described above is by far the most common sequence.

Any idea what might be causing this? Suggestions of what else to try?

-Jesse

@jjxtra (Contributor) commented Apr 27, 2014

Roll back your GPUImage to before all the framebuffer changes. Then still capture should work fine.

- Jeff


@3DTOPO (Author) commented Apr 27, 2014

On Apr 27, 2014, at 5:40 PM, Jeff Johnson notifications@github.com wrote:

Rollback your gpuimage to before all the frame buffer changes. Then still capture should work fine.

Is there a tag or version you would recommend? Thanks!

-jeshua

@jjxtra (Contributor) commented Apr 27, 2014

April 2 is the last commit that is working for me; you could try the last commit from that day.

-- Jeff


@BradLarson (Owner) commented Apr 28, 2014

I know, I know, I'm looking into it. Something with video sources is not behaving in the way that I thought it should when grabbing images mid-stream. I swear this was reliable in testing, but I may have made a bad assumption here.

Unfortunately, I've been on the road for the last week or so, and have had work-related matters to attend to over the last month, so I've not had the time to fix this yet. I'll see what I can do this week now that I'm back in town.

@3DTOPO (Author) commented May 16, 2014

At the risk of being a pest, I wanted to check in and see if there has been any movement. I am willing to help, too, if there is anything I can do to assist with tracking down the bug.

As of now, I guess I am reverting back to an older version.

@soj commented Jun 2, 2014

Same here. I'm trying to capture an image every 5 seconds from the camera.

    filter = [[GPUImageAdaptiveThresholdFilter alloc] init];
    filter.blurRadiusInPixels = 10.0;

    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    videoCamera.horizontallyMirrorFrontFacingCamera = NO;
    videoCamera.horizontallyMirrorRearFacingCamera = NO;

    [videoCamera.inputCamera lockForConfiguration:nil];
    [videoCamera.inputCamera setActiveVideoMaxFrameDuration:CMTimeMake(1, 10)];
    [videoCamera.inputCamera setActiveVideoMinFrameDuration:CMTimeMake(1, 10)];
    [videoCamera.inputCamera setAutoFocusRangeRestriction:AVCaptureAutoFocusRangeRestrictionNear];
    [videoCamera.inputCamera unlockForConfiguration];

    [videoCamera addTarget:filter];

    GPUImageView *filterView = (GPUImageView *)self.view;
    [filter addTarget:filterView];

    [videoCamera startCameraCapture];

    [NSTimer scheduledTimerWithTimeInterval:5.0 target:self selector:@selector(stillCameraCapture) userInfo:nil repeats:YES];

-----------

-(void)stillCameraCapture
{
    [videoCamera pauseCameraCapture];
    [filter useNextFrameForImageCapture];
    UIImage *capturedImage = [filter imageFromCurrentFramebufferWithOrientation:UIImageOrientationUp];
    NSLog(@"%@",capturedImage);
    UIImageWriteToSavedPhotosAlbum(capturedImage, nil, nil, nil);
    [videoCamera resumeCameraCapture];
}

so the output is:

2014-06-02 14:08:48.896 SimpleVideoFilter[11049:60b] (null)
2014-06-02 14:08:50.908 SimpleVideoFilter[11049:60b] <UIImage: 0x668d40>
2014-06-02 14:08:55.894 SimpleVideoFilter[11049:60b] (null)
2014-06-02 14:09:00.894 SimpleVideoFilter[11049:60b] (null)
2014-06-02 14:09:05.894 SimpleVideoFilter[11049:60b] (null)
2014-06-02 14:09:10.894 SimpleVideoFilter[11049:60b] (null)
2014-06-02 14:09:15.894 SimpleVideoFilter[11049:60b] (null)
2014-06-02 14:09:20.894 SimpleVideoFilter[11049:60b] (null)

The first image is nil, the second is OK, and all the others are nil.
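One possible issue in the snippet above (an assumption on my part, not confirmed in this thread): the camera is paused before useNextFrameForImageCapture is called, so no "next frame" ever arrives and the capture can time out. A sketch that reserves the frame while the camera is still running:

```objectivec
- (void)stillCameraCapture
{
    // Reserve the next incoming frame *before* pausing the camera,
    // so there is still a frame left for the capture to wait on.
    [filter useNextFrameForImageCapture];
    UIImage *capturedImage =
        [filter imageFromCurrentFramebufferWithOrientation:UIImageOrientationUp];

    [videoCamera pauseCameraCapture];
    if (capturedImage != nil)
    {
        UIImageWriteToSavedPhotosAlbum(capturedImage, nil, nil, nil);
    }
    [videoCamera resumeCameraCapture];
}
```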

@dvj commented Jun 11, 2014

On my device, I'm getting nil on all frames.

Is there any work-around for this? Some way to capture images from the stream? I've tried with no luck.

@jmzorko commented Jun 15, 2014

I'm also finding this to be very unreliable. Every once in a great while my iPhone 5S or 3GS will actually capture an image with the -imageFromCurrentFramebuffer method, but the vast majority of times it's nil. Is there another, more reliable way of capturing the occasional still frame image from live video recording? I also tried creating a UIImage or CGImageRef from the CMSampleBufferRef, but since it's YUV, I can't seem to use CGImageCreate().

@simpleshadow commented Jul 28, 2014

Similar to others here with v1.4, I'm getting a nil UIImage from capturePhotoAsImageProcessedUpToFilter on a GPUImageStillCamera. This seems to be a widespread problem for everyone. I'm going to upgrade to release v1.5 and see if I have any luck.

@simpleshadow commented Jul 28, 2014

So I was finally able to get capturePhotoAsImageProcessedUpToFilter working by reverting to v1.2. Every execution of capturePhotoAsImageProcessedUpToFilter reliably returns the UIImage. 👍

This required that I add GPUImage 1.2 as a static framework to my project (Brad has a great step by step guide on how to do this on SO which is different from what's in the README here).

I'm not sure what kind of performance hit I'm taking by using v1.2, but v1.3+ is pretty much useless to me if I can't reliably export the video camera feed to stills.

@nicpro85 commented Aug 11, 2014

Same issue here: the image is always nil. I'm trying to figure out why.

if (!self.picture) {
    UIImage * image = self.element.image;
    self.picture = [[GPUImagePicture alloc] initWithImage:image];
}
if (!self.brightnessFilter) {
    self.brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    [self.picture addTarget:self.brightnessFilter];
}
self.brightnessFilter.brightness = self.element.filter.brightness.floatValue;

[self.picture useNextFrameForImageCapture];
[self.picture processImage];
UIImage * outputImage = [self.picture imageFromCurrentFramebuffer];

@haihw commented Aug 25, 2014

Hi all,
Any update on this? I'm facing the same problem when using the v1.5 pod.
Thanks

@MattFoley (Contributor) commented Sep 7, 2014

I am unfortunately also experiencing this issue. Camera Input -> Gamma Filter -> GPUImageView. Sometimes it returns a UIImage and sometimes it returns nil.

@jjxtra (Contributor) commented Sep 7, 2014

I would suggest rolling back to the April 2014 commits, around the 21st. Beyond that, the framework has been very buggy and unstable for me.

@benmcginnis commented Sep 12, 2014

@jjxtra do you know what tag that was? If you do, does your project also compile for x64? I'm working on updating because I'm on version 0.1.0 and it doesn't compile for x64.

@jjxtra (Contributor) commented Sep 13, 2014

My last pull was April 21st I believe.

-- Jeff


@zabumba69 commented Sep 22, 2014

I'm also having problems. Previously, I captured the image from the camera with:

stillcamera = [[GPUImageStillCamera alloc] initWithSessionPreset: AVCaptureSessionPreset640x480 cameraPosition: AVCaptureDevicePositionFront];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
filter = [[GPUImageFilter alloc] init];
[stillcamera addTarget: filter];
UIImage * img = [filter imageFromCurrentlyProcessedOutput]; //imageFromCurrentlyProcessedOutput removed from new version.

I do not use capturePhotoAsJPEGProcessedUpToFilter because I do not want the shutter sound.

But now I'm unable to reproduce this behavior in the new version:

[filter useNextFrameForImageCapture];
UIImage* img = [filter imageFromCurrentFramebuffer];

The img is nil

Please could someone help me with this?
Thank you so much!

@ShayDavidson commented Oct 20, 2014

I upgraded from v.0.1.2 to v0.1.6 (adding the required useNextFrameForImageCapture), and now I encounter the same issue when using the GPUImageSolidColorGenerator filter.

The returned image is always nil:

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];
GPUImageSolidColorGenerator *result = [GPUImageSolidColorGenerator new];
[result forceProcessingAtSize:size];
[result setColorRed:color.red green:color.green blue:color.blue alpha:color.alpha];

[result useNextFrameForImageCapture];
[stillImageSource processImage];
return [result imageFromCurrentFramebufferWithOrientation:UIImageOrientationUp];

(Ignore the fact that stillImageSource is "overridden" by the filter; it's just for the sake of the example.)

Even if the GPUImageSolidColorGenerator is in the middle of a filter pipeline, the returned result is nil.

@taberrr commented Nov 10, 2014

Seeing the same exact thing as @soj ...


The first capture fails, second one works, and any others fail. I should note this is on an iPad 2. Everything seems to work fine on my iPhone 6. Also tried a couple of commits @jjxtra mentioned, namely 264598908 and bb112cc, both have the same result. :(

Does anyone have any other workarounds/suggestions? Going to have to scrap GPUImage for this project but I reeeeally don't want to! Hehe.

@ericzoo commented Dec 4, 2014

Can confirm that we are also experiencing this problem.

@jjxtra (Contributor) commented Dec 4, 2014

Use changeset 0cdf63d from April. Anything much later than that seems very unstable and buggy.

@se0lus commented Jan 12, 2015

On version 0.1.6, the problem is still present as of this post, and I found the cause in GPUImageTwoInputFilter.
With some filters, like GPUImageSoftElegance, if you call

- (UIImage *)imageByFilteringImage:(UIImage *)imageToFilter;

multiple times, for example in a while loop, you will find that the filter only works the first time; after that it times out and returns NULL from GPUImageFilter.m line 182:

- (CGImageRef)newCGImageFromCurrentlyProcessedOutput
...
    if (dispatch_semaphore_wait(imageCaptureSemaphore, convertedTimeout) != 0)
    {
        return NULL;
    }

This timeout is caused by GPUImageTwoInputFilter. GPUImageSoftElegance uses a GPUImageLookupFilter, and GPUImageLookupFilter is a subclass of GPUImageTwoInputFilter.

GPUImageTwoInputFilter only runs its processing once it has received two input images. GPUImageLookupFilter sets the lookup image as its first input when it is initialized, so the first time the filter chain receives a new image, GPUImageLookupFilter has two inputs and works. On the second call to imageByFilteringImage:, it fails because GPUImageTwoInputFilter has only received one image input.

After digging inside the code for hours, I found a workaround.
In GPUImagePicture.h and GPUImagePicture.m, add the methods:

- (void)processImageAtTime:(CMTime)time
- (BOOL)processImageAtTime:(CMTime)frameTime withCompletionHandler:(void (^)(void))completion

modify method:

- (BOOL)processImageWithCompletionHandler:(void (^)(void))completion

GPUImagePicture.m :

- (void)processImageAtTime:(CMTime)time{
    [self processImageAtTime:time withCompletionHandler:nil];
}

- (BOOL)processImageWithCompletionHandler:(void (^)(void))completion;
{
    return [self processImageAtTime:kCMTimeIndefinite withCompletionHandler:completion];
}

- (BOOL)processImageAtTime:(CMTime)frameTime withCompletionHandler:(void (^)(void))completion{
    hasProcessedImage = YES;

    //    dispatch_semaphore_wait(imageUpdateSemaphore, DISPATCH_TIME_FOREVER);

    if (dispatch_semaphore_wait(imageUpdateSemaphore, DISPATCH_TIME_NOW) != 0)
    {
        return NO;
    }

    runAsynchronouslyOnVideoProcessingQueue(^{
        for (id<GPUImageInput> currentTarget in targets)
        {
            NSInteger indexOfObject = [targets indexOfObject:currentTarget];
            NSInteger textureIndexOfTarget = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];

            [currentTarget setCurrentlyReceivingMonochromeInput:NO];
            [currentTarget setInputSize:pixelSizeOfImage atIndex:textureIndexOfTarget];
            [currentTarget setInputFramebuffer:outputFramebuffer atIndex:textureIndexOfTarget];
            [currentTarget newFrameReadyAtTime:frameTime atIndex:textureIndexOfTarget];
        }

        dispatch_semaphore_signal(imageUpdateSemaphore);

        if (completion != nil) {
            completion();
        }
    });

    return YES;
}

In GPUImageOutput.m, line 295, change the method like this:

- (CGImageRef)newCGImageByFilteringCGImage:(CGImageRef)imageToFilter;
{
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithCGImage:imageToFilter];

    [self useNextFrameForImageCapture];
    [stillImageSource addTarget:(id<GPUImageInput>)self];
    //a workaround for two image filter
    [stillImageSource processImageAtTime:kCMTimeZero];

    CGImageRef processedImage = [self newCGImageFromCurrentlyProcessedOutput];

    [stillImageSource removeTarget:(id<GPUImageInput>)self];
    return processedImage;
}

I don't think it is a good way, but I hope this helps ^_^;

@MrMatthewDavis commented Feb 13, 2015

Just adding that I'm still having this issue on the latest version. Is there any hope that this will be fixed soon? Is there another way to reliably capture a still image from the camera feed and then crop it to a square?

EDIT: I tried @se0lus's suggestion, and when I added the line [self processImageAtTime:time withCompletionHandler:nil]; it couldn't find the processImageAtTime:withCompletionHandler: method. What commit were you using?

@se0lus commented Feb 15, 2015

@mad102190 Sorry, I missed something: my code is based on 0.1.6, and processImageAtTime:withCompletionHandler: is a method I added. I have updated my post.

@wesbillman commented Feb 16, 2015

Another option is to modify the usage rather than the library. It seems like processImage does not wait for completion before returning, so it's possible to move on before the image is ready. I changed my usage to this (Swift):

currentFilter.useNextFrameForImageCapture()
let sema = dispatch_semaphore_create(0)
stillImageSource.processImageWithCompletionHandler({
    dispatch_semaphore_signal(sema)
    return
})

dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER)
return currentFilter.imageFromCurrentFramebufferWithOrientation(image.imageOrientation)
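The same wait-for-completion pattern in Objective-C, for anyone not on Swift (a sketch; stillImageSource, currentFilter, and image are assumed to exist and be connected as in the snippet above):

```objectivec
[currentFilter useNextFrameForImageCapture];

// Block until processing has actually finished before reading the framebuffer.
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[stillImageSource processImageWithCompletionHandler:^{
    dispatch_semaphore_signal(sema);
}];
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);

UIImage *result =
    [currentFilter imageFromCurrentFramebufferWithOrientation:image.imageOrientation];
```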

@rromanchuk (Contributor) commented Feb 28, 2015

The only way I got this working was the following:

defaultFilter.useNextFrameForImageCapture()
let image = UIImage(CGImage: defaultFilter.newCGImageFromCurrentlyProcessedOutput().takeUnretainedValue())

The way recommended by @BradLarson has never worked for me:

defaultFilter.useNextFrameForImageCapture()
let image = defaultFilter.imageFromCurrentFramebuffer()

And for further context

lazy var defaultFilter: GPUImageBrightnessFilter = {
        return GPUImageBrightnessFilter()
    }()

lazy var videoSession: GPUImageVideoCamera = {
        let _videoSession = GPUImageVideoCamera(sessionPreset: AVCaptureSessionPresetHigh, cameraPosition: .Back)
        _videoSession.outputImageOrientation = .Portrait
        _videoSession.horizontallyMirrorFrontFacingCamera = true
        return _videoSession
    }()

@Bayonetta commented Jun 19, 2015

I have the same experience as @taberrr: the first capture fails, the second one works, and all the others fail.

And I found that it only happens on iOS 7; when tested on iOS 8, it works well.

@develop24 commented May 14, 2016

Has anyone found out how to make this work? I'm also stuck with the same issue. I'm working with live video, and the saved image returns UIImage: 0x14e24a980, {640,1038}. It's not the video, it's the still image. Please help me.

eligat pushed a commit to eligat/gpuimage-filters-editor that referenced this issue Aug 11, 2017

@canpoyrazoglu commented Oct 15, 2017

I'm having the same issue with the latest version. Did anyone find a reliable solution?
