GPUImageRawDataInput and GPUImageMovie with GPUImageChromaKeyBlendFilter #1335

aaronpeterson opened this Issue Dec 11, 2013 · 10 comments

Is it currently possible to blend a chroma-keyed movie with raw input? I'm wondering if there are syncing errors between the rate at which I send frames with GPUImageRawDataInput's updateDataFromBytes:size: and the green-screen movie. You can see below that I've tried testing other filters on GPUImageRawDataInput, and the results are filtered as expected. However, with raw input + movie, the chroma key blend filter renders nothing.

    self.rawDataInput = [[GPUImageRawDataInput alloc] initWithBytes:[self.rawDataOutputFilter rawBytesForImage] size:self.frameSize];
    // works as expected:
    //GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    //[self.rawDataInput addTarget:sepiaFilter];
    //[sepiaFilter addTarget:self.gpuImageView];

    NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"ChromaKey640x480_01" withExtension:@"mov"];
    self.chromaKeyMovie = [[GPUImageMovie alloc] initWithURL:sampleURL];
    self.chromaKeyMovie.runBenchmark = NO;
    self.chromaKeyMovie.playAtActualSpeed = YES;

    self.chromaKeyBlendFilter = [[GPUImageChromaKeyBlendFilter alloc] init];
    [self.chromaKeyBlendFilter setColorToReplaceRed:0.0 green:1.0 blue:0.0];

    [self.chromaKeyMovie addTarget:self.chromaKeyBlendFilter];
    [self.rawDataInput addTarget:self.chromaKeyBlendFilter];
    [self.chromaKeyBlendFilter addTarget:self.gpuImageView];

I also tested the movie by simply pushing it to GPUImageView directly without GPUImageChromaKeyBlendFilter and it plays as expected (green and all, of course).

I am attempting to overlay chroma key video over a loop of frames that are sent to GPUImageView with GPUImageRawDataInput in this hokey and embarrassing, albeit functional, way:

- (void)sendFrameBufferToCameraPreview {
    [self.gpuImageView setInputRotation:kGPUImageRotateRight atIndex:0];
    self.loopTimer = [NSTimer scheduledTimerWithTimeInterval:0.033
                                     target:self
                                   selector:@selector(sendNextFrameToCameraPreview:)
                                   userInfo:nil
                                    repeats:YES];
    // toying around here I was able to get the first frame to show in the GPUImageView
    double delayInSeconds = 0.3;
    dispatch_time_t stopTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
    dispatch_after(stopTime, dispatch_get_main_queue(), ^(void) {
        [self.chromaKeyMovie startProcessing];
    });
}

- (void)sendNextFrameToCameraPreview:(NSTimer *)timer {
    [self.rawDataInput updateDataFromBytes:(GLubyte *)[[self.frameRingBuffer getNextFrame] bytes] size:self.frameSize];
    [self.rawDataInput processData];
    [self.rawDataInput notifyTargetsAboutNewOutputTexture];
}
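For what it's worth, an NSTimer firing every 0.033 s can drift relative to the display and the movie's clock. A minimal sketch of the same loop driven by CADisplayLink instead (the displayLink property here is hypothetical; rawDataInput, frameRingBuffer, and frameSize are the same ones used above):

    - (void)startFrameLoop {
        // Fires in step with the display refresh instead of a drifting timer.
        CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                          selector:@selector(pushNextFrame:)];
        link.frameInterval = 2; // every other 60 Hz refresh, roughly 30 fps
        [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
        self.displayLink = link; // hypothetical property, kept so it can be invalidated later
    }

    - (void)pushNextFrame:(CADisplayLink *)link {
        [self.rawDataInput updateDataFromBytes:(GLubyte *)[[self.frameRingBuffer getNextFrame] bytes]
                                          size:self.frameSize];
        [self.rawDataInput processData];
    }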

Should this be expected to work or am I asking for too much without digging into lower level code?

GPUImageChromaKeyFilter doesn't seem to work with a video source either. Just tried this in the SampleVideoFilter example:

    // Testing GPUImageChromaKeyFilter
    GPUImageView *filterView = (GPUImageView *)self.view;

    NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mp4"];
    chromaKeyMovie = [[GPUImageMovie alloc] initWithURL:sampleURL];
    chromaKeyMovie.runBenchmark = YES;
    chromaKeyMovie.playAtActualSpeed = YES;

    chromaKeyFilter = [[GPUImageChromaKeyFilter alloc] init];
    [(GPUImageChromaKeyFilter *)chromaKeyFilter setColorToReplaceRed:0.0 green:1.0 blue:0.0];
    [(GPUImageChromaKeyFilter *)chromaKeyFilter setThresholdSensitivity:0.3];
    [chromaKeyMovie addTarget:chromaKeyFilter];

    [chromaKeyFilter addTarget:filterView];

    [chromaKeyMovie startProcessing];

The runBenchmark logs report about 1.6 ms per frame, but only the first frame of the video renders and stays visible in the GPUImageView (and the frame that shows is unprocessed, still green). The video file is here: http://www.filedropper.com/sample_10

Owner

BradLarson commented Dec 13, 2013

Well, the GPUImageChromaKeyFilter will only turn matching areas of the image transparent. Currently, GPUImageView doesn't support transparency, so it won't pass through anything that's behind the view.

Because it doesn't support transparency, if you send content that has transparency to the GPUImageView, you may see odd visual artifacts. I'll have to check whether I clear the buffer properly before rendering.

The blend is preferable if you're rendering to the screen or to a movie file, because it makes sure that the keyed content is overlaid properly on your background image or video.

For the raw data input, I'm not sure that I'm propagating timestamps from that source correctly. You might check my code for that input, and how it's blended in the two-input filter abstract class.

I can verify that the chroma key filter does work, as well as the blend, and you can see both of them in action in the FilterShowcase example.

I currently cannot get any pixels converted to alpha with GPUImageChromaKeyFilter on video. I wasn't sure what I'd see behind my movie (which has tons of green pixels) when filtered through GPUImageChromaKeyFilter, but I figured it wouldn't be green. It's just not working on the movie I have.

I'm assuming there is an issue with codecs. I will play more with this. Which codec is best for GPUImageMovie?

I will look at filter showcase again. Thanks a ton for your response.

Should GPUImageChromaKeyFilter be expected to work with GPUImageMovie as well?

Owner

BradLarson commented Dec 13, 2013

The filters don't know anything about codecs. They just take in an image, process it, and pass it out. Filters should work equally well with movies, live video, and still images. I know that I've used the chroma key blend with movie sources.

Again, remember that the chroma key filter will output pixels with a non-1.0 alpha channel. GPUImageView does not display these pixels properly (with compositing), so if you want to display something to the screen you'll need to use the chroma key blend, not the plain chroma key filter.
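A minimal sketch of that blend setup, keying a movie over a still background image (the image name, threshold value, and input ordering here are illustrative assumptions, not tested code):

    UIImage *backgroundImage = [UIImage imageNamed:@"background.png"]; // placeholder asset
    GPUImagePicture *background = [[GPUImagePicture alloc] initWithImage:backgroundImage];

    GPUImageChromaKeyBlendFilter *blend = [[GPUImageChromaKeyBlendFilter alloc] init];
    [blend setColorToReplaceRed:0.0 green:1.0 blue:0.0];
    [blend setThresholdSensitivity:0.4]; // placeholder threshold

    // Assumed ordering: the first source added is the one being keyed,
    // the second is the background that shows through the keyed areas.
    [chromaKeyMovie addTarget:blend];
    [background addTarget:blend];
    [blend addTarget:filterView];

    [background processImage];
    [chromaKeyMovie startProcessing];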

Thank you, Brad. Just looking through FilterShowcase now and noticing that the non-blend chroma key example also includes GPUImageAlphaBlendFilter. With either of these methods I cannot seem to display chroma-keyed green screen over another video source; there appears to be some sort of sync issue. The issue was indeed not with the codecs. If I blend a chroma-keyed video over a still image, however, it works great. Is there a way to tap into some sort of clock to line all these frames up?

Owner

BradLarson commented Dec 13, 2013

The frames themselves have timestamps, but right now I don't do anything to synchronize them. I just take frames as they come from the source and blend them in the two-input filters. The two-input filters need to have some mechanism for synchronizing these, but I haven't implemented one yet.
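One possible starting point for that synchronization, sketched with CoreMedia's time arithmetic (lastFrameTime0 and lastFrameTime1 are hypothetical properties, not part of GPUImage, that a two-input filter could update as each input delivers a frame):

    // Only render once both inputs have frames whose timestamps are close.
    - (BOOL)inputsAreRoughlyInSync {
        CMTime delta = CMTimeSubtract(self.lastFrameTime0, self.lastFrameTime1);
        Float64 skew = fabs(CMTimeGetSeconds(delta));
        return skew < (1.0 / 30.0); // tolerate up to one frame of drift at 30 fps
    }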

Thanks again. I made it so far in my proof of concept I can't turn back! I'll see if I can sync up the two input filter inputs somehow and report my findings if I have any luck.

Is transparent/alpha-channel video support on the roadmap, or is it on the back burner?

Owner

BradLarson commented Dec 18, 2013

@zakdances - That would require there to be any kind of roadmap at all. Frankly, I add things as I need them and find the time to do them, or as people contribute pull requests. You're welcome to try adding this yourself and sending in a pull request.
