
How to apply a CIFilter to preview and video? #18

Closed
jjaybrown opened this issue Nov 26, 2016 · 10 comments

@jjaybrown

I've looked into it, but everything I find suggests taking the sample buffer and converting to a UIImage, which doesn't seem right, particularly when it would require updating the AVPreview layer each frame.

I'm currently looking at the VideoDelegate methods:

public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {}

public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {}

Any tips would be appreciated.

@piemonte
Contributor

hey @Delta98 thanks for the interest. those are the correct delegate methods for frame processing. you'll have to set this flag to true for it to be enabled (perf reasons):

https://github.com/NextLevel/NextLevel/blob/master/Sources/NextLevel.swift#L594

after processing a frame, you set the result on this property:

https://github.com/NextLevel/NextLevel/blob/master/Sources/NextLevel.swift#L606-L617

i'm using this code path for one of my own projects, so i'll share more explanation/updates when i have the time.
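Putting those two pieces together, the wiring looks roughly like this. This is a minimal sketch, not the project's sample code: the property names match the NextLevel source linked above, but the delegate protocol's other required methods are omitted, and `CameraViewController` is a hypothetical host class.

```swift
import AVFoundation
import UIKit
import NextLevel

final class CameraViewController: UIViewController, NextLevelVideoDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Custom-context rendering is off by default for performance;
        // the renderToCustomContext callback only fires when this is true.
        NextLevel.shared.isVideoCustomContextRenderingEnabled = true
        NextLevel.shared.videoDelegate = self
    }

    func nextLevel(_ nextLevel: NextLevel,
                   renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer,
                   onQueue queue: DispatchQueue) {
        // Filter imageBuffer here, then hand the result back so NextLevel
        // records the processed frame instead of the raw one.
        nextLevel.videoCustomContextImageBuffer = imageBuffer
    }

    // ... remaining NextLevelVideoDelegate requirements omitted ...
}
```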

@jjaybrown
Author

Thanks @piemonte. How are you applying the CIFilter to the imageBuffer? I've got the gist; it's just a matter of putting it together.

@piemonte
Contributor

hey @Delta98 it's likely i won't be able to add a CIFilter to the sample project anytime soon, but if you need references on how to do this, check out this post, https://www.objc.io/issues/23-video/core-image-video/

I've added a few comments to NL as well. Essentially, you can modify the sampleBuffer provided by either of these delegate functions:

https://github.com/NextLevel/NextLevel/blob/master/Sources/NextLevel.swift#L495-L496

I may also open source some NL Metal and GLES based render views but that probably won't be until next year.

@bennnjamin

bennnjamin commented Apr 25, 2017

@piemonte It is still not clear how to apply the same filter to a preview. willProcessRawVideoSampleBuffer seems suitable for analyzing the preview, but how can you set the preview buffer to your own image that has a filter applied? renderToCustomContextWithImageBuffer is only called when recording.

Edit: I've been trying to update the example code with what you linked in this post here https://www.objc.io/issues/23-video/core-image-video/ and can't get anything to work. The filter is not applied to the output.

func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {
    // rotate to portrait and flip vertically; scaledBy(x:y:) returns a new
    // transform, so the result must be assigned rather than discarded
    let transform = CGAffineTransform(rotationAngle: -.pi / 2).scaledBy(x: 1, y: -1)
    let input = CIImage(cvPixelBuffer: imageBuffer).applying(transform)

    let angle = Float(NSDate.timeIntervalSinceReferenceDate.truncatingRemainder(dividingBy: .pi * 2))
    let parameters = [
        kCIInputAngleKey: angle,
        kCIInputImageKey: input
        ] as [String : Any]
    let filter = CIFilter(name: "CIHueAdjust",
                          withInputParameters: parameters)

    let filteredImage = filter?.outputImage
    nextLevel.videoCustomContextImageBuffer = filteredImage?.pixelBuffer
}

@piemonte
Contributor

hey @bennnjamin thanks for the project interest. unfortunately, preview layers aren't easily modified. in my apps i either use metal or OpenGL to render the buffers into a view and use a custom camera preview. http://pbj.vision has a good example of this with the effects controller.

I've had other folks request this too, will probably open source something later this year.

📎 #39
📎 #45 (comment)
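For anyone exploring the Metal route described above, a minimal sketch of rendering filtered frames into an `MTKView` instead of an `AVCaptureVideoPreviewLayer` might look like this. `FilteredPreviewView` is a hypothetical name, and aspect-ratio/scaling handling is omitted; the `image` property would be set from `willProcessRawVideoSampleBuffer`.

```swift
import MetalKit
import CoreImage

// Metal-backed preview: Core Image renders each filtered frame
// directly into the view's drawable texture.
final class FilteredPreviewView: MTKView, MTKViewDelegate {
    private let commandQueue: MTLCommandQueue
    private let ciContext: CIContext
    var image: CIImage?   // latest filtered frame, set from the video delegate

    init(frame: CGRect) {
        let mtlDevice = MTLCreateSystemDefaultDevice()!
        commandQueue = mtlDevice.makeCommandQueue()!
        ciContext = CIContext(mtlDevice: mtlDevice)
        super.init(frame: frame, device: mtlDevice)
        framebufferOnly = false   // let Core Image write to the drawable
        delegate = self
    }

    required init(coder: NSCoder) { fatalError("not supported") }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let image = image,
              let drawable = view.currentDrawable,
              let buffer = commandQueue.makeCommandBuffer() else { return }
        ciContext.render(image,
                         to: drawable.texture,
                         commandBuffer: buffer,
                         bounds: CGRect(origin: .zero, size: view.drawableSize),
                         colorSpace: CGColorSpaceCreateDeviceRGB())
        buffer.present(drawable)
        buffer.commit()
    }
}
```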

@bennnjamin

Thanks @piemonte. I will open a new issue describing where I am stuck now.

@otusweb

otusweb commented Jun 13, 2017

@piemonte Any update on open sourcing an example? I'm working through this and running into issues.

Specifically, I render the preview using an OpenGL context and it works fine, but once it is time to record, I have to process the image twice (once for the preview and once to modify the buffer). Is that expected, or am I missing something?
Also, the audio seems to drop once I process the image.

@piemonte
Contributor

hey @otusweb i'm in the process of shipping some work right now so i unfortunately have no time. def recommend looking at how PBJVision does this in the short term. i'm not sure why it should be required to render the same effect twice. wish i could be of more help. best of luck.

@otusweb

otusweb commented Jun 14, 2017

OK, thanks. Can you point me to the file/project I should look at in PBJVision? I briefly looked yesterday and didn't find anything relevant. Thanks.

@fukemy

fukemy commented Jun 20, 2023

> @piemonte It is still not clear how to apply the same filter to a preview. willProcessRawVideoSampleBuffer seems suitable for analyzing the preview, but how can you set the preview buffer to your own image that has a filter applied? renderToCustomContextWithImageBuffer is only called when recording. [...]

filteredImage?.pixelBuffer is always nil.
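That nil is expected: `CIImage.pixelBuffer` only returns the buffer the image was created from, and a filter's output image has no backing buffer. One way around it, sketched here rather than taken from the project, is to render the filter output back into the buffer NextLevel provides and hand that buffer along; `sharedCIContext` is a name assumed for this sketch.

```swift
import CoreImage

// Reuse one context; creating a CIContext per frame is expensive.
let sharedCIContext = CIContext()

func nextLevel(_ nextLevel: NextLevel,
               renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer,
               onQueue queue: DispatchQueue) {
    let input = CIImage(cvPixelBuffer: imageBuffer)
    let filter = CIFilter(name: "CIHueAdjust")
    filter?.setValue(input, forKey: kCIInputImageKey)
    filter?.setValue(Float.pi, forKey: kCIInputAngleKey)
    guard let output = filter?.outputImage else { return }

    // A filtered CIImage has no backing pixel buffer, so render the result
    // back into the buffer NextLevel handed us, then pass that buffer on.
    sharedCIContext.render(output, to: imageBuffer)
    nextLevel.videoCustomContextImageBuffer = imageBuffer
}
```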
