
Is it possible to get processed frames from RenderView as a CVPixelBuffer (CVBuffer)? #218

Closed
cagkanciloglu opened this issue Dec 3, 2017 · 7 comments


@cagkanciloglu

Hey,
I am trying to feed the processed frames shown in a RenderView into Core ML, which requires a CVPixelBuffer. How can this be achieved?
By the way, CVPixelBuffer is actually a typealias of CVImageBuffer, which is itself a typealias of CVBuffer, so those types would work as well.

I also looked at RawDataOutput, but I want to show these processed frames on screen as well.

@BradLarson
Owner

Yes, you can use a raw data output to capture the bytes at the step before they are displayed by the RenderView, and then create a pixel buffer from those.

@cagkanciloglu
Author

cagkanciloglu commented Dec 4, 2017

Thanks for the reply @BradLarson
The thing I don't understand is that RawDataOutput is an ImageConsumer, so I can't do something like this, for example:
camera --> filter --> rawdataoutput --> renderview

The only way I can use the RenderView is like this:
camera --> filter --> renderview

Where can I add the rawdataoutput in this setup?

@BradLarson
Owner

You can add it in parallel:

camera --> filter --> renderview
filter --> rawdataoutput

and the filter will output to both the raw data output and the RenderView.
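A minimal sketch of that parallel wiring, using GPUImage2's Swift operator syntax (the `SaturationAdjustment` filter and the session preset here are placeholder choices, and exact initializer signatures may differ between framework versions):

```swift
import GPUImage

// Placeholder setup; adjust the preset and filter to your pipeline.
let camera = try Camera(sessionPreset: .vga640x480)
let filter = SaturationAdjustment()
let rawOutput = RawDataOutput()
// renderView would normally come from your storyboard or be created in code.

// The filter runs once per frame and hands the same texture to both targets.
camera --> filter --> renderView
filter --> rawOutput

camera.startCapture()
```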

@cagkanciloglu
Author

cagkanciloglu commented Dec 4, 2017

Hmm, but will this affect performance? Isn't this like processing the frames twice?

@BradLarson
Owner

No, the only thing that will impact performance is the actual extraction of the bytes. The filter only runs once and provides the same texture to both the RenderView and the raw data output. You're going to need to get the bytes out some way, and the raw data output is how you do that.

I don't believe I yet have my fast path for grabbing these bytes from the raw data output in this version of the framework, but if you need that you can examine what I did in the Objective-C one.

@cagkanciloglu
Author

Will do that for sure.
Thank you so much for your help @BradLarson

@Briahas

Briahas commented Apr 17, 2018

Hi @BradLarson,
I'm trying to connect GPUImage with GVRKit, and for that I need a CVPixelBuffer as output from GPUImage to feed as input into GVRKit.
Right now I'm doing it in the straightforward way (following the comments above):
filter --> rawoutput, then dataAvailableCallback, then Data.withUnsafeBytes, then CVPixelBufferCreateWithBytes
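Those steps can be sketched roughly as follows (a hedged sketch: the 1280×720 BGRA dimensions are assumptions, and RawDataOutput's callback is assumed to deliver one tightly packed BGRA frame per invocation):

```swift
import CoreVideo

// Assumed render size; use your actual filter output dimensions.
let width = 1280
let height = 720

rawOutput.dataAvailableCallback = { bytes in
    var pixelBuffer: CVPixelBuffer?
    bytes.withUnsafeBufferPointer { ptr in
        guard let base = ptr.baseAddress else { return }
        // Note: CVPixelBufferCreateWithBytes wraps the memory without
        // copying it, so only use the buffer while `bytes` is still
        // alive (i.e. inside this closure), or copy into a buffer you own.
        CVPixelBufferCreateWithBytes(
            kCFAllocatorDefault,
            width, height,
            kCVPixelFormatType_32BGRA,
            UnsafeMutableRawPointer(mutating: base),
            width * 4,        // bytesPerRow for tightly packed BGRA
            nil, nil, nil,    // no release callback or attributes
            &pixelBuffer)
    }
    if let buffer = pixelBuffer {
        // Hand `buffer` to Core ML / GVRKit here.
        _ = buffer
    }
}
```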

But as soon as I add the rawoutput as an additional output, it drastically reduces performance: from 20 fps to 15 fps on an iPhone 6 Plus. This doesn't depend on the operations inside dataAvailableCallback.

Is it possible to avoid this behavior? Thanks.
