Support YUV CVPixelBuffer output #11

Closed
notedit opened this issue Aug 7, 2017 · 6 comments

Comments

notedit commented Aug 7, 2017

When using a non-BGRA color format (for example NV12 or I420), it is tedious to do the color format conversion manually. I hope MetalPetal can support this.

omarojo commented Aug 7, 2017

Output to a CVPixelBuffer at the end of the chain would be great, so we could use that output for other things like making videos or publishing to a live stream.

YuAo (Member) commented Aug 7, 2017

The MTIContext object supports rendering an MTIImage to a CVPixelBuffer. However, it can only render to a BGRA pixel buffer for now.
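
For illustration, a minimal sketch of the current BGRA path might look like the following (assuming an MTIContext named context and an MTIImage named outputImage; the renderImage:toCVPixelBuffer:error: call is the one used in the example later in this thread):

NSError *error;
CVPixelBufferRef pixelBuffer;
// Create an IOSurface-backed BGRA pixel buffer matching the image size.
CVPixelBufferCreate(kCFAllocatorDefault,
                    (size_t)outputImage.size.width,
                    (size_t)outputImage.size.height,
                    kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)@{(id)kCVPixelBufferIOSurfacePropertiesKey: @{}},
                    &pixelBuffer);
[context renderImage:outputImage toCVPixelBuffer:pixelBuffer error:&error];
// Use the pixelBuffer, then release it.
CVPixelBufferRelease(pixelBuffer);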

YuAo (Member) commented Aug 7, 2017

Will consider the "render to YUV" feature.

YuAo changed the title from "maybe support CVPixelBuffer output" to "Support YUV CVPixelBuffer output" on Aug 7, 2017
notedit (Author) commented Aug 13, 2017

I see that Core Image supports rendering to a YUV CVPixelBuffer, but it costs too much CPU; more than 100% CPU is used.

  [ciContext render:outimage toCVPixelBuffer:outPixelBuffer];

Is there any way to reduce the CPU usage?
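
One thing that might help, offered only as a sketch and not verified against this specific setup: creating the CIContext with a Metal device keeps the rendering on the GPU instead of falling back to the software renderer. The outimage and outPixelBuffer names below refer to the variables in the snippet above.

@import CoreImage;
@import Metal;

// Sketch only: a Metal-backed CIContext avoids the CPU software renderer.
id<MTLDevice> device = MTLCreateSystemDefaultDevice();
CIContext *ciContext = [CIContext contextWithMTLDevice:device];
// Render the filtered image into the existing YUV pixel buffer.
[ciContext render:outimage toCVPixelBuffer:outPixelBuffer];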

notedit (Author) commented Aug 13, 2017

It seems this is caused by https://github.com/YuAo/YUCIHighPassSkinSmoothing; moving the discussion to YuAo/YUCIHighPassSkinSmoothing#11.

YuAo (Member) commented Jan 9, 2018

@omarojo @notedit Rendering to a YUV pixel buffer is now implemented on supported hardware (MTLFeatureSet_iOS_GPUFamily3_v1). Ref: 2226c0f

It is not fully tested, so use it with caution.

Here's an example:

NSError *error;
CVPixelBufferRef pixelBuffer;
// Create an IOSurface-backed NV12 (420f) pixel buffer matching the output image size.
CVPixelBufferCreate(kCFAllocatorDefault,
                    (size_t)outputImage.size.width,
                    (size_t)outputImage.size.height,
                    kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                    (__bridge CFDictionaryRef)@{(id)kCVPixelBufferIOSurfacePropertiesKey: @{}},
                    &pixelBuffer);
[context renderImage:outputImage toCVPixelBuffer:pixelBuffer error:&error];
// use the pixelBuffer
// ...
// release the pixelBuffer when done
CVPixelBufferRelease(pixelBuffer);
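
Since this is limited to MTLFeatureSet_iOS_GPUFamily3_v1, a capability check before choosing the YUV path could look roughly like this (a sketch; the BGRA fallback is only an illustration, not part of MetalPetal's API):

id<MTLDevice> device = MTLCreateSystemDefaultDevice();
// YUV (NV12) rendering requires GPU family 3; fall back to BGRA otherwise.
BOOL supportsYUVRender = [device supportsFeatureSet:MTLFeatureSet_iOS_GPUFamily3_v1];
OSType pixelFormat = supportsYUVRender ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
                                       : kCVPixelFormatType_32BGRA;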

YuAo closed this as completed on Jan 9, 2018