Support YUV CVPixelBuffer output #11
Output to a CVPixelBuffer at the end of the chain would be great, so we can use that output for other things like making videos or publishing to a live stream.
An MTIContext object supports rendering an MTIImage to a CVPixelBuffer. However, it can only render to a BGRA pixel buffer for now.
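The BGRA path described above can be sketched as follows. This is a minimal sketch, not code from the thread: it assumes MetalPetal's throwing `MTIContext(device:)` initializer and a `render(_:to:)` method that writes an `MTIImage` into a CVPixelBuffer, and the `image` variable stands in for whatever filter chain output you have.

```swift
import CoreVideo
import Metal
import MetalPetal  // assumption: the MetalPetal framework module

// Create a BGRA pixel buffer to render into. Metal compatibility and an
// IOSurface backing are required for GPU rendering into the buffer.
var pixelBuffer: CVPixelBuffer?
let attrs: [CFString: Any] = [
    kCVPixelBufferMetalCompatibilityKey: true,
    kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
]
CVPixelBufferCreate(kCFAllocatorDefault,
                    1280, 720,
                    kCVPixelFormatType_32BGRA,
                    attrs as CFDictionary,
                    &pixelBuffer)

let device = MTLCreateSystemDefaultDevice()!
let context = try MTIContext(device: device)

// `image` is the MTIImage at the end of your filter chain (hypothetical here).
try context.render(image, to: pixelBuffer!)
```

The resulting `pixelBuffer` can then be handed to AVAssetWriter or a live-streaming SDK without any extra CPU-side copy.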
Will consider the "render to YUV" feature.
I see Core Image supports rendering to a YUV CVPixelBuffer, but it costs too much CPU: more than 100% CPU is used.
Is there any way to reduce the CPU usage?
It seems it is caused by https://github.com/YuAo/YUCIHighPassSkinSmoothing; moving the discussion to YuAo/YUCIHighPassSkinSmoothing#11.
@omarojo @notedit Rendering to a YUV pixel buffer is implemented on supported hardware (MTLFeatureSet_iOS_GPUFamily3_v1). Ref: 2226c0f. Not fully tested, use with caution. Here's an example:
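The example code itself appears to have been lost in extraction. A minimal sketch of what rendering to an NV12 (biplanar YUV) buffer would look like, assuming the same hypothetical `MTIContext.render(_:to:)` API as for BGRA and that the device supports MTLFeatureSet_iOS_GPUFamily3_v1:

```swift
import CoreVideo
import Metal
import MetalPetal  // assumption: the MetalPetal framework module

// Create an NV12 (420f, biplanar full-range YUV) pixel buffer.
var yuvBuffer: CVPixelBuffer?
let attrs: [CFString: Any] = [
    kCVPixelBufferMetalCompatibilityKey: true,
    kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
]
CVPixelBufferCreate(kCFAllocatorDefault,
                    1280, 720,
                    kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                    attrs as CFDictionary,
                    &yuvBuffer)

let device = MTLCreateSystemDefaultDevice()!

// Writing to YUV textures requires MTLFeatureSet_iOS_GPUFamily3_v1
// (A9 and later); check before relying on this path.
guard device.supportsFeatureSet(.iOS_GPUFamily3_v1) else {
    fatalError("YUV render target not supported on this device")
}

let context = try MTIContext(device: device)

// `image` is the MTIImage at the end of your filter chain (hypothetical here).
try context.render(image, to: yuvBuffer!)
```

Getting NV12 directly avoids the CPU-bound BGRA-to-YUV conversion mentioned earlier in the thread, since most video encoders and streaming pipelines consume NV12 natively.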
When using a non-BGRA color format (for example NV12 or I420), doing the color format conversion is tedious. Hope MetalPetal can support this.