MetalANGLE.framework first impressions #16
🤦 I forgot to set up the MGLContext first! I'm trying to use
After adapting to the API differences above, I was able to run my app and everything renders as expected. Very nicely done! One of the reasons I'm looking into MetalANGLE is that Apple's GLES is stuck at 3.0 and doesn't support
Thanks for the issue report. Yes, the missing EAGL-equivalent APIs seem simple to implement; I will add them later. So you managed to make your project work without these APIs? That's great to know. It's possible to implement
Which MGLKit header can I use for this? Before I only had
Yes, it would be enough for me. Can you point me to where in the code I could get started trying to add these new formats?
Great, I will look forward to it. If you are busy then I can try to implement it also. But in that case I need some pointers to get familiar with where MGL defines formats and how/where I can start adding the 16-bit versions.
How do you intend to use 16-bit formats? Use them for OpenGL textures or for the iOS view layer/default framebuffer?
Oops, I missed your comment. If you want to implement it, you can take a look at this commit: https://chromium.googlesource.com/angle/angle/+/25ab4510787f247ca364a052f7b3389ed7311d7a.
This json file is used to generate the Metal format conversion code; every time it is modified, the codegen script has to be rerun. It would be great if this extension implementation could be tested in your project.
Thank you for those pointers!
I am using libmpv to render video. It uploads the video frame planes into OpenGL textures and then uses shaders to render into a framebuffer for playback. For HDR videos, each color channel is 10-bit, so it's not possible to render the colors correctly with only 8-bit textures for processing.
Actually, GLES3 has a 10-bit RGB format (alpha is 2 bits): GL_RGB10_A2.
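Allocating one is straightforward in core GLES3; a minimal sketch (not MetalANGLE-specific, with placeholder dimensions):

```objc
#import <MetalANGLE/GLES3/gl3.h>

// Assumes a current GLES3 context. GL_RGB10_A2 is core in GLES3, paired with
// GL_UNSIGNED_INT_2_10_10_10_REV for uploads.
GLsizei width = 1920, height = 1080; // placeholder dimensions
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
             GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Half-float alternative if more precision/headroom is needed:
// glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
//              GL_RGBA, GL_HALF_FLOAT, NULL);
```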
I tried your suggestion of adding the missing formats to the json file, and it works just as expected! (It took me longer to set up python2, depot_tools, etc. so I could run the codegen script :) Right away I see the extension is being advertised, and libmpv uses it:
I cannot believe how easy it was. Really appreciate you guiding me through this process and doing the hard work of figuring out what needed to be changed and where. What is the best way for me to contribute these changes? Can I send a PR here, or do I need to send a CL upstream? I have signed the Google CLA already. Now that the correct texture format is being used, next I need to find a way to set the colorspace on the underlying Metal layer so that the colors are shown correctly on the display. For example, on https://developer.apple.com/documentation/metal/drawable_objects/displaying_hdr_content_in_a_metal_layer/using_color_spaces_to_display_hdr_content?language=objc:
Is there any way for me to reach in and get the underlying Another improvement I would like to figure out: hardware videotoolbox decoding interop. Currently, with GLKit, I can take a Instead, there is another function I would really appreciate any thoughts you have on approaches and implementation here. Thanks again!
I saw this in
But still I don't see it. It seems that to go down this route we would add another. But on https://www.khronos.org/registry/EGL/extensions/KHR/EGL_KHR_gl_colorspace.txt it talks about
Another idea would be to get an IOSurface from the CVPixelBuffer using CVPixelBufferGetIOSurface. Then maybe that could be passed in using the existing EGL_ANGLE_iosurface_client_buffer? To convert the IOSurface to a Metal texture, perhaps https://developer.apple.com/documentation/metal/mtldevice/1433378-newtexturewithdescriptor could be used. I'm not really sure what CVMetalTextureCache does, so it may be more complicated than that.
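In other words, something like the rough sketch below is what I have in mind. The attribute list comes from my reading of ANGLE's EGL_ANGLE_iosurface_client_buffer spec, and the EGL header paths assume the MetalANGLE framework layout; whether the Metal backend accepts this exact combination (and the GL_TEXTURE_RECTANGLE_ANGLE target) is an assumption I haven't verified:

```objc
#import <CoreVideo/CoreVideo.h>
#import <IOSurface/IOSurfaceRef.h>
#import <MetalANGLE/EGL/egl.h>
#import <MetalANGLE/EGL/eglext.h>
#import <MetalANGLE/GLES2/gl2.h>
#import <MetalANGLE/GLES2/gl2ext.h>

// Sketch, not tested: wrap the luma plane of an IOSurface-backed CVPixelBuffer
// as an EGL pbuffer and bind it to the currently bound GL texture.
static EGLSurface WrapLumaPlane(EGLDisplay display, EGLConfig config,
                                CVPixelBufferRef pixelBuffer)
{
    IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
    if (!surface)
        return EGL_NO_SURFACE; // the pixel buffer must be IOSurface-backed

    const EGLint attribs[] = {
        EGL_WIDTH,                         (EGLint)IOSurfaceGetWidthOfPlane(surface, 0),
        EGL_HEIGHT,                        (EGLint)IOSurfaceGetHeightOfPlane(surface, 0),
        EGL_IOSURFACE_PLANE_ANGLE,         0,                           // plane 0 = Y
        EGL_TEXTURE_TARGET,                EGL_TEXTURE_RECTANGLE_ANGLE, // per the spec
        EGL_TEXTURE_INTERNAL_FORMAT_ANGLE, GL_RED_EXT,                  // single-channel plane
        EGL_TEXTURE_FORMAT,                EGL_TEXTURE_RGBA,
        EGL_TEXTURE_TYPE_ANGLE,            GL_UNSIGNED_BYTE,
        EGL_NONE,
    };
    EGLSurface pbuffer = eglCreatePbufferFromClientBuffer(
        display, EGL_IOSURFACE_ANGLE, (EGLClientBuffer)surface, config, attribs);
    if (pbuffer != EGL_NO_SURFACE)
        eglBindTexImage(display, pbuffer, EGL_BACK_BUFFER);
    return pbuffer;
}
```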
I see also that you have a TODO for external image support: metalangle/src/libANGLE/renderer/metal/DisplayMtl.mm Lines 572 to 573 in aaa371f
Is this something that could be used to sample from an external Metal texture reference?
It looks like MGLKit is indeed responsible for this: metalangle/ios/xcode/MGLKit/MGLLayer.mm Line 754 in b3b8f45
And this is how the color space is passed in currently: metalangle/ios/xcode/MGLKit/MGLLayer.mm Lines 738 to 739 in b3b8f45
So it seems the best solution may be to add more
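Whatever form the new option takes, the end state I'm hoping for on the underlying layer is roughly the sketch below. How to get at the CAMetalLayer that MGLLayer renders into is the open question (it isn't public today), and the colorspace property's availability on CAMetalLayer varies by OS version and platform:

```objc
#import <QuartzCore/CAMetalLayer.h>
#import <Metal/Metal.h>

// Sketch only: configure a CAMetalLayer for wide-gamut/HDR output. This just
// shows the two properties I'd want to end up setting on MGLLayer's backing layer.
static void ConfigureLayerForHDR(CAMetalLayer *metalLayer)
{
    CGColorSpaceRef hdrColorSpace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_2020);
    metalLayer.pixelFormat = MTLPixelFormatRGBA16Float; // 16-bit float drawable
    metalLayer.colorspace  = hdrColorSpace;             // how the system should interpret it
    CGColorSpaceRelease(hdrColorSpace);
}
```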
Update: Seems like
Update v2: I just realized that IOSurface was disabled on tvOS in the recent commit 1964ec0. This was because someone reported that IOSurface is a private API before tvOS 13.0. Since I support tvOS 11.0+ by default, the easiest way was to just disable it. If you want to use IOSurface, perhaps special Xcode targets
Thank you! I had just started to figure that out. I think I understand all the pieces required for my goal, so I will be working on implementing it this week and will be sending you some PRs.
FYI, today I built MetalANGLE in
It seems related to my use of GLContext sharegroups across threads. I noticed MGLContext is using TLS, so maybe that is having a bad interaction with my threaded usage. I plan to investigate further later and make a repro or fix PR.
I have recently received some more requests for supporting importing external textures into MetalANGLE. Besides, if this new extension were to be implemented, another new extension would need to be implemented as well, i.e. something similar to EGL_ANGLE_device_d3d, in order to query the Metal device used by MetalANGLE so that the external textures could be created from the same device.
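For the sake of discussion, querying the device would presumably follow the EGL_EXT_device_query pattern, roughly as sketched below. EGL_METAL_DEVICE_ANGLE is a placeholder token for an extension that does not exist yet, and its value here is made up; the EGL header paths assume the MetalANGLE framework layout:

```objc
#import <Metal/Metal.h>
#import <MetalANGLE/EGL/egl.h>
#import <MetalANGLE/EGL/eglext.h>

#ifndef EGL_METAL_DEVICE_ANGLE
#define EGL_METAL_DEVICE_ANGLE 0x34A6 // placeholder value for a hypothetical extension token
#endif

// Sketch of a hypothetical "EGL_ANGLE_device_metal"-style query, modeled on
// EGL_ANGLE_device_d3d plus the existing EGL_EXT_device_query entry points.
static id<MTLDevice> QueryMetalDevice(EGLDisplay display)
{
    PFNEGLQUERYDISPLAYATTRIBEXTPROC queryDisplayAttrib =
        (PFNEGLQUERYDISPLAYATTRIBEXTPROC)eglGetProcAddress("eglQueryDisplayAttribEXT");
    PFNEGLQUERYDEVICEATTRIBEXTPROC queryDeviceAttrib =
        (PFNEGLQUERYDEVICEATTRIBEXTPROC)eglGetProcAddress("eglQueryDeviceAttribEXT");
    if (!queryDisplayAttrib || !queryDeviceAttrib)
        return nil;

    EGLAttrib device = 0;
    if (!queryDisplayAttrib(display, EGL_DEVICE_EXT, &device))
        return nil;

    EGLAttrib mtlDevice = 0;
    if (!queryDeviceAttrib((EGLDeviceEXT)device, EGL_METAL_DEVICE_ANGLE, &mtlDevice))
        return nil;

    return (__bridge id<MTLDevice>)(void *)mtlDevice;
}
```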
I'm trying to implement a mechanism for importing external textures into MetalANGLE. However, when importing a texture I need to know its format. Do you have a list of the formats that you are currently using with
I have been playing with
For the GLES variant, the common formats are documented:
For 10-bit formats, I think the only way is to use GL_RGBA16F and GL_HALF_FLOAT_OES.
How do you mean?
One thing which is not documented: in GLES2 mode you can use GL_RED or GL_RG for a U or UV plane, but if you use GLES3, it starts to fail. The workaround is to use GL_LUMINANCE and GL_LUMINANCE_ALPHA. See http://stackoverflow.com/q/36213994/332798 and https://stackoverflow.com/a/8653891/332798
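For concreteness, the GL_LUMINANCE / GL_LUMINANCE_ALPHA per-plane path looks roughly like this with GLKit's texture cache today. This is a sketch for an NV12 (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) buffer; cache creation and error handling are omitted:

```objc
#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES2/gl.h>

// Sketch: import the two planes of an NV12 CVPixelBuffer as GL textures via a
// pre-created CVOpenGLESTextureCacheRef.
static void ImportNV12Planes(CVOpenGLESTextureCacheRef cache,
                             CVPixelBufferRef pixelBuffer,
                             CVOpenGLESTextureRef *lumaOut,
                             CVOpenGLESTextureRef *chromaOut)
{
    GLsizei width  = (GLsizei)CVPixelBufferGetWidth(pixelBuffer);
    GLsizei height = (GLsizei)CVPixelBufferGetHeight(pixelBuffer);

    // Plane 0: Y, single channel.
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_LUMINANCE, width, height,
        GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, lumaOut);

    // Plane 1: interleaved CbCr at half resolution, two channels.
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, width / 2, height / 2,
        GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, chromaOut);

    glBindTexture(CVOpenGLESTextureGetTarget(*lumaOut),
                  CVOpenGLESTextureGetName(*lumaOut));
}
```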
Regarding YpCbCr, this is helpful context: https://developer.apple.com/documentation/accelerate/conversion/understanding_ypcbcr_image_formats. These video frames are backed by an IOSurface with multiple data planes. The CVPixelBuffer wraps the IOSurface and has flags which allow import/export to either Metal or GLES. This is detailed in https://developer.apple.com/documentation/metal/mixing_metal_and_opengl_rendering_in_a_view. Here is an example of
I meant that if you use a direct YUV422 format in Metal then it is not supported yet. For example, there is a format
Hm, somehow I never saw this in the documentation before. Maybe it's new. Sounds interesting for some use cases, but I think most applications will still prefer mapping the underlying planes directly rather than repacking/resampling.
Also, there are YUV pixel formats in Metal, for example: https://developer.apple.com/documentation/metal/mtlpixelformat/gbgr422
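For reference, creating a texture with one of those packed formats directly in Metal looks roughly like the sketch below (unrelated to the GL path; packed 4:2:2 formats require an even width and no mipmaps):

```objc
#import <Metal/Metal.h>

// Sketch: a 4:2:2 packed texture using MTLPixelFormatGBGR422.
id<MTLDevice> device = MTLCreateSystemDefaultDevice();
MTLTextureDescriptor *desc =
    [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatGBGR422
                                                        width:1920   // must be even
                                                       height:1080
                                                    mipmapped:NO];
id<MTLTexture> yuvTexture = [device newTextureWithDescriptor:desc];
```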
I have added a new extension
This is just one of the ways a Metal texture can be imported. There are other ways, such as implementing a new target type for
This is the extension specification's draft https://github.com/kakashidinho/metalangle/blob/b5b41eecf1ea8ae4e416e429ccbd5991d50c71e2/extensions/EGL_MGL_texture_client_buffer.txt |
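Presumably usage follows the usual EGL client-buffer pattern, something like the sketch below. The EGL_MTL_TEXTURE_MGL token name and value are assumptions on my part, and the header paths assume the MetalANGLE framework layout; the spec draft above is the authority for the real attribute names and targets:

```objc
#import <Metal/Metal.h>
#import <MetalANGLE/EGL/egl.h>
#import <MetalANGLE/EGL/eglext.h>

#ifndef EGL_MTL_TEXTURE_MGL
#define EGL_MTL_TEXTURE_MGL 0x3456 // assumed token; check the spec draft for the real value
#endif

// Sketch (token names assumed, see the spec draft): wrap an existing MTLTexture
// as an EGL pbuffer and bind it to the currently bound GL texture.
static EGLSurface WrapMetalTexture(EGLDisplay display, EGLConfig config,
                                   id<MTLTexture> mtlTexture)
{
    const EGLint attribs[] = {
        EGL_WIDTH,  (EGLint)mtlTexture.width,
        EGL_HEIGHT, (EGLint)mtlTexture.height,
        EGL_NONE,
    };
    EGLSurface pbuffer = eglCreatePbufferFromClientBuffer(
        display, EGL_MTL_TEXTURE_MGL, (__bridge EGLClientBuffer)mtlTexture,
        config, attribs);
    if (pbuffer != EGL_NO_SURFACE)
        eglBindTexImage(display, pbuffer, EGL_BACK_BUFFER);
    return pbuffer;
}
```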
Thanks for mentioning this wonderful framework. But I'm hitting an empty-screen issue; it seems I'm missing some MetalANGLE setup. My demo is based on mpv and shows an error. I replaced GLKView with MGLKView:
```objc
#import <MetalANGLE/MGLKit.h>
#import <MetalANGLE/MGLContext.h>
#import <MetalANGLE/MGLKView.h>
#import <MetalANGLE/GLES2/gl2.h>
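
// Resolve GL symbols for mpv from the MetalANGLE framework bundle: MetalANGLE
// exports the standard GLES entry points, and (as noted later in this thread)
// its bundle identifier is com.google.OpenGLES.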
static void *get_proc_address(void *ctx, const char *name)
{
CFStringRef symbolName = CFStringCreateWithCString(kCFAllocatorDefault, name, kCFStringEncodingASCII);
void *addr = CFBundleGetFunctionPointerForName(CFBundleGetBundleWithIdentifier(CFSTR("com.google.OpenGLES")), symbolName);
CFRelease(symbolName);
NSLog(@"get_proc_address %s => %p", name, addr);
return addr;
}
@interface MpvClientOGLView : MGLKView
@property mpv_opengl_cb_context *mpvGL;
@end
@implementation MpvClientOGLView {
GLint defaultFBO;
}
- (void)awakeFromNib
{
[super awakeFromNib];
self.context = [[MGLContext alloc] initWithAPI:kMGLRenderingAPIOpenGLES2];
if (!self.context) {
NSLog(@"Failed to initialize OpenGLES 3.0 context");
}
[MGLContext setCurrentContext:self.context];
// Configure renderbuffers created by the view
self.drawableColorFormat = MGLDrawableColorFormatRGBA8888;
self.drawableDepthFormat = MGLDrawableDepthFormatNone;
self.drawableStencilFormat = MGLDrawableStencilFormatNone;
defaultFBO = -1;
}
- (void)fillBlack
{
glClearColor(0, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT);
}
- (void)drawRect
{
if (defaultFBO == -1)
{
GLint i = 0;
glGetIntegerv(GL_FRAMEBUFFER_BINDING, &i);
defaultFBO = (i != 0) ? i : 1;
}
if (self.mpvGL)
{
mpv_opengl_cb_draw(self.mpvGL,
defaultFBO,
self.bounds.size.width * self.contentScaleFactor,
-self.bounds.size.height * self.contentScaleFactor);
}
}
- (void)drawRect:(CGRect)rect
{
[self drawRect];
}
@end
```
Just realized that I need to rewrite video/out/opengl/hwdec_ios.m and link MetalANGLE to build libmpv.
Unfortunately it cannot work with hwdec=videotoolbox yet, so you must use hwdec=videotoolbox-copy.
I thought they added OpenGLES support in Big Sur. But maybe it only works for unmodified iOS apps and is not available to macCatalyst: https://twitter.com/stroughtonsmith/status/1286071942118879233?s=21
Yes, the OpenGL ES runtime is there on Big Sur for macCatalyst, but it doesn't work at compile time for Intel/ARM macCatalyst. It's now a private API.
Is it because of Metal texture interop?
There are some missing APIs that mpv's videotoolbox support uses.
Hi @kakashidinho, thanks so much for your work on this project!
I'm trying to replace openGL with MetalANGLE on a tvOS project. First I tried simply to import MetalANGLE.framework into my project. It kept throwing "image not found" errors, until I realized I need to embed/codesign the framework into the product.
Once I was able to start my app, I changed my getProcAddr to use com.google.OpenGLES. Now I can call, for example, glGetProcAddr("glGetString") and get a valid address. I checked in the debugger and confirmed I'm getting an address from inside MetalANGLE. Next I tried to run glGetString(GL_VERSION), but I only get back NULL. Same thing with GL_EXTENSIONS. What am I doing wrong?
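As noted near the top of the thread, the resolution was to set up an MGLContext before making any GL calls; without a current context, glGetString returns NULL. A minimal sketch:

```objc
#import <MetalANGLE/MGLKit.h>
#import <MetalANGLE/GLES2/gl2.h>

// A context must be current on the calling thread before GL entry points work.
MGLContext *context = [[MGLContext alloc] initWithAPI:kMGLRenderingAPIOpenGLES2];
[MGLContext setCurrentContext:context];

const GLubyte *version = glGetString(GL_VERSION); // now returns a real string
NSLog(@"GL_VERSION: %s", version);
```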