
video: Expose HDR metadata via window properties #9914

Merged · 1 commit merged into libsdl-org:main · Jun 20, 2024

Conversation

@Kontrabant (Contributor) commented May 28, 2024

  • Makes the HDR and colorspace properties per-window, in addition to per-display.
  • Passes through color primary points, white points, and min/max luminance levels, when available.
  • Adds the frog_color Wayland protocol to pass HDR metadata through to client applications.

Still needs work, but it's functional and should serve as a starting point.

Resolves #6587

@Kontrabant force-pushed the sdl_frog_color branch 4 times, most recently from 508e551 to 6a1e55e on May 28, 2024 21:24
@flibitijibibo (Collaborator) left a comment

lgtm - my only real thought is that it's a lot of properties that the client will almost certainly want to query all at once, but I dunno how easy it is to pack values with the current properties system and it's not a huge deal to have a big block o' queries to get everything we need in any case.

@slouken (Collaborator) commented May 29, 2024

Let's hold off on merging this; I'd like to see how the HDR demo needs/uses this information. testcolorspace and testffmpeg do HDR rendering with the existing API.

I think if we're going to expose all of these properties, we should probably make SDL_HDRProperties public and add an API function to set/get them for the window.

@Kontrabant (Contributor, Author):

Let's hold off on merging this

No hurry, I just wanted to get something usable for the people working on projects that need the extra HDR metadata, since it was brought up. This will definitely be iterated upon.

@Kontrabant force-pushed the sdl_frog_color branch 3 times, most recently from d32dacc to 3941f28 on June 2, 2024 19:06
@Kontrabant (Contributor, Author) commented Jun 2, 2024

Due to the number of parameters returned, I changed from individual properties to making the SDL_HDRProperties struct public, and added SDL_GetDisplayHDRProperties() and SDL_GetWindowHDRProperties().

The documentation stresses that retrieving it from the display should only be used for informational purposes, such as logging the system configuration for troubleshooting or noting that a specific display is HDR capable in a selection list, but I'm open to completely removing it too, if desired.

The remaining tests were updated to use the per-window API, and testdisplayinfo now prints display color and luminance info in addition to the available modes.
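
For reference, a query at this stage of the PR would look something like this (a sketch against the interim API proposed here; the exact signature and return convention weren't final):

SDL_HDRProperties hdr;
SDL_zero(hdr);

/* Hypothetical usage of the interim SDL_GetWindowHDRProperties() proposed
   in this revision; assumes it fills the struct and returns 0 on success. */
if (SDL_GetWindowHDRProperties(window, &hdr) == 0) {
    SDL_Log("SDR white level: %f, HDR headroom: %f",
            hdr.SDR_white_level, hdr.HDR_headroom);
}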

@flibitijibibo (Collaborator) left a comment

New version gets a 👍 from me - I'm actually planning to use HDRProperties for GPU's SetHDRMetadata so the current API works out perfectly for us.

@slouken (Collaborator) commented Jun 3, 2024

I'm still curious how this information will actually be consumed.

@flibitijibibo (Collaborator):

We're still barrelling towards GPU Alpha so we don't have the precise example, but MSDN does have documentation for using HDR metadata to do tonemapping with the information (though in this case they're using DXGI; we could bypass this in SDL's case):

https://learn.microsoft.com/en-us/windows/win32/direct3darticles/high-dynamic-range#adapt-hdr-content-to-the-displays-capabilities-using-tone-mapping

The main reason we aren't building tonemapping directly into the GPU swapchain API is this:

Tonemapping is inherently a lossy process, and can be optimized for a number of perceptual or objective metrics, so there's no single standard algorithm. Windows provides a built-in HDR tonemapper effect as part of Direct2D as well as in the Media Foundation HDR video playback pipeline. Some other commonly used algorithms include ACES Filmic, Reinhard, and ITU-R BT.2390-3 EETF (electrical-electrical transfer function).

Our current tonemap.c example does use many of those algorithms for translating between internal formats, but does not yet do the additional correction that translates mastering HDR metadata to that of the display.
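
For illustration, the core of one of those operators is only a few lines; a minimal sketch of extended Reinhard, operating on scene luminance (max_white, the content's peak level, is an illustrative parameter):

/* Extended Reinhard: maps luminance L in [0, max_white] into [0, 1],
   rolling off highlights instead of hard clipping them. */
static float tonemap_reinhard_extended(float L, float max_white)
{
    float numerator = L * (1.0f + (L / (max_white * max_white)));
    return numerator / (1.0f + L);
}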

@Kontrabant Kontrabant force-pushed the sdl_frog_color branch 3 times, most recently from 888873d to bd40c84 Compare June 4, 2024 13:29
Comment on lines 107 to 131
typedef struct SDL_HDRProperties
{
SDL_FPoint red_primary; /**< the red primary x/y coordinates */
SDL_FPoint green_primary; /**< the green primary x/y coordinates */
SDL_FPoint blue_primary; /**< the blue primary x/y coordinates */
SDL_FPoint white_point; /**< the white point x/y coordinates */
float min_luminance; /**< the min luminance level */
float max_luminance; /**< the max luminance level */
float max_full_frame_luminance; /**< the max full-frame luminance */
float SDR_white_level; /**< the SDR white level */
float HDR_headroom; /**< the additional dynamic range that can be displayed in terms of the SDR white level (1.0 if HDR is not enabled) */
} SDL_HDRProperties;
Review comment:

If you already have min/max luminance and SDR white level then headroom is redundant. On the other hand, just the headroom is not sufficient to describe dynamic range in the blacks.

I suspect the primaries and min/max luminance are describing the target color volume (https://gitlab.freedesktop.org/pq/color-and-hdr/-/blob/main/doc/mastering_display_metadata.md) and not the min/max luminances of the encoding.

The min/max luminances of the encoding must be known to make sense of the SDR white level. PQ defines this but everything else does not, which means that this needs to be communicated.

The reference white level and min/max luminance of the encoding are requirements to properly anchor content and tone map content. The target color volume is merely an optimization to reduce the mapping distance.

Really, none of that depends on "HDR"-ness.

Thus, what you probably should do is this:

Suggested change
typedef struct SDL_HDRProperties
{
SDL_FPoint red_primary; /**< the red primary x/y coordinates */
SDL_FPoint green_primary; /**< the green primary x/y coordinates */
SDL_FPoint blue_primary; /**< the blue primary x/y coordinates */
SDL_FPoint white_point; /**< the white point x/y coordinates */
float min_luminance; /**< the min luminance level */
float max_luminance; /**< the max luminance level */
float max_full_frame_luminance; /**< the max full-frame luminance */
float SDR_white_level; /**< the SDR white level */
float HDR_headroom; /**< the additional dynamic range that can be displayed in terms of the SDR white level (1.0 if HDR is not enabled) */
} SDL_HDRProperties;
typedef struct SDL_TargetColorVolume
{
SDL_FPoint red_primary; /**< the red primary x/y coordinates */
SDL_FPoint green_primary; /**< the green primary x/y coordinates */
SDL_FPoint blue_primary; /**< the blue primary x/y coordinates */
SDL_FPoint white_point; /**< the white point x/y coordinates */
float min_luminance; /**< the min luminance level */
float max_luminance; /**< the max luminance level */
float max_full_frame_luminance; /**< the max full-frame luminance */
} SDL_TargetColorVolume;
typedef struct SDL_EncodingLuminances
{
float min_luminance; /**< the min luminance of the encoding in nits */
float max_luminance; /**< the max luminance of the encoding in nits */
float reference_white_level; /**< the reference white level in nits */
} SDL_EncodingLuminances;

where SDL_EncodingLuminances is either required or implied, and SDL_TargetColorVolume is optional.

@Kontrabant (Contributor, Author):

The color point and luminance level info is now optional, with a flag indicating whether or not the platform provides this information. On platforms like macOS, which doesn't expose color points or luminance levels, this is preferable to using default values that are almost certainly wrong.

display->HDR.HDR_headroom = 1.0f;
}

SDL_SendDisplayEvent(display, SDL_EVENT_WINDOW_HDR_STATE_CHANGED, 0);
Review comment (Collaborator):

The HDR state change isn't a display event. We should either have a distinct event for displays or only send it to windows.

@@ -31,11 +31,32 @@ print_mode(const char *prefix, const SDL_DisplayMode *mode)
SDL_GetPixelFormatName(mode->format));
}

static void print_hdr(SDL_HDRProperties *props)
Review comment (Collaborator):

Suggested change
static void print_hdr(SDL_HDRProperties *props)
static void print_HDR(SDL_HDRProperties *props)

It's an acronym

Comment on lines 129 to 130
float SDR_white_level; /**< the SDR white level */
float HDR_headroom; /**< the additional dynamic range that can be displayed in terms of the SDR white level (greater than 1.0 if the output is EDR/HDR capable) */
Review comment:

What's the difference between the white level and headroom? The headroom works with any encoding and without knowing min/max absolute levels, but the HDR metadata luminances require knowing either the absolute level of the white level or min/max absolute levels. The white level gives the headroom if one assumes black levels are always the same and if one knows the min/max absolute levels.

IOW, both require min/max luminance levels to work properly, but once they are available, they are equal (ignoring the black level issue).

So why have both values? Why not just

typedef struct SDL_HDRProperties
{
  SDL_bool has_color_data;
  SDL_HDRColorData color_data;
  float min_luminance;
  float max_luminance;
  float reference_white_level;
} SDL_HDRProperties;

For the HDRColorData you're also not using headroom anyway...

@Kontrabant (Contributor, Author) commented Jun 7, 2024

Apple platforms don't actually provide the color/white points or min/max luminance data for the window surface or display (if there is a way to get it without parsing the EDID, it's not obvious), just a singular ceiling for color values that a client application should clamp its values to, as macOS/iOS seem to insist on handling any tone mapping behind the scenes. In this case, the headroom will be set with that maximum value, but the color points and luminance values will be unset, since they are unknown.

The SDR white level is used by SDL's internal renderers, and Windows explicitly provides the value it wants used for this (there is a slider in the settings), so it needs to be passed through, as it doesn't necessarily correspond to, and can't necessarily be derived from, any other reported values.

@Kontrabant (Contributor, Author):

A couple of further refinements:

  • Renamed the struct to SDL_HDROutputProperties, to better reflect what it represents.
  • The event for the display HDR properties changing has been dropped.
  • The color and luminance struct is no longer declared outside of the properties struct, as it really has no separate purpose other than providing an enclosed namespace for optional parameters not exposed by every platform. A future SDL_SetWindowHDRMetadata function would need its own HDR metadata struct as the parameters required for the setting case are slightly different.
  • Added some reserved padding to expand the struct in the future if needed.

Marking as ready for review.

@Kontrabant marked this pull request as ready for review on June 17, 2024 16:01
@slouken (Collaborator) commented Jun 17, 2024

I'm still looking for the real-world use case for this API.

@Kontrabant (Contributor, Author):

I'll defer to Ethan for their specific needs regarding the GPU API, but in general, anything doing its own HDR tone mapping or color correction will need this information, and there is currently no cross-platform method for retrieving it (see #6587). Currently, anything using Vulkan* or GL has to punch through to the underlying windowing system to retrieve it; getting this information for lower-level interfaces like kmsdrm would involve clients parsing EDID data themselves; and even on Windows when using DX, a simple SDL function would be much more convenient than dealing with DXGI directly, since SDL already has the backend code to do so.

To cite prior art, SDL has long provided a function for retrieving the per-window ICC profile information for similar reasons.

*A hint was dropped in the Wayland color protocol discussion that this functionality may be in progress for Vulkan via an extension, but there is nothing stated publicly beyond that.
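
For reference, the ICC profile query mentioned above is a one-call affair (SDL_GetWindowICCProfile() is existing SDL API; error handling kept minimal):

/* Returns the raw ICC profile data for the window's display;
   the caller owns the buffer and must SDL_free() it. */
size_t icc_size = 0;
void *icc_data = SDL_GetWindowICCProfile(window, &icc_size);
if (icc_data) {
    /* hand the icc_size bytes to a color management library here... */
    SDL_free(icc_data);
}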

@slouken (Collaborator) commented Jun 17, 2024

Well, Steam Link uses the HDR headroom to support HDR tone mapping on Windows and Apple platforms, and doesn't need the monitor colorimetry, since the captured content is always in the BT709 colorspace. I'm wondering what display system doesn't use BT709 or ST2084 for the content output colorspace?

@flibitijibibo (Collaborator):

Steam Link might actually be a good way of looking at it - as I've been futzing with HDR I've had some questions about streaming support and this feature falls pretty much right in line with them:

Let's say we've got a host PC and a client PC, both attached to HDR10-capable displays. The important detail is that the displays are different makes and models. Therefore, the RGB/brightness levels are all completely different values (i.e. the peak brightness of the client display is much higher, and the red/blue primary values do not match between host and client).

  • When an application sets metadata (via vkSetHdrMetadataEXT for example), does the streaming system track that data and pass it to the client PC, so that the client PC can then complete the delivery of the mastering metadata in the HDMI infoframe (or DP, whichever is applicable)?
  • When an application requests metadata, does the streaming system override the host's display data with the client's, so that the in-application tonemapping matches the display actually being viewed? Or does the client just have to live with whatever colorimetry the host has and find a way to manually tonemap, whether the application internally tonemaps or not (possibly resulting in two lossy conversions rather than just one)?

The colorspace may be the same but the metadata involved isn't, and if any blockers exist between the application and the final display output it would make automatic color correction virtually impossible, particularly for the streaming case.
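
For context, the metadata-setting path in the first bullet looks roughly like this with VK_EXT_hdr_metadata (a sketch; the primary and luminance values are illustrative HDR10 mastering numbers, not anything this PR defines):

/* Pass HDR10 mastering metadata for a swapchain to the presentation stack. */
VkHdrMetadataEXT metadata = { 0 };
metadata.sType = VK_STRUCTURE_TYPE_HDR_METADATA_EXT;
metadata.displayPrimaryRed   = (VkXYColorEXT){ 0.708f, 0.292f };   /* BT.2020 red */
metadata.displayPrimaryGreen = (VkXYColorEXT){ 0.170f, 0.797f };   /* BT.2020 green */
metadata.displayPrimaryBlue  = (VkXYColorEXT){ 0.131f, 0.046f };   /* BT.2020 blue */
metadata.whitePoint          = (VkXYColorEXT){ 0.3127f, 0.3290f }; /* D65 */
metadata.maxLuminance = 1000.0f;             /* mastering display peak, nits */
metadata.minLuminance = 0.005f;              /* mastering display black, nits */
metadata.maxContentLightLevel = 1000.0f;     /* MaxCLL */
metadata.maxFrameAverageLightLevel = 400.0f; /* MaxFALL */

vkSetHdrMetadataEXT(device, 1, &swapchain, &metadata);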

@slouken (Collaborator) commented Jun 17, 2024

At least on Windows, the D3D swap chain is in sRGB, scRGB, or PQ colorspace, and Windows handles transforming those colorspaces into what the monitor can actually display, so you don't need to query the monitor colorspace at all. You do need the white level and relative brightness though, so you can map from the values on one system to the values on another system. There are a bunch of different tone mapping algorithms, but they all try to preserve color and brightness range as much as possible, and this is done assuming one of the colorspaces above. Steam Deck works the same way, gamescope looks at the colorspace on the D3D or Vulkan swap chain and maps that to the display capabilities of the screen.

In our experience, games rarely query or set the monitor colorspace, and when they do, it's often with incorrect values that are then ignored by the game content, which is all authored for sRGB or a specific HDR target. The game then uses shaders to do tone mapping if needed, but again ignores colorspace aside from converting to/from BT2020 for HDR content.

I do understand the use case of video players, which have to handle many different color spaces and want the OS display to handle the conversion as efficiently as possible, but this is rarely used for games.

@Kontrabant (Contributor, Author):

I looked at the KDE HDR implementation just now, and it just sends the default 709 or 2020 color points as well (depending on whether the display is HDR capable), so it looks like current Linux implementations aren't going to be sending anything outside of that, at least not yet.

If the colorspace points aren't necessary, I'll roll this back and just make the headroom and SDR white level properties per-window + add the Wayland support.

If a solid case for needing the advanced color and luminance info across platforms emerges down the road, it can easily be added later, since the backend framework is already in place; or maybe by then whatever new Vulkan extension is allegedly brewing will be ready, and exposing it via SDL won't be necessary.

@swick commented Jun 17, 2024

Please explain how SDR white level and headroom interact with PQ, scRGB and sRGB.

@slouken (Collaborator) commented Jun 18, 2024

For sRGB, SDR white level and headroom are always 1.0.

For scRGB, SDR white level and headroom will vary depending on the HDR settings and display capabilities, but are in units of 80 nits on Windows. A common setting would be a white level of 2.5 and headroom of 2.0 (a white level of 200 nits and display max brightness of 400 nits). On Apple displays, the white level is always 1.0 and the headroom will change dynamically as ambient lighting changes.

The SDL renderer doesn't directly support PQ output, but conceptually it's the same: you'd use a white level in terms of nits and a headroom in terms of that white level. For the hypothetical display above, you'd have a white level of 200 and headroom of 2.0, and you'd tone map content mastered at 100 nits diffuse white and 10000 nits max brightness (100.0 headroom) into that range.
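
Put as arithmetic (a sketch of the relationships described above, not an SDL API):

/* scRGB on Windows: swapchain values are in units of 80 nits. */
float sdr_white_level = 2.5f; /* (1,1,1) in the swapchain = 2.5 * 80 = 200 nits */
float hdr_headroom    = 2.0f; /* peak is expressed relative to the white level */

float white_nits = sdr_white_level * 80.0f;   /* 200 nits */
float peak_nits  = hdr_headroom * white_nits; /* 400 nits */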

@swick commented Jun 18, 2024

For sRGB, SDR white level and headroom are always 1.0.

Okay, so what happens if the window covers multiple displays?

For scRGB, SDR white level and headroom will vary depending on the HDR settings and display capabilities, but are in units of 80 nits on Windows. A common setting would be a white level of 2.5 and headroom of 2.0 (white level of 200 nits and display max brightness of 400 nits) On Apple displays, the white level is always 1.0 and the headroom will change dynamically as ambient lighting changes.

So, on scRGB the white level is a multiplier for (1,1,1) luminance and the headroom is a multiplier for the white level.

Now you're saying that on Apple the white level is always 1.0. Then how does one figure out the level that corresponds to max_luminance and min_luminance? I don't think that's possible.

The SDL renderer doesn't directly support PQ output, but conceptually it's the same, you'd use a white level in terms of nits and headroom in terms of that white level. In the hypothetical display above, you'd have a white level of 200 and headroom of 2.0, and you'd tone map content mastered at 100 nits diffuse white and 10000 nit max brightness (100.0 headroom) into that range.

Does Apple also always have a fixed white level in PQ content, with the system remapping that to another level, similar to what it does for scRGB? I believe it does, but I don't have a system to verify. If that's the case, then the same question as above applies: how does the app translate the min/max_luminance and FLL of the display to a specific level in PQ?

@slouken (Collaborator) commented Jun 18, 2024

Okay, so what happens if the window covers multiple displays?

The application renders using the colorspace associated with the swap chain, and the system converts the output to the range supported by each display.

Now you're saying on apple white level is always at 1.0. Then how does one figure out the level that corresponds to max_luminance and min_luminance? I don't think that's possible.

Correct.

Does apple also always have a fixed white level in PQ content and the system remaps that to another level, similar to what it does for scRGB? I believe it does but I don't have a system to verify. If that's the case, then the same question as above applies: how does the app translate min/max_luminance and fll of the display to a specific level in PQ?

min/max_luminance has no meaning in Apple's dynamic lighting model. There is no way to relate absolute brightness to Apple's displays. You should map your content white level and headroom into the output white level and headroom, and be able to respond to dynamic changes in either.

You need to be able to do this on Windows too; your content will be mastered for a specific display, which may not match the user's monitor. If you have brightness ranges all the way up to 10000 nits, and your game is displayed on a 400-nit monitor, you'll either be clipped or need to tone map into the lower brightness range. The same is true for Apple displays, and headroom is a convenient way to represent this that works on both systems.
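
As a sketch of that mapping (illustrative only; a real tone mapper would use a smoother curve than this linear-plus-clip example):

/* Map a value expressed relative to the content's reference white
   (1.0 = diffuse white, up to content_headroom) into the display's
   available headroom, e.g. 10000-nit mastered content (headroom ~100)
   onto a 400-nit display with a 200-nit white level (headroom 2.0). */
static float map_headroom(float value, float content_headroom, float display_headroom)
{
    if (value <= 1.0f) {
        return value; /* the SDR range passes through unchanged */
    }
    /* naive: linearly rescale the highlight range, then clip */
    float t = (value - 1.0f) / (content_headroom - 1.0f);
    float mapped = 1.0f + t * (display_headroom - 1.0f);
    return (mapped > display_headroom) ? display_headroom : mapped;
}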

@Kontrabant (Contributor, Author):

Reworked the pull to just move the white level and headroom properties to the window and add Wayland support. The old patch with the color and luminance info is stashed away in another branch if needed.

Currently, the displays only expose a simple boolean property that indicates whether or not they are HDR capable. The white level and headroom properties were moved from the display to the window properties instead of being exposed on both, as I wasn't sure if exposing them on the display as well would lead to confusion, despite a note stating that the values should be queried from the window.
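
With this rework, a client queries the values through the window's property set; this matches the API as merged (property names as they appear in SDL3's headers):

SDL_PropertiesID props = SDL_GetWindowProperties(window);

/* True if the window is currently on HDR output. */
SDL_bool hdr_enabled = SDL_GetBooleanProperty(props, SDL_PROP_WINDOW_HDR_ENABLED_BOOLEAN, SDL_FALSE);

/* Both default to 1.0 when HDR is not enabled. */
float sdr_white = SDL_GetFloatProperty(props, SDL_PROP_WINDOW_SDR_WHITE_LEVEL_FLOAT, 1.0f);
float headroom  = SDL_GetFloatProperty(props, SDL_PROP_WINDOW_HDR_HEADROOM_FLOAT, 1.0f);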

/* The reference white level seems to be 1.0 */
switch (transfer_function) {
case FROG_COLOR_MANAGED_SURFACE_TRANSFER_FUNCTION_ST2084_PQ:
case FROG_COLOR_MANAGED_SURFACE_TRANSFER_FUNCTION_SCRGB_LINEAR:
Review comment (Collaborator):

This will be different between ST2084 and SCRGB. There aren't currently any renderers that support PQ output, but the "correct" value for PQ is white level = 100 (or 200, depending on who you ask), clamped to the max luminance, and headroom divided by that.
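
Concretely, for a 400-nit display (a sketch; whether reference white is 100 or 203 nits is debated below):

/* PQ output: the white level is in absolute nits. */
float max_luminance   = 400.0f;                          /* display peak, nits */
float sdr_white_level = SDL_min(100.0f, max_luminance);  /* 100 nits */
float hdr_headroom    = max_luminance / sdr_white_level; /* 4.0 */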

Review comment:

203 nit is pretty much per spec (ITU-R BT.2408). How do people come up with 100 nit?

Review comment (Collaborator):

I don't have access to the spec, but the original ST2084 spec supposedly defined diffuse white as 100 nits:
https://lightillusion.com/what_is_hdr.html

@Kontrabant (Contributor, Author):

The latest ITU-R BT.2408-7 report:

The reference level, HDR Reference White, is defined in this Report as the nominal signal level obtained from an HDR camera and a 100% reflectance white card resulting in a nominal luminance of 203 cd/m2 on a PQ display or on an HLG display that has a nominal peak luminance capability of 1 000 cd/m2.

The publicly available, older Dolby docs that Google turns up have 100 as the reference level.

203 seems to be the current spec, and it looks like the final Wayland color protocol will communicate the reference levels, so guessing games won't be necessary.

Review comment:

The publicly available, older Dolby docs that Google turns up have 100 as the reference level.

Out of curiosity, do you have a link?

@Kontrabant (Contributor, Author):

"Dolby PQ reference white level" turned up this (and some various discussions that also reference this or their other, older specs).
https://professional.dolby.com/siteassets/pdfs/operational-guidelines-for-pq.pdf

The Netflix HDR mastering guidelines cite an SDR reference of 100 nits.

There seems to be quite a bit of conflicting information out there.

@swick commented Jun 19, 2024

Reworked the pull to just move the white level and headroom properties to the window and add Wayland support. The old patch with the color and luminance info is stashed away in another branch if needed.

Will you also hook that up to the wayland color-management protocol?

@swick commented Jun 19, 2024

The application renders using the colorspace associated with the swap chain, and the system converts the output to the range supported by each display.

That requires knowing where the white point is and the dynamic range of the content. If you tell apps that they have a certain headroom on all platforms, but the reference white level is different on Windows than on Wayland and the app just doesn't know that, you'll probably get an app that's broken on Wayland.

I really feel like as long as you don't have a complete story (the app can both query the HDR parameters and set them on a window), you're creating an API that will only help to create brokenness.

The latest change made things slightly better at least.

min/max_luminance has no meaning in Apple's dynamic lighting model. There is no way to relate absolute brightness to Apple's displays.

That's not true at all. You can map an SDR + headroom concept to absolute luminance by mapping 0 to the SDR black level (0.2 nit), 1 to the SDR white level (80 nit), and the value of headroom to 80 nits * headroom.
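
I.e., something like:

/* Map an SDR + headroom value to absolute luminance:
   0 -> SDR black (0.2 nit), 1 -> SDR white (80 nit),
   headroom -> ~80 * headroom nits. Content below SDR black
   is not representable in this model. */
static float to_nits(float value)
{
    return 0.2f + value * (80.0f - 0.2f);
}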

The reverse, however, is not true. If you have content that goes below the SDR black level, it cannot be represented by SDR + headroom. You would have to tone map the increased dynamic range in the blacks.

You should map the your content white level and headroom into the output white level and headroom, and be able to respond to dynamic changes in either. You need to be able to do this on Windows too, your content will be mastered to a specific display, which may not match the user's monitor.

This is only a Windows quirk. Wayland allows apps to pass through, e.g., video content that is not limited to the actual display capabilities, because there is no way for whoever masters the video to know the capabilities of the display the content will be shown on. In the best case the compositor doesn't even have to touch the content.

@swick commented Jun 19, 2024

One more note: Vulkan currently has only a single PQ-based VkColorSpace, HDR10/BT.2100, which includes a viewing environment from which a reference white level of 203 nits is inferred.

@slouken (Collaborator) commented Jun 19, 2024

min/max_luminance has no meaning in Apple's dynamic lighting model. There is no way to relate absolute brightness to Apple's displays.

That's not true at all. You can map a SDR + headroom concept to absolute luminance by mapping 0 to SDR black level (0.2 nit), 1 to SDR white level (80 nit) and the value of headroom to 80 nits * headroom.

But on Apple EDR displays the white level dynamically varies based on the ambient lighting, so it's not actually 80 nits.

Anyway, the point is that you can't actually get the white level and luminance values in a cross-platform way, and so SDL doesn't currently support it, to avoid applications trying to rely on it. It is possible to build your game such that it will work this way, but it requires you to use tone mapping shaders. SDL provides these for the 2D renderer, allowing you to support HDR content out of the gate.

Video players passing different colorspace content through to the display is interesting, but not in scope for this PR.

@swick commented Jun 19, 2024

But on Apple EDR displays the white level dynamically varies based on the ambient lighting, so it's not actually 80 nits.

This is always the same problem with people who don't get that this is all relative to the viewing environment/reference white.

If you have sRGB content on Apple, the sRGB spec still says 0.2 to 80 nits for that content in the sRGB viewing environment with the reference white level at 80 nits, yet the system will produce different luminances because the actual viewing environment, and thus the reference white level, is somewhere else.

You can also choose to map the reference white level of SDR + headroom content to 400 nits (or any other luminance) if you want, by multiplying all the numbers accordingly: black level 0.2 nit * 5 = 1 nit, white 80 nit * 5 = 400 nit, headroom * 80 nit * 5 = headroom * 400 nit.

the point is that you can't actually get the white level and luminance values in a cross platform way, and so SDL doesn't currently support it

As shown above, I don't think that's the case.

Anyway, now you have a cross-platform way to query something that you have to interpret and use in a platform-specific way to make use of it, because scRGB and PQ on Windows behave differently than on Mac and Wayland with the c-m protocol.

Moves the HDR properties from the display to be per-window, and adds the frog_color protocol to enable HDR under Wayland.
@Kontrabant (Contributor, Author):

Will you also hook that up to the wayland color-management protocol?

That's the plan, although, as mentioned in the protocol discussion, the main blocker is that if SDL uses the protocol to query the surface parameters, and any other part of the stack tries to use it to set/get the surface parameters, there will be a protocol violation.

@Kontrabant merged commit 2f276a2 into libsdl-org:main on Jun 20, 2024 · 39 checks passed
@Kontrabant deleted the sdl_frog_color branch June 20, 2024 19:55
Successfully merging this pull request may close these issues.

Ability to get EDID/HDR metadata for a display