Ability to get EDID/HDR metadata for a display #6587

Closed
misyltoad opened this issue Nov 23, 2022 · 27 comments · Fixed by #9914

@misyltoad
Contributor

Hello!

It would be nice to be able to get an EDID and/or extract the HDR metadata from a display using SDL.

This is needed for any HDR application running under Vulkan that wants to be portable in any form, or that simply does not want to use DXGI's IDXGIOutput6::GetDesc1 to query the colorimetry/metadata info.

I have written code for getting an EDID from a HMONITOR on Win32 here in DXVK:

https://github.com/doitsujin/dxvk/blob/master/src/wsi/win32/wsi_monitor_win32.cpp#L226

To parse the HDR metadata and colorimetry in DXVK, I use a library called libdisplay-info; SDL could roll its own EDID parsing code or use that. https://github.com/doitsujin/dxvk/blob/master/src/wsi/wsi_edid.cpp#L14 (for reference on what colorimetry/metadata an app needs).
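
For reference, a minimal sketch of the Win32 approach linked above (not DXVK's exact code): enumerate GUID_DEVCLASS_MONITOR devices with SetupAPI and read the "EDID" value from each device's hardware registry key. Matching a device instance to a specific HMONITOR is omitted here.

```c
#include <windows.h>
#include <setupapi.h>
#include <devguid.h>   /* GUID_DEVCLASS_MONITOR */

#pragma comment(lib, "setupapi.lib")

/* On entry *size is the buffer capacity; on success it holds the EDID length. */
static BOOL read_first_edid(BYTE *edid, DWORD *size)
{
    HDEVINFO devInfo = SetupDiGetClassDevsW(&GUID_DEVCLASS_MONITOR,
                                            NULL, NULL, DIGCF_PRESENT);
    if (devInfo == INVALID_HANDLE_VALUE)
        return FALSE;

    SP_DEVINFO_DATA devData = { sizeof(devData) };
    BOOL found = FALSE;

    for (DWORD i = 0; SetupDiEnumDeviceInfo(devInfo, i, &devData); i++) {
        /* Open the per-device hardware key that holds the raw EDID blob. */
        HKEY key = SetupDiOpenDevRegKey(devInfo, &devData, DICS_FLAG_GLOBAL,
                                        0, DIREG_DEV, KEY_READ);
        if (key == INVALID_HANDLE_VALUE)
            continue;

        found = RegQueryValueExW(key, L"EDID", NULL, NULL,
                                 edid, size) == ERROR_SUCCESS;
        RegCloseKey(key);
        if (found)
            break;
    }

    SetupDiDestroyDeviceInfoList(devInfo);
    return found;
}
```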

@slouken
Collaborator

slouken commented Nov 23, 2022

It seems like returning the raw EDID might be generally useful, and could be done across multiple platforms.

@flibitijibibo
Collaborator

Was about to draft this same issue, glad I searched first... some additional notes:

There is an EDID parser already but it's mainly for X11: https://github.com/libsdl-org/SDL/blob/main/src/video/x11/edid-parse.c

X11 already pulls in the EDID; it turns out it's just used to get the monitor name and nothing else (lol):

MonitorInfo *info = decode_edid(prop);

So, Joshua's Win32 implementation combined with SDL's X11 implementation gives us at least two already-finished backends, and if we want to return more than the raw blob, we already have a parser in case we don't want to pull in something like libdisplay-info. I'm not sure whether Wayland or Cocoa make this easy to get without it being pre-parsed for us (i.e. we shouldn't need wp_color_management_v1 to get this).
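
For reference, reading that raw blob yourself via XRandR looks roughly like this (a minimal sketch assuming Xlib plus the RandR extension, with error handling trimmed):

```c
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>
#include <stdlib.h>

/* Returns the raw EDID bytes for a RandR output, or NULL.
 * Caller frees the result with XFree(). */
static unsigned char *get_output_edid(Display *dpy, RROutput output,
                                      unsigned long *len)
{
    Atom edid_atom = XInternAtom(dpy, "EDID", True);
    if (edid_atom == None)
        return NULL;

    Atom actual_type;
    int actual_format;
    unsigned long nitems, bytes_after;
    unsigned char *prop = NULL;

    if (XRRGetOutputProperty(dpy, output, edid_atom, 0, 256, False, False,
                             AnyPropertyType, &actual_type, &actual_format,
                             &nitems, &bytes_after, &prop) != Success)
        return NULL;

    /* A valid EDID is at least one 128-byte block of 8-bit data. */
    if (actual_format != 8 || nitems < 128) {
        XFree(prop);
        return NULL;
    }
    *len = nitems;
    return prop;
}
```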

@flibitijibibo
Collaborator

flibitijibibo commented Jan 16, 2024

Will leave Apple targets to someone else but it seems the EDID is only exposed on Wayland via a kwin-specific protocol:

https://github.com/KDE/plasma-wayland-protocols/blob/master/src/protocols/kde-output-device-v2.xml

I suppose gamescope could implement this too, but maybe it's worth extending xdg_output or something...

At the same time I'm checking with FNA's console partners to see how Xbox/PlayStation expose display metadata, if at all... if they only expose it as pre-parsed data, that may push us towards exposing this as SDL_GetDisplayHDRMetadata or something. EDIT: Literally as I hit send, we got the answer for Xbox luminance; still not sure about color gamut: https://learn.microsoft.com/en-us/gaming/gdk/_content/gc/reference/system/xdisplay/structs/xdisplayhdrmodeinfo

@misyltoad
Contributor Author

Gamescope provides a path to the EDID in an atom on the root window. It's called something like GAMESCOPE_PATCHED_EDID_PATH, iirc.
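
A hedged sketch of that path (the atom name is as recalled above, so treat it as an assumption): read the string property off the X root window, then load the file it points at.

```c
#include <X11/Xlib.h>
#include <X11/Xatom.h>

/* Returns the NUL-terminated property value, or NULL; free with XFree(). */
static char *read_root_string_prop(Display *dpy, const char *name)
{
    Atom atom = XInternAtom(dpy, name, True);
    if (atom == None)
        return NULL;

    Atom type;
    int format;
    unsigned long nitems, after;
    unsigned char *data = NULL;

    if (XGetWindowProperty(dpy, DefaultRootWindow(dpy), atom, 0, 1024,
                           False, XA_STRING, &type, &format,
                           &nitems, &after, &data) != Success ||
        !data || nitems == 0) {
        if (data)
            XFree(data);
        return NULL;
    }
    return (char *)data;
}

/* Usage (atom name per the comment above, so an assumption):
 *   char *path = read_root_string_prop(dpy, "GAMESCOPE_PATCHED_EDID_PATH");
 *   ...fopen(path, "rb"), read the blob, then XFree(path)... */
```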

@Zamundaaa

Will leave Apple targets to someone else but it seems the EDID is only exposed on Wayland via a kwin-specific protocol

FYI we will likely no longer send the EDID in that protocol in the near future; it was only part of the protocol because we re-used our output configuration infrastructure for X11 on Wayland (which is no longer the case) and that needed to parse the EDID for identifying outputs. In general we may just break backwards compatibility or stop supporting the kde protocols at any time, so I'd advise to not use any of them outside of Plasma projects.

In general I think that exposing the EDID and having apps parse it is a pretty bad API and I'd really like to avoid leaking it outside the compositor in any way - it would just end up in KWin (or SDL) needing to generate fake EDIDs to make apps that use it at least somewhat behave with our color management settings. KWin 6.0 exposes color management information through two protocols, though, which you can use.

@flibitijibibo
Collaborator

I suppose that all makes sense to me, but I'll have to defer to Joshua on exactly what we need - currently dxvk (for example) pulls the whole EDID to parse but I don't think it's actually exposed in any way, so it should be possible to replace that path (or just isolate it to Win32) if we're able to get all of the data some other way.

@flibitijibibo
Collaborator

flibitijibibo commented May 27, 2024

We're doing a second pass on HDR support in SDL_gpu and are getting slightly blocked on this - we can support formats/colorspaces no problem, but they aren't useful until we can get the EDID's color metadata. Did anything happen on the protocols side in the meantime? I can try to prioritize the Win32/X11 side of things if it helps narrow down what our exact needs are.

@slouken
Collaborator

slouken commented May 27, 2024

What specific data do you need? SDL3 has some HDR metadata exposed as properties on the display.
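
For context, the SDL3 properties API referenced here looks roughly like this; the exact property constants moved around during SDL3 development (some HDR values later migrated to window properties), so check SDL_video.h for your revision:

```c
#include <SDL3/SDL.h>

void dump_display_hdr(SDL_DisplayID display)
{
    SDL_PropertiesID props = SDL_GetDisplayProperties(display);

    /* True if the display is currently in HDR mode. */
    bool hdr = SDL_GetBooleanProperty(props,
                                      SDL_PROP_DISPLAY_HDR_ENABLED_BOOLEAN,
                                      false);
    SDL_Log("HDR enabled: %s", hdr ? "yes" : "no");
}
```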

@Kontrabant
Contributor

Did anything happen on the protocols side in the meantime?

The upstream colorspace protocol has seen some iteration, but it still shows no signs of being ready for merging. Would the frog protocol be able to provide everything that you need? That's probably the path of least resistance at this point.

@misyltoad
Contributor Author

It should, yes.

@misyltoad
Contributor Author

The feedback is surface-based rather than display-based, though. I am very opposed to having it be display-based (the other protocol is surface-based as well). It just leads to terrible app design.

@flibitijibibo
Collaborator

Yeah, we could probably work with the FROG_* stuff.

The full list of fields (@Joshua-Ashton can correct me if I'm wrong):

  • Red Primary (float2)
  • Green Primary (float2)
  • Blue Primary (float2)
  • White Point (float2)
  • Min Luminance (float)
  • Max Luminance (float)
  • Max Full Frame Luminance (float)

I see the property for white point, not sure about the rest though.
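
For reference, the list above written out as a plain struct; it carries the same mastering-display data as VkHdrMetadataEXT or DXGI_HDR_METADATA_HDR10, though the float layout here is just for illustration:

```c
typedef struct HDRDisplayMetadata {
    float red_primary[2];           /* CIE 1931 xy */
    float green_primary[2];
    float blue_primary[2];
    float white_point[2];
    float min_luminance;            /* nits */
    float max_luminance;            /* nits */
    float max_full_frame_luminance; /* nits */
} HDRDisplayMetadata;
```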

@slouken
Collaborator

slouken commented May 27, 2024

So, for SDL, does it make sense for these properties to be on the window?

On Windows these properties are on the display, but we could expose them on the window and change them when the window moves displays.

For Gamescope, they're per-surface presumably because the compositor can compose surfaces with different colorspaces, right? When would you expose different values for different surfaces?

@flibitijibibo, what is the actual set of fields you need for the GPU API, and why do you need those values? Are you converting from sRGB to the monitor primaries? On Windows DWM and the display driver chain work together to convert sRGB content to the actual monitor colorspace, and that shouldn't be done separately by the application.

@Kontrabant
Contributor

Kontrabant commented May 27, 2024

The feedback is surface-based rather than display-based, though. I am very opposed to having it be display-based (the other protocol is surface-based as well). It just leads to terrible app design.

I noticed, and it would just be a matter of adding the properties to the window instead of the display. That might make more sense overall: in most cases, client apps query which display the window is on only to get that display's properties ID and retrieve the HDR properties from it, which can get messy in multi-monitor setups where windows move between displays and the backend may know better than the client app which properties the window should be using. It's easy to update window properties when the window changes displays on platforms that only expose them per-display, but it gets messy in the opposite direction.

I completely agree with you on that aspect of API design.

@flibitijibibo
Collaborator

flibitijibibo commented May 27, 2024

I suppose in SDL's case we could support getting window-specific data via SDL_GetDisplayForWindow (or whatever it's called now), so it's still encouraging checking per-window but reflects how OSes expose it?

I'll try to get the exact requirements as part of GPU's HDR test, but from what I understand, the old way of doing things was to set the HDR metadata (via DXGI or VK_EXT_hdr_metadata), but now it's been flipped to get the metadata so the application does all the color correction on its own:

https://learn.microsoft.com/en-us/windows/win32/api/dxgi1_5/nf-dxgi1_5-idxgiswapchain4-sethdrmetadata
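
The Vulkan side of the "set" direction is VK_EXT_hdr_metadata. A minimal sketch (the mastering values below are generic HDR10 examples, not real display data; the function pointer must come from vkGetDeviceProcAddr, since this is an extension entry point):

```c
#include <vulkan/vulkan.h>

void set_hdr_metadata(VkDevice device, VkSwapchainKHR swapchain,
                      PFN_vkSetHdrMetadataEXT pfnSetHdrMetadataEXT)
{
    VkHdrMetadataEXT metadata = {
        .sType = VK_STRUCTURE_TYPE_HDR_METADATA_EXT,
        .displayPrimaryRed   = { 0.708f, 0.292f },   /* BT.2020 red   */
        .displayPrimaryGreen = { 0.170f, 0.797f },   /* BT.2020 green */
        .displayPrimaryBlue  = { 0.131f, 0.046f },   /* BT.2020 blue  */
        .whitePoint          = { 0.3127f, 0.3290f }, /* D65 */
        .maxLuminance = 1000.0f,             /* nits */
        .minLuminance = 0.005f,              /* nits */
        .maxContentLightLevel = 1000.0f,     /* nits */
        .maxFrameAverageLightLevel = 400.0f, /* nits */
    };
    pfnSetHdrMetadataEXT(device, 1, &swapchain, &metadata);
}
```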

@Zamundaaa

The upstream colorspace protocol has seen some iteration, but it still shows no signs of being ready for merging

I can't make any guarantees, but I'm trying to have it done in time to ship in Plasma 6.2 (in ~4 months, if the release schedule doesn't get changed). Using the frog protocol could still make sense though, so you have something concrete to work with that works with Plasma 6.0 out of the box.

One issue with using the upstream protocol for getting color management information is that creating multiple color management extensions for one Wayland surface results in a protocol error, and there's no "get properties only" part without such an error right now. If both SDL and the Vulkan driver use the protocol, the app crashes.

It's also relevant that Vulkan may get an API to get the desired HDR metadata for the surface as well. Does it make sense to have the same thing in SDL even once that's in place?

I suppose in SDL's case we could support getting window-specific data via SDL_GetDisplayForWindow (or whatever it's called now), so it's still encouraging checking per-window but reflects how OSes expose it?

Output and surface color information can be different, and 99% of apps should only use the one that's per surface. The output stuff is really only meant for some special use cases with ICC profiles.

I understand the old way of doing things was to set the HDR metadata (via DXGI or VK_EXT_hdr_metadata), but now it's been flipped to get the metadata so the application does all the color correction on its own:

Both are needed, for different use cases. If an app wants to play back existing content like videos, providing the HDR metadata of the video is generally better, but if it's a game that's rendering stuff itself, targeting the compositor's desired color information to avoid additional color transformations is best.

@slouken
Collaborator

slouken commented May 27, 2024

Most movie players will have built-in colorspace conversion and will target the compositor's desired colorspace like games.

@misyltoad
Contributor Author

misyltoad commented May 27, 2024

So, for SDL, does it make sense for these properties to be on the window?

I think so. On Windows, SDL would use the display the window is on; on Wayland, you'd use the surface's data. But both would be exposed via a call on the window.

For Gamescope, they're per-surface presumably because the compositor can compose surfaces with different colorspaces, right? When would you expose different values for different surfaces?

Right, and it avoids the docking/undocking case: we just send new data to the game, and it doesn't have to know anything about the output changing.

Unfortunately DX games right now would just read the old data, but that's just unavoidable as they only query it once anyway =(

@Zamundaaa

Most movie players will have built-in colorspace conversion and will target the compositor's desired colorspace like games

Yes, but we're working hard to make it not be that way anymore, at least on Linux. The compositor can do much more efficient colorspace conversions than apps; for videos specifically, the difference in power draw can be huge.

@misyltoad
Contributor Author

Yeah, the difference between doing the HDR color transforms on Deck OLED in a shader vs. on scanout is about 1 W. Without that, the feature probably wouldn't have been shippable, in my eyes.

(Tetrahedral 3D LUTs in software are expensive! We could probably do slightly better than 1 W, though...)

@flibitijibibo
Collaborator

We're still writing up our HDR/tonemapping example, but it'll eventually be here:

https://github.com/TheSpydog/SDL_gpu_examples/tree/tonemap

From what I've gathered in this thread, the optimal color pipeline is to let the application do its own thing and pass the mastering HDR metadata to the compositor (either via DXGI or vkSetHdrMetadataEXT), which can do proper hardware-accelerated color transforms. But there are also scenarios where this is not available, so the application should maybe have fallback tonemapping to ensure correctness even if energy efficiency takes a hit.

From SDL_gpu's POV that's actually not so bad, if this is the case: we should be able to expose something like SupportsHWTonemapping(), which can determine whether calling a SetHDRMetadata() function would actually do something, and if it doesn't, the application knows to do extra tonemapping (or at least to suggest it, in case it's too expensive) and can query the display colorimetry accordingly.
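
Rough pseudocode for that decision; SupportsHWTonemapping() and SetHDRMetadata() are the hypothetical names from the paragraph above, and the other helpers are placeholders too, not real SDL_gpu API:

```c
if (SupportsHWTonemapping(device)) {
    /* Compositor/driver can apply the transform on scanout: hand over the
     * mastering metadata and render directly in the chosen colorspace. */
    SetHDRMetadata(swapchain, &mastering_metadata);
} else {
    /* No hardware path: query the display colorimetry and tonemap in a
     * shader before presenting, trading power for correctness. */
    QueryDisplayColorimetry(display, &primaries, &luminance);
    EnableShaderTonemapPass(&primaries, &luminance);
}
```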

@Kontrabant
Contributor

Threw together a pull that exposes additional HDR metadata and does so per-window, in addition to adding support for the frog color protocol.

@Sewer56

Sewer56 commented Jun 3, 2024

I stumbled across this issue while researching a cross-platform way to query display video modes and then associate them with a unique device identifier.

In my case, I wanted to be able to pick a setting like Display Resolution on a native Linux host, save it, and have a process inside Wine/Proton pick it up. And ideally, vice versa.

Being able to dump the raw EDID would be handy here, as bytes 8-15 (inclusive) can be used to uniquely identify a display, from what I've gathered so far; these are the manufacturer ID, product ID, and serial number.
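
For illustration, a sketch of pulling those identity fields out of a raw EDID base block, following the standard EDID 1.x layout:

```c
#include <stdint.h>
#include <stdio.h>

void print_edid_identity(const uint8_t *edid /* at least 16 bytes */)
{
    /* Bytes 8-9: manufacturer ID, big-endian, three 5-bit letters. */
    uint16_t mfg = (uint16_t)((edid[8] << 8) | edid[9]);
    char pnp[4] = {
        (char)('A' - 1 + ((mfg >> 10) & 0x1F)),
        (char)('A' - 1 + ((mfg >> 5)  & 0x1F)),
        (char)('A' - 1 + ( mfg        & 0x1F)),
        '\0'
    };
    /* Bytes 10-11: product code, little-endian. */
    uint16_t product = (uint16_t)(edid[10] | (edid[11] << 8));
    /* Bytes 12-15: serial number, little-endian (often zero, as noted
     * in the reply below). */
    uint32_t serial = (uint32_t)edid[12]         | ((uint32_t)edid[13] << 8) |
                      ((uint32_t)edid[14] << 16) | ((uint32_t)edid[15] << 24);

    printf("%s %04X serial %u\n", pnp, product, serial);
}
```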

The use case I listed here is a bit unusual of course, but I figured that at the very least, I could provide another use case.

@misyltoad
Contributor Author

That doesn't work -- serial IDs are not always filled in and need not be unique.

Really the app shooouldn't need the real EDID at all. Parsing that is also hell. Not all platforms expose that so it's not a good way of going about it imo.

@Sewer56

Sewer56 commented Jun 3, 2024

Really the app shooouldn't need the real EDID at all. Parsing that is also hell. Not all platforms expose that so it's not a good way of going about it imo.

Truth be told, I agree, especially with it being really hard to get on some platforms.
But at the same time, I feel like the options are limited in my specific case.

Can't really think of something that uniquely identifies a display and would also yield the same identifier across multiple OSes; it's a really tough problem, and an area I don't have very much experience in. I would like to know, though.

In my case, the identifier doesn't have to be globally unique, just unique enough to distinguish a monitor that a particular user has on one of their owned devices. If it can at least narrow it down to a specific monitor owned by a specific user (even if that user has two of the same monitor and they happen to return the same EDID), that would be good enough for me, since it's unlikely a user will run the same display at different resolutions or refresh rates. Obviously it's far from ideal, but I'd have to live with that limitation.

What I really want is to be able to select a monitor in a native Linux app, save it to a file, and have a process inside Proton/Wine pick that setting up and then spawn a game on the correct monitor with the user's chosen resolution and refresh rate. I sometimes make patches (mods) for older games that fix them up to work more correctly on modern machines.

@Kontrabant
Contributor

One issue with using the upstream protocol for getting color management information is that creating multiple color management extensions for one Wayland surface results in a protocol error, and there's no "get properties only" part without such an error right now. If both SDL and the Vulkan driver use the protocol, the app crashes.

This seems like it should be addressed before finalizing the protocol, as it could be a major issue. This is most certainly a protocol that some applications and libraries will want to interact with directly to get metadata, and having things crash because "whoops, the driver, which is opaque to userland applications, tried to use it too" is not good.

It's also relevant that Vulkan may get an API to get the desired HDR metadata for the surface as well. Does it make sense to have the same thing in SDL even once that's in place?

Vulkan isn't the only graphics API across all the platforms that SDL supports (and the aforementioned extension doesn't exist yet), so, IMO, yes, it makes sense for SDL to provide an easy way to expose this across platforms, similarly to how it currently provides an easy way to get ICC data without applications having to write platform-specific code to do so.
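
For comparison, the existing ICC path mentioned here is already a single call; a minimal sketch using the SDL3 signature:

```c
#include <SDL3/SDL.h>

void check_icc(SDL_Window *window)
{
    size_t size = 0;
    void *icc = SDL_GetWindowICCProfile(window, &size);
    if (icc) {
        SDL_Log("Got ICC profile, %u bytes", (unsigned)size);
        SDL_free(icc); /* caller owns the returned buffer */
    }
}
```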

@misyltoad
Contributor Author

One issue with using the upstream protocol for getting color management information is that creating multiple color management extensions for one Wayland surface results in a protocol error, and there's no "get properties only" part without such an error right now. If both SDL and the Vulkan driver use the protocol, the app crashes.

That's dumb, they should fix that.

The frog protocol has no such restriction.
