Add option to enable/disable GPU side temporal dithering to help with eye strain + CLI #2766
Great news! This will help a lot in generating more awareness about these dithering issues, hopefully enough for Apple to notice. Thanks! I reckon though that disabling DCP-controlled dithering on external monitors is just as important as on the built-in display, if not more so. The reason is that some users still face eye-strain issues with the built-in panels (PWM, pixel inversion, other types of flicker or light issues) and are using external monitors as their primary display with these machines.
I agree. By disabling DCP-controlled dithering, do you mean forcing a lower connection bit depth (and thus at least making it possible for the display not to use temporal dithering / FRC)? That would be great indeed. But for that you'd need to somehow influence how the connection is negotiated. Simply forcing the framebuffer to a lower bit depth might not in itself achieve the desired effect (?). Plus a display can do whatever it wants; it is very difficult to control how an external display behaves, as much depends on the hardware, controller board, firmware etc. Right?
No, sorry, my comment was regarding the visibility of the "GPU Dithering" option in the app. It should be just as visible for both internal and external displays IMHO. I use DCP/GPU interchangeably to refer to this type of dithering that Apple applies to the image before sending it off to the TCON or external display. Though it would be great if we could somehow hijack the process/handshake that determines available bandwidth and force a bit depth that way. I tried overriding …; I also tried overriding the … There's got to be a way. How do you force the framebuffer to have a lower bit depth? I would like to experiment with that. I agree with you regarding external display behavior.
I think the "7" for the depth field should correspond to … There are a bunch of DP, link and DPCD related things in IOKit which I have yet to explore. Maybe @joevt knows more about these (but afaik he is not working on Apple Silicon stuff):
So what I want to say is that there seem to be various mechanisms to set connection parameters and also to read DPCD data (similar to how it is possible on Intel, I guess; AllRez has the means to do that on that platform), to at least figure out the current connection link configuration?
(parallel to this there is a discussion here - aiaf/Stillcolor#2 (reply in thread) - maybe these should be consolidated :))
"No sorry my comment was regarding the visibility of the "GPU Dithering" option in the app. It should be just as visible for both internal and external displays IMHO. I use DCP/GPU interchangeably to refer to this type of dithering that Apple applies to the image before sending it off to the TCON or external display." -> right. I added the option to all displays, but by default on external displays you need to hold the OPTION key for it to appear (as with other framebuffer related stuff). I am a bit unsure about the nature of GPU dithering (spatial vs. temporal - the former should not cause any eye strain). On external displays I don't fully understand how it could be a viable temporal dithering, especially at low refresh rates; see my comment here: aiaf/Stillcolor#2 (reply in thread). But I did not do any scientific tests or slow-motion videos to confirm what is happening, so if you've done that (did you?), that should settle it of course.
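For intuition on the spatial-vs-temporal distinction discussed above, here is a minimal, hypothetical simulation (not Apple's actual algorithm — the matrix size, bit depth, and scheme are illustrative assumptions): ordered/Bayer dithering produces a pattern that depends only on pixel position and is identical in every frame, while temporal dithering makes the same pixel flicker between quantization levels so that its time-average approximates the target value.

```python
# Hypothetical simulation, NOT Apple's implementation: contrasts spatial
# (ordered/Bayer) dithering, which is static frame-to-frame, with temporal
# dithering, which flickers each pixel over time.

# 2x2 Bayer threshold matrix, normalized to [0, 1)
BAYER_2X2 = [[0.0, 0.5],
             [0.75, 0.25]]

def spatial_dither(value, frame):
    """Quantize a 4x4 patch of a constant gray `value` in [0, 1) to 1 bit.
    The output depends only on pixel position, never on `frame`."""
    return [[1 if value > BAYER_2X2[y % 2][x % 2] else 0
             for x in range(4)] for y in range(4)]

def temporal_dither(value, frame, frames=4):
    """Quantize by turning each pixel on for round(value * frames) out of
    every `frames` frames, so the time-average approximates `value`."""
    on_frames = round(value * frames)
    return [[1 if frame % frames < on_frames else 0
             for x in range(4)] for y in range(4)]
```

With spatial dithering the frame index is ignored entirely, which is why a static image stays static; temporal dithering is exactly the kind of frame-to-frame change that a slow-motion video or capture-card recording would reveal.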
@waydabber, being able to get the DPCD info on Apple Silicon would be interesting, but it's probably a lower level than is needed for pixel format / depth. I haven't looked into ColorElement and TimingElement for a year.
@waydabber Disabling DCP dithering actually is very important on external displays, especially older ones: I have two external monitors (one Samsung from the late 2000s and another BenQ from 2015) that each seem to use their own 6-bit + FRC algorithm to display """8bit""" color. When connected to Intel Macs, the monitors display relatively "clean" images (except for the monitors' own FRC "moving static", which I still end up noticing and find pretty annoying anyway)... But when connected to an Apple Silicon Mac, there are super obvious "dotted patterns" showing up on certain colors, and it is definitely temporal, because some colors straight up visibly, obviously flicker on these monitors (again, this only happens when connected to Apple Silicon, not on Intel Macs). This is super noticeable on Dark Mode windows or when changing the shade of a gray object in apps like Figma. (It doesn't matter whether the output is forced to RGB or YPbPr.)

I mostly suspect this is coming from the FRC pattern of the monitor "clashing against" the one generated by Apple's video output. From the tests I've read that @aiaf has done with capture cards, the GPU is in fact generating moving dithering patterns when the desktop is supposed to be still, even on 60 Hz external video out. There is a BenQ article about "monitor flickering when connected to M1 Macs" that doesn't really give any real solution, which I'm pretty sure was written because someone else noticed the same issue: …

But now, for the first time in years, using the new disable-dithering method actually fixes this issue with both external monitors, and probably any other 6-bit + FRC monitor! My M1 Mac now generates the same "generally clean" output that I'm used to getting from my Intel Macs. This is why I believe leaving this option visible by default is still very important for external monitors, as it's fixed a very obvious and noticeable monitor output issue that seemed unsolvable to me for years :)
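The 6-bit + FRC behavior described above can be sketched with a toy model (an assumption for illustration — real panel FRC uses more elaborate per-pixel phase patterns): the panel alternates between two adjacent 6-bit levels so that the 4-frame time-average reproduces the requested 8-bit value. If the GPU is simultaneously running its own temporal dither, the two unsynchronized alternation patterns can beat against each other, which matches the visible flicker described.

```python
# Toy model of a 6-bit + FRC panel, illustrative only.

def frc_6bit(value_8bit, frame):
    """Simulate displaying an 8-bit gray (0..255) on a 6-bit panel (0..63).
    The panel shows level base+1 for `rem` out of every 4 frames and
    `base` otherwise, so the 4-frame average matches the 8-bit input."""
    base, rem = divmod(value_8bit, 4)        # split into 6-bit level + remainder
    level = base + (1 if frame % 4 < rem else 0)
    return min(level, 63)                     # clamp at the 6-bit maximum

def average_8bit(value_8bit, frames=4):
    """Time-average of the emitted 6-bit levels, scaled back to 8 bits."""
    return sum(frc_6bit(value_8bit, f) for f in range(frames)) * 4 / frames
```

For example, the 8-bit gray 130 is rendered as 6-bit level 33 on two frames and 32 on the other two, averaging back to exactly 130; it is that per-frame alternation that a second, independent dither pattern from the GPU can interfere with.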
Thank you for these tips! This looks like a pain to test. I'll probably try my hand at it if I can get 100% confirmation the MBP built-in panels are 8-bit+FRC.
@joevt that's a very useful post, thank you! The values you showed match the ones reverse engineered by the Asahi Linux project: https://github.com/AsahiLinux/linux/blob/bd0a1a7d465fcb60685a2360565ed424bafff354/drivers/gpu/drm/apple/parser.h

By any chance do you know which process writes ColorElements on the IOMobileFramebuffer registry entry?
@aiaf, I don't know which process creates those. Might have to …
Some kexts have an IOUserClient that a user app can communicate with. An IOFramebuffer (on Intel Macs) may have a user client with some methods (called through the IOKit framework or another user-application-accessible library) that only allow the WindowServer process access. If you look at the output of …

On Apple Silicon, IOMobileFramebuffer replaces IOFramebuffer and IOMobileFramebufferUserClient replaces IOFramebufferUserClient (and/or IOFramebufferSharedUserClient?). But Apple Silicon also has IOFramebuffer (using IOServiceCompatibility) for compatibility with older software. Apple Silicon has a WindowServer process, so I would check that and the frameworks it uses.
It should be WindowServer. There is also the IOMFB_bics_daemon (in /usr/libexec), which is started earlier than WindowServer during boot and seemingly contains calls indicating it might configure framebuffer related things and can write stuff to the registry. But it only deals with built-in displays based on what seems to be inside it, plus it is never listed as a client for any dispext framebuffers (and on Macs with no built-in screen it is not attached to any, even though the process is running). A process can become a client by using the IOMobileFramebuffer framework's …

I have no idea how to figure out either which ColorElement is active for the current TimingElement (one can only narrow it down via circumstantial evidence; it would be great to know exactly, as I wanted to add these details to BetterDisplay's Display Information... block) or how to change the color mode. But one can somewhat influence the selection by overriding the windowserver config file, specifying some criteria (associated to a display's uuid), which is telling.
Back to the original issue: added. Examples (set all, set specific display, get, toggle): …
@joevt thank you for these tips. I've been poking around for a couple of days now. Lots of progress. Also thank you for AllRez - it's a tremendous piece of work. Do you know if … and … or do you think the restrictions are done in some other way? Perhaps these shouldn't matter if not running in Sandbox mode, I reckon.

@waydabber thank you for these tips! Going to try Frida with WindowServer and IOMFB_bics_daemon and hopefully get something out of it. Re. ColorElements, my best guess is that the highest-scoring element (barring any other disqualifying criteria like DSC support) is the one in use. I have no proof of this.
In my experience, only a small amount of …
@aiaf - I haven't researched IOMobileFramebuffer. The source code for IOFramebuffer says the first app gets access to the IOFramebufferUserClient - which is usually the WindowServer. A kext like WhateverGreen can make patches to allow other apps access. AllRez can use my WhateverGreen fork to do some things.
Thanks @joevt. @waydabber I think the idea is that those locked-down calls are restricted to certain processes like WindowServer only.
@aiaf - I did not experiment with this on Intel at all, so I thought @joevt was explaining that on Intel the first process that connects can be a UserClient for the IOFramebuffer - which might not necessarily be the case with IOMobileFramebuffer (thus the screenshot with multiple clients). But now that I reread it, I probably just misunderstood. 😀 Things being locked down happens somewhat universally in private frameworks, and I had the impression that lockdowns like that are managed by features like AMFI (which afaik can be turned off) and maybe other specific measures (such a specific measure could be what @joevt is mentioning, of course).

Note: generally, even though it is always important for insight to understand what can be done with various things patched, standard protections turned off etc., from an app developer's perspective, whenever something can't be accessed from user space on a vanilla installation, the usefulness of the discovery becomes academic, as the feature can't be added to an app aimed at a more general audience. 😞
An IOService can have many user clients of one or more types. A process indicates what kind of user client it wants to use. The IOService decides if it will allow the connection to be made. An IOFramebuffer can have many IOFramebufferSharedUserClients but only one IOFramebufferUserClient (used by the WindowServer). Any process can request an IOFramebufferSharedUserClient. But only the WindowServer (or the first process) will be allowed to request an IOFramebufferUserClient. AllRez uses …
I just discovered this somewhat by accident today. I got an EIZO CS240 that is supposed to have a true 10-bit display afaik. I was wondering about vertical lines in certain gray tones. When turning off "GPU dithering" these disappear. I did not find information on what exactly is happening here. Is this really forcing 8 bit?? Does this mean the 10 bit before was only achieved at the software level, even though the panel should support 10 bit? EDIT: …
"10-bit" refers to the 10-bit framebuffer depth (so video memory represents colors using 10 bits per channel). The display connection, however, may be 8 bits only. You might need to check the display's OSD to see if it can provide info on the connection bit depth. If not, you can check the display connection logs (this might be a bit Mac-specific and low-level) to see exactly what color mode was negotiated.
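To illustrate the framebuffer-depth vs. link-depth distinction, a small sketch (illustrative only, not what the OS actually does internally): truncating 10-bit values down to an 8-bit link collapses nearby grays into the same output level (banding), while dithered rounding preserves the extra precision on average, at the cost of noise or flicker.

```python
import random

def truncate_10_to_8(v10):
    """Drop the two low bits: distinct 10-bit grays collapse into bands."""
    return v10 >> 2

def dither_10_to_8(v10, rng=random.random):
    """Randomized rounding: each 8-bit output is base or base+1, chosen so
    the expected value preserves the extra 2 bits of precision."""
    base, frac = divmod(v10, 4)
    return min(base + (1 if rng() < frac / 4 else 0), 255)
```

For instance, the 10-bit values 512 through 515 all truncate to the 8-bit value 128, which is the source of the visible bands; the dithered version instead emits a mix of 128 and 129 whose average sits at the original 10-bit position.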
Oh my goodness, disabling GPU Dithering fixed the weird banding issues I was having exclusively on Apple Silicon Macs with my Asus PG35VQ in SDR mode! Before, I used BetterDisplay to force higher brightness in HDR, as that would fix it but at the cost of color accuracy; SDR mode always had horrible banding. Weirdly, it never showed up when plugging in my iPhone or iPad, and wasn't there on my old Intel Mac.
BD has some eye-strain related features: https://github.com/waydabber/BetterDisplay/wiki/Eye-care:-prevent-PWM-and-or-temporal-dithering
A new method discovered by @aiaf allows us to simply turn off temporal dithering on the GPU side with visible effects on the built-in displays of Apple Silicon Macs (note: since this works on IOMobileFramebuffer only, this is an Apple Silicon-only feature).
https://github.com/aiaf/Stillcolor/tree/main
Note: the toggle will be added to the Image Adjustments section for built-in displays. For external displays the option will reveal itself when holding OPTION (testing on various external displays I personally saw no real difference as most dithering happens at the display side).
Note: this will of course be a non-Pro feature, since the method and Stillcolor are MIT-licensed. All credit goes to @aiaf for this discovery.