
Quick Display adjustment PowerToy #1052

Open
crutkas opened this issue Jan 7, 2020 · 44 comments
Labels
Idea-New PowerToy (Suggestion for a PowerToy), Product-Display management (Refers to the idea of a display management utility power toy)

Comments

@crutkas
Member

crutkas commented Jan 7, 2020

End users are asking for a tool to quickly toggle settings on their monitors. Via the taskbar, this should be quick and easy.

Proposed spot: the context menu inside the PowerToys systray icon.

Edit from @Aaron-Junker: also add keyboard shortcuts to perform these actions.

Features

@dxgldotorg

HDR would in fact be something one would toggle, because to this day SDR graphics look ugly while HDR is turned on.

@DavidGretzschel

Profiles would be really great. I switch between different (multi-)monitor setups a lot.
Often I just connect the same monitor to a different port, and everything changes to whatever the last configuration with exactly that port allocation was (which might have been months ago). Scaling and monitor positioning are suddenly out of whack.
And who even knows what poor FancyZones makes of that chaos.
If I could just set a hotkey to load a specific profile, that would solve an age-old problem.
If I want to show computer stuff to my grandmother, I could just hit a button and scale everything up a little. That would be helpful, too.
On my 7-inch 1920x1080 laptop, I prefer 100% scaling if I can find the bloody pen... and 200% scaling if I cannot.

Yes please, this would be excellent.

@SMG999

SMG999 commented Jan 9, 2020

Will the taskbar location (of the primary taskbar) be able to be moved to a non-primary monitor? Currently this is only possible via drag and drop when the taskbar is not locked.

Will the refresh rate also be able to be taken into consideration? This is important, because you wouldn't want a 144 Hz monitor (for example) unintentionally set back to 60 Hz (the Windows default).

@crutkas
Member Author

crutkas commented Jan 9, 2020

Will the taskbar location (of the primary taskbar) be able to be moved to a non-primary monitor? Currently this is only possible via drag and drop when the taskbar is not locked.
I don't think it would / should. On 1909, it looks like it stays with how the monitor currently is. I would imagine that is your default behavior. My comment above was more to ensure that if you set something to primary in a configuration, on a dock state change it reverts back to how it was.

Will the refresh rate also be able to be taken into consideration? This is important, because you wouldn't want a 144 Hz monitor (for example) unintentionally set back to 60 Hz (the Windows default).
In what scenario? A resolution change?

@crutkas
Member Author

crutkas commented Jan 9, 2020

@dxgldotorg tweaked. Thanks for the validation.

@yuyoyuppe
Collaborator

@crutkas
Member Author

crutkas commented Jan 14, 2020

For desktop brightness, we could do it using the DDC/CI protocol, as @yuyoyuppe said in #718.

Interesting thread on Reddit as well:
https://www.reddit.com/r/Windows10/comments/enp83s/why_there_is_no_built_in_brightness_slider_on/

@yuyoyuppe
Collaborator

yuyoyuppe commented Jan 15, 2020

@crutkas yep, it's possible to access everything a display has via DDC/CI, including the input. In fact, that's what I've done in my prototype. I've had issues with EIZO displays, which implement DDC/CI over USB and therefore require vendor-supplied software to control them.
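As a concrete illustration of the DDC/CI control channel mentioned above, here is a sketch of what a "Set VCP Feature" write looks like on the wire, based on the MCCS/DDC-CI frame layout as I understand it. The helper name is made up, and real code would still have to send these bytes to the monitor's I2C address through an OS API:

```python
def ddcci_set_vcp(vcp_code: int, value: int) -> bytes:
    """Build the payload of a DDC/CI 'Set VCP Feature' write.

    Frame layout (per the DDC/CI spec, as I understand it): host source
    address 0x51, a length byte with bit 7 set (0x80 | payload length),
    opcode 0x03, the VCP feature code, the 16-bit value, and an XOR
    checksum seeded with the display's I2C write address 0x6E.
    """
    body = [0x51, 0x84, 0x03, vcp_code, (value >> 8) & 0xFF, value & 0xFF]
    checksum = 0x6E
    for byte in body:
        checksum ^= byte
    return bytes(body + [checksum])

# VCP code 0x10 is luminance (brightness); set it to 75%.
print(ddcci_set_vcp(0x10, 75).hex())  # 51840310004be3
```

The same frame format covers other VCP features (input select, contrast, etc.), which is what makes a generic DDC/CI-based tool feasible for displays that implement the standard channel.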

@Arcitec

Arcitec commented Jan 25, 2020

Thanks for considering this modern-day QuickRes tool. I am honestly in deep pain whenever I use Windows because I switch between SDR and HDR gaming and movies. This involves going into Windows display settings, setting the DESKTOP resolution to the desired one, then going into advanced and clicking on "List Monitor Modes" and setting the ACTIVE SIGNAL RESOLUTION (over HDMI) to the right resolution and right refresh rate, and then clicking on the HDR on/off toggle, and then going into the NVIDIA control panel to set the color to RGB 4:4:4 or YCbCr 4:2:2 and the output bit depth per channel (8/10/12). It probably takes 2 minutes of non-stop tedious navigation each time.

Here are some examples of resolutions/profiles I use on a daily basis often switching multiple times per day:

  • SDR Game: 1920x1080, 60Hz, 4:4:4 RGB, 8bit Color, HDR off
  • HDR Game: 1920x1080, 60Hz, 4:4:4 RGB, 10bit Color, HDR on
  • SDR Movie 1080p NTSC: 1920x1080, 24Hz, 4:4:4 RGB, 8bit Color, HDR off
  • SDR Movie 4K NTSC: 4K, 24Hz, 4:4:4 RGB, 8bit Color, HDR off
  • SDR Movie 1080p PAL: 1920x1080, 25Hz, 4:4:4 RGB, 8bit Color, HDR off
  • SDR Movie 4K PAL: 4K, 25Hz, 4:4:4 RGB, 8bit Color, HDR off
  • HDR Movie 1080p NTSC: 1920x1080, 24Hz, 4:4:4 RGB, 10bit Color, HDR on
  • HDR Movie 4K NTSC: 4K, 24Hz, 4:4:4 RGB, 10bit Color, HDR on
  • HDR Movie 1080p PAL: 1920x1080, 25Hz, 4:4:4 RGB, 10bit Color, HDR on
  • HDR Movie 4K PAL: 4K, 25Hz, 4:4:4 RGB, 10bit Color, HDR on

It's very important to always use 10bit color output when using HDR, otherwise (if 8bit) there will be dithering, which is just ridiculous.

I don't use any 30Hz modes, although most TV shows are shot at that; if I watched TV shows I would have 30Hz profiles too (TV shows are ~29.97Hz in NTSC and 25Hz in PAL). Sidenote regarding 4K: an HDMI 2.0b cable/card can handle up to 4K, 60Hz, 4:4:4 RGB, HDR off, or 4K, 60Hz, 4:2:2 YCbCr, HDR on, which I guess I would use if I ever wanted 4K 60Hz HDR gaming for some reason. This is just a sidenote, though, and I don't think 4K 60Hz HDR has any real purpose, since all HDR content is movies shot at 24, 25 or 30 FPS, at which point you can just use the correct output Hz and run full 4:4:4 RGB colors at 4K. (Yet another totally unrelated sidenote: HDMI 2.1 is pointless for anything except 8K displays. Movies will probably never go above 24/25/30 FPS, since that saves film and also gives the best "filmic/theater" look. Only games need high refresh rates, and currently most graphics cards struggle at 4K 60Hz, which is the HDMI 2.0b limit, let alone HDMI 2.1. Side-rant over.)
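The bandwidth arithmetic behind that HDMI 2.0b sidenote can be sketched as a back-of-the-envelope estimate, not a spec-accurate calculator. It assumes the CTA-861 4K60 total timing of 4400x2250 (including blanking) and roughly 14.4 Gbit/s of usable HDMI 2.0 data rate (18 Gbit/s raw, 8b/10b encoded), and it ignores details like HDMI's 12-bit container for 4:2:2:

```python
# Back-of-the-envelope HDMI bandwidth check -- a sketch, not a
# spec-accurate calculator.
HDMI20_DATA_RATE = 14.4e9  # bits/s usable on HDMI 2.0 (18 Gbit/s raw, 8b/10b coded)

# Average samples per pixel for each chroma format.
CHROMA_SAMPLES = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def link_rate(total_w, total_h, hz, bpc, chroma="4:4:4"):
    """Approximate link bit rate for a total timing (incl. blanking)."""
    return total_w * total_h * hz * bpc * CHROMA_SAMPLES[chroma]

def fits_hdmi20(total_w, total_h, hz, bpc, chroma="4:4:4"):
    return link_rate(total_w, total_h, hz, bpc, chroma) <= HDMI20_DATA_RATE

# CTA-861 4K60 total timing is 4400 x 2250.
print(fits_hdmi20(4400, 2250, 60, 8))            # True:  8-bit RGB fits
print(fits_hdmi20(4400, 2250, 60, 10))           # False: 10-bit RGB (HDR) doesn't
print(fits_hdmi20(4400, 2250, 60, 10, "4:2:2"))  # True:  hence 4:2:2 for 4K60 HDR
```

This is exactly why 4K60 HDR on HDMI 2.0b forces the drop from 4:4:4 RGB to 4:2:2 YCbCr: the extra two bits per component push an RGB signal past the link's budget.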

Anyway... You see? There's a ton of different settings. If we don't adjust all of them, then there's gonna be stutter/judder, or bad colors, etc. It's hell to set up each time.

Any tool which wants to provide excellent automation would have to support:

  • Resolution
  • Refresh Rate
  • Desktop Color Depth (24bit, 32bit), all that typical "TrueColor" jazz...
  • Output Color Depth (8bit, 10bit, 12bit)
  • Output Color Format (RGB 4:4:4 or YCbCr 4:2:2); this is currently provided via the NVIDIA control panel (not by any Windows setting itself), but I hope PowerToys would be able to set it directly via that DDC/CI thing you mentioned...
  • Output Dynamic Range: Full (0-255) or Limited (16-235); again something that only the NVIDIA control panel provides currently; hopefully DDC/CI can control it...
  • HDR on/off

If these controls are all possible (basically as saved profiles), then that would be utterly amazing. Fingers crossed that the controls are possible in a universal way (Intel, NVIDIA, AMD)...

@crutkas
Member Author

crutkas commented Jan 25, 2020

@VideoPlayerCode what are you doing that causes you to swap resolutions, refresh and more so often manually? Understanding the scenarios helps a lot when designing

@yuyoyuppe
Collaborator

yuyoyuppe commented Jan 25, 2020

@VideoPlayerCode oh, thank you for such a detailed input and I feel your pain with setting all these different modes!

Most of the controls you've mentioned are "passively consumed" by the display from the GPU and cannot be set via DDC/CI; e.g., we can't expect to be able to change the resolution from a physical monitor's panel.

It's certainly possible to query their possible/current state using the DDC/CI though, which we could later use as an input to ChangeDisplaySettingsEx in the DEVMODE struct, so it's not an issue.

The things I've noticed lacking in DEVMODE, as you've also noticed, are those which are only available from the NVIDIA control panel. I'm not sure if it's prevalent in the current hardware, but both of my monitors are able to automatically detect and switch between the color formats given the GPU's input. So I guess you'll still have to go to the NVIDIA control panel for those 2 settings to switch GPU's output to the color format/range you need.

There's also a related issue of some manufacturers implementing the DDC/CI protocol over a custom USB-based protocol etc., meaning we can't just use the standard Windows API to access DDC/CI. I guess we could give the tool an extensibility/plug-in system, so hardware manufacturers or enthusiasts can implement these missing features.

As @crutkas said, I would also like to hear your thoughts on the use-cases or possible design preferences.

@Arcitec

Arcitec commented Jan 25, 2020

@VideoPlayerCode what are you doing that causes you to swap resolutions, refresh and more so often manually? Understanding the scenarios helps a lot when designing

Hi. Whenever I play any content from my computer, the primary goals are great clarity and zero stutter/judder. So I have to adjust all settings based on whether I'm playing a game, or watching a 1080p or 4K movie, and whether it's a PAL or NTSC movie, etc.

For clarity, that means setting the display to 4K if the movie is in 4K, or 1080p otherwise (to let my TV's incredible Sony X1 Ultimate upscaler do the upscaling to 4K internally, giving great results which look almost as good as native 4K content). That is why I change between 4K and 1080p output resolution based on the movie I play. As for games, I always play them at 1080p (because I'd rather have 1080p + ultra graphics settings + rock-steady 60fps than janky 4K rendering with lower graphics settings). So that is why I have to change resolution all the time. Summed Up: Output the content at the same resolution as the content was shot at, if your TV has a fantastic upscaler.

Next up, stutter/judder elimination: This requires refresh rate changing. The only way to play movies with zero stutter/judder (smooth panning/smooth motion etc) is to ensure that your output framerate is either the EXACT same as the movie content, OR an integer (2x, 3x etc) multiple of it. For example, 24FPS movies will be stutter-free if played at 24Hz, 48Hz, 72Hz, 96Hz, or at 120Hz (because 24 times 5 = 120, so for example at 120Hz each movie frame would simply be shown 5 times and that ends up with a perfect 120Hz). However, always running your output at 120Hz is not good, because it requires a TON of HDMI bandwidth (forcing you to lower color quality etc), and it only provides stutter-free NTSC content (doesn't work for PAL content since 120/25 = 4.8), AND it basically makes the TV think that the content it's receiving is 120Hz. By always outputting movies/TV shows at the correct 24Hz (NTSC Movies), 25Hz (PAL TV/Movies) and 30Hz (NTSC TV), you ensure that the TV gets the best possible motion fluidity, by allowing it to understand the official content framerate that it's receiving, and thereby easily improve the fluidity via algorithms. So that's why I change output framerate - for fluid motion and to avoid stutter/judder. It makes a huge difference. No more janky "stutter stutter" panning shots or stuttery fast movements (cars etc) in movies. ;-) Summed Up: Use 24, 25, 30 FPS for movies/TV shows, and use 60 or 120Hz for games.
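The integer-multiple rule described above is easy to check in code. A minimal sketch (the function name and tolerance are illustrative; the tolerance exists so NTSC's fractional rates like 29.97 still line up):

```python
def is_judder_free(content_fps, refresh_hz, tolerance=1e-3):
    """True if refresh_hz is (close to) an integer multiple of content_fps,
    so every movie frame is shown for the same number of refreshes."""
    ratio = refresh_hz / content_fps
    return round(ratio) >= 1 and abs(ratio - round(ratio)) < tolerance

print(is_judder_free(24, 120))  # True:  24 x 5 = 120
print(is_judder_free(24, 60))   # False: 60 / 24 = 2.5 -> 3:2 pulldown judder
print(is_judder_free(25, 120))  # False: 120 / 25 = 4.8
print(is_judder_free(25, 50))   # True:  PAL content on a 50 Hz mode
print(is_judder_free(30000 / 1001, 59.94))  # True: NTSC fractional rates match
```

A profile tool could run a check like this against the mode list the display reports, and warn when a chosen refresh rate will judder with the intended content.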

As for bit depth: The vast majority of content, both games and movies, is created as 8bit SDR. So outputting at exactly 8bit SDR allows, yet again, the TV to understand that it is getting 8bit (0-255 value range per color channel) content so that it knows that it can improve the look via its gradient smoothness algorithms (to avoid stairstepping patterns in gradients). If I was outputting 8bit SDR content but had my HDMI output set to 10bit, then Windows/nVidia would just fool the TV into thinking it's looking at 10bit content, when it really isn't, and that would lead to worse gradients. However, there's one purpose for 10bit output, when it comes to HDR content. For HDR, you need 10bit color channels, which gives you a 0-1023 value range per channel, thus letting the HDR content output with proper color values and the optimal clarity without dithering. So that's why I change bit depth between 8bit and 10bit all the time. Summed Up: HDR = use 10bit, everything else = use 8bit.

Lastly, there are two extra pieces, which have less effect, but still matter: "Output Color Format (RGB 4:4:4 or YCbCr 4:2:2)", and "Output Dynamic Range: Full (0-255) or Limited (16-235)".

The Color Format choice is simple: Always aim at using RGB 4:4:4, because it means that every pixel has its own exact color value. However, this costs a lot of bandwidth, which is why you must set the proper refresh rate and resolution (above) to ensure there is enough bandwidth left over to send 4:4:4 colors. If you use 4:2:2, you lose color clarity, although the loss is mainly visible on computer text (which need the per-pixel subpixel colors for antialiasing), and isn't really visible in movies (in fact, the UltraHD 4K BluRay spec uses 4:2:0). Here's a site about color resolution, with a good image demonstrating why you should aim for outputting 4:4:4 colors: https://www.rtings.com/tv/learn/chroma-subsampling. Again, this doesn't really matter for movie content (even 4:2:0 is fine there) but for computer output where you will be seeing text, 4:4:4 ensures great text clarity since the text subpixel antialiasing will work properly.

And finally, Dynamic Range: 0-255 is the obvious choice since it gives the full range of desktop colors, whereas 16-235 was an older, squashed color range made for movies, which isn't proper for game graphics output, desktop output, etc.

So, whew, I dunno if this is what you asked, but this is what every setting does and why I adjust them based on content.

If I can someday automate as much as possible of this, I would be very relieved. All of this takes so many steps in Windows display control panel, hehe.

@Arcitec

Arcitec commented Jan 25, 2020

@VideoPlayerCode oh, thank you for such a detailed input and I feel your pain with setting all these different modes!

You're welcome. Thought it may help to get an overview of settings and why they're valuable.

Most of the controls you've mentioned are "passively consumed" by the display from the GPU and cannot be set via DDC/CI; e.g., we can't expect to be able to change the resolution from a physical monitor's panel.

Oops, I just looked up what DDC/CI really is. I see that it's almost like EDID; basically, the display uses DDC/CI to tell the GPU its display capabilities. So it's display-to-GPU communication. Definitely not what I meant. I thought it was some way to tell the GPU to change its output settings, to reach "hidden" GPU features such as color range, bit depth, etc. Forget what I said earlier about DDC/CI! :-)

It's certainly possible to query their possible/current state using the DDC/CI though, which we could later use as an input to ChangeDisplaySettingsEx in the DEVMODE struct, so it's not an issue.

Sounds like you're thinking of a way to auto-apply profiles based on what the display says via EDID/DDC/CI? Hmm, I would put that on a "future/maybe never" todo-list if I were you, because it doesn't sound useful for the vast majority of people.

The things I've noticed lacking in DEVMODE, as you've also noticed, are those which are only available from the NVIDIA control panel. I'm not sure if it's prevalent in the current hardware, but both of my monitors are able to automatically detect and switch between the color formats given the GPU's input. So I guess you'll still have to go to the NVIDIA control panel for those 2 settings to switch GPU's output to the color format/range you need.

Yeah, I kinda assumed it was a proprietary thing that the NVIDIA Control Panel sends to the GPU via a driver message, rather than a standard thing. Well, as long as I can automate the resolution, the refresh rate (the same thing that can be seen in the Windows Display "Advanced: List all display modes" list, so it should be possible for PowerDisplay to set), and the HDR on/off toggle, that's already a HUGE help (it would avoid having to fiddle with the refresh rate via the List all display modes pane).

Hey, this reminds me of something semi-related: Some people would also love to be able to save display DPI settings in the profiles. That has uses for things like "4K @ 300% for Reading" profiles, etc.

Oh and it would also be useful, if possible, to be able to optionally save (Win+P) "Project" settings in the profile, ie the choices of "PC Screen Only", "Extend", "Second Screen Only", etc. But if that's not possible, it's no big deal at all, since I very easily hit Win+P and set it to "Second Screen Only" to dedicate all GPU power to the external display.

There's also a related issue of some manufacturers implementing the DDC/CI protocol over a custom USB-based protocol etc., meaning we can't just use the standard Windows API to access DDC/CI. I guess we could give the tool an extensibility/plug-in system, so hardware manufacturers or enthusiasts can implement these missing features.

Oh yes, that's a brilliant idea. It opens the door for missing features of all kinds. And maybe even official support by NVIDIA, or at least some reverse-engineered plugin.

Definitely have a plugin/extension feature in mind from day 1 when designing the system, so that it doesn't take a ton of work to add later. If I were you, I'd even design the official core features as a plugin, so that it's all a robust plugin system. Basically, have the core just be a "per-monitor plugin-settings collection" (so the core just handles remembering the displays, and which plugin-settings (if any) to apply for each display), and let each plugin provide the dropdowns/checkboxes/etc for the settings-GUI along with the handling of applying parameters on the displays etc. And then write the standard "per-monitor resolution, refresh rate, HDR, DPI, etc" plugin in that system. That way, other people can add plugins that provide additional settings that are yet again saved per-display. It would be brilliant.

As @crutkas said, I would also like to hear your thoughts on the use-cases or possible design preferences.

Alright, hehe, I hope I provided enough use-case examples for each setting above.

And as for general design ideas, I'm imagining the PowerToys "master app" having a PowerDisplay pane, which has a dropdown with all of the user's saved profiles (with support for systemwide profiles and per-user profiles).

Selecting a profile in the list, or creating a new profile, populates the rest of the GUI with all dropdowns/controls for each setting: Detected Displays (remembered by their EDID display name/serial number) and Per-Display settings (all of the main ones: Resolution, Refresh Rate, etc.). Plus the ability to choose which displays to remember settings for in the profile (i.e. just the "Sony TV", or just the primary monitor, etc.); any display that is "unticked" in the profile would not get modified later when loading that profile. This means multi-display users could have a bunch of per-display profiles, where loading, say, the "Sony display @ 4K" profile would just set the Sony display to 4K but not touch the primary display at all...

Underneath all of that I'm imagining a button to Apply (to make the settings take effect immediately, to easily test them while making profiles) and a button to Save (the profile).

When the user has created a profile using that tool GUI, they'll be able to do four things:

  1. Right-click the PowerToys tray icon, select the PowerDisplay submenu, and click any of their profiles there to instantly apply it.

  2. Send a command like powertoys.exe /powerdisplay "1080p Game" to apply that (named) profile via command line.

  3. Press a keyboard shortcut to get a big on-screen overlay (kinda like the Shortcut Guide) powertoy which lists all profiles and lets the user apply one instantly by clicking that profile.

  4. Bind specific profiles to keyboard shortcuts. I could easily imagine something "rarely used by other apps" like Ctrl+F10/F11/F12 permanently being the way to easily set my most-used profiles.

Basically, having a GUI for saving named profiles containing the desired display settings, and then applying them primarily via the notification tray icon's right-click menu, command line, or keyboard shortcuts.
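A sketch of how such named profiles could be modeled. Every name here (Profile, DisplaySettings, apply_profile, the "Sony TV" key) is illustrative rather than an actual PowerToys API, and the set_mode callback stands in for the platform-specific mode-switching call:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional, Tuple

@dataclass
class DisplaySettings:
    # Settings a profile may pin down; None means "leave untouched".
    resolution: Optional[Tuple[int, int]] = None
    refresh_hz: Optional[float] = None
    bit_depth: Optional[int] = None
    hdr: Optional[bool] = None

@dataclass
class Profile:
    name: str
    # Keyed by a stable display identity (e.g. EDID name/serial).
    # Displays not listed are not touched when the profile is applied.
    displays: Dict[str, DisplaySettings] = field(default_factory=dict)

def apply_profile(profiles, name, set_mode: Callable[[str, DisplaySettings], None]):
    """Look up a profile by name and push each display's settings
    through the platform-specific set_mode callback."""
    profile = next(p for p in profiles if p.name == name)
    for display_id, settings in profile.displays.items():
        set_mode(display_id, settings)
    return profile

profiles = [
    Profile("1080p Game", {"Sony TV": DisplaySettings((1920, 1080), 60, 8, False)}),
    Profile("HDR Movie 4K NTSC", {"Sony TV": DisplaySettings((3840, 2160), 24, 10, True)}),
]

applied = []
apply_profile(profiles, "1080p Game", lambda d, s: applied.append((d, s.resolution)))
print(applied)  # [('Sony TV', (1920, 1080))]
```

Because profiles are looked up by name, the same store could back the tray menu, a command line like `powertoys.exe /powerdisplay "1080p Game"`, and hotkey bindings without duplicating logic.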

These are just some proposals. I hope some of these ideas help inspire you.

@yuyoyuppe
Collaborator

@VideoPlayerCode

To sum up your first reply: since you understand all the technical details, you strive to extract the best possible quality from your content-watching experience, and would like to have all the settings in one handy place.

Since we want to automate this, I'm thinking of having user-defined triggers for various events, e.g. the active window switching to one matching an "mpv" pattern => activate the movie profile, etc.
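That trigger idea could be as simple as a first-match mapping from window-title patterns to profile names. A minimal sketch (the patterns and profile names are made up for illustration):

```python
import fnmatch

# Ordered (window-title glob, profile name) pairs; first match wins.
TRIGGERS = [
    ("*mpv*", "HDR Movie 4K NTSC"),
    ("*Witcher*", "1080p Game"),
]

def profile_for_window(title, triggers=TRIGGERS, default=None):
    """Pick the profile to activate when a window becomes active."""
    for pattern, profile in triggers:
        if fnmatch.fnmatch(title, pattern):
            return profile
    return default  # no trigger matched; keep the current profile

print(profile_for_window("movie.mkv - mpv"))  # HDR Movie 4K NTSC
print(profile_for_window("The Witcher 3"))    # 1080p Game
print(profile_for_window("Visual Studio"))    # None
```

Keeping the trigger table ordered lets the user resolve overlaps explicitly, and returning a sentinel for "no match" avoids thrashing the display mode on every window switch.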

Sounds like you're thinking of a way to auto-apply profiles based on what the display says via EDID/DDC/CI? Hmm, I would put that on a "future/maybe never" todo-list if I were you, because it doesn't sound useful for the vast majority of people.

Nah, I was just ruminating on the technical implementation. The idea is to provide all the switches possible; a user shouldn't care whether we set them via the DDC/CI API or another WinAPI. However, I'd like to have the ability to schedule/smooth transitions between profiles, which would allow features like a hardware blue-light filter at night, etc.

Some people would also love to be able to save display DPI settings in the profiles

Having a DPI setting is certainly a good point.

Basically, have the core just be a "per-monitor plugin-settings collection" (so the core just handles remembering the displays, and which plugin-settings (if any) to apply for each display), and let each plugin provide the dropdowns/checkboxes/etc for the settings-GUI along with the handling of applying parameters on the displays etc. And then write the standard "per-monitor resolution, refresh rate, HDR, DPI, etc" plugin in that system. That way, other people can add plugins that provide additional settings that are yet again saved per-display. It would be brilliant.

Yep, that's my thinking as well. It's only natural to do it that way given the plethora and dynamic availability of EDID settings.

My main concern is to not overcomplicate the GUI with scheduling/triggers machinery. That part definitely needs some thinking through.

Thanks again for your contribution to this.

@DavidGretzschel

DavidGretzschel commented Mar 1, 2020

"show only on [monitor] X" would be good to have on a keybinding, as well.

@crutkas crutkas added this to the Suggested Ideas milestone Mar 9, 2020
@shanselman
Member

Maybe setres.exe? That's in Windows Server Core today.

@crutkas
Member Author

crutkas commented Mar 25, 2020

From #1680 and @shanselman's suggestion, we should think about this with command-line access as well.

@pma9

pma9 commented Jun 12, 2020

Not sure if this was addressed here, or if this could even be fixed by PowerToys, but I have found an issue with multiple displays that I can't figure out. There's a feature in Windows 10 where you can "Choose a presentation display mode." Based on this article and many more, when using "Second screen only" the main display should be turned off, and the main display is set in the Display Settings. Although I have, for example, my monitor number 2 as my "Main Display", the presentation display mode has chosen monitor 1 to turn off instead when choosing "Second screen only".

Not sure if it is going based on the number that it is identified by in the display settings, but the way I might try to reproduce this on someone else's computer is to set the monitor numbered 2 as the "Main Display" and use the hotkey Windows Key + P to set the presentation mode to "Second screen only". You should expect monitor 2 to turn off, and not monitor 1.

I guess why I'm posting this here is that I'd hope this new PowerToys feature will implement an alternative way of choosing which displays to turn on/off.

As I'm writing this, I can see that I should also report this to Microsoft, since it seems like a bug. So I'll go ahead and do that and add a link here. Side note: I have also set the correct "Primary Display" in the NVIDIA Display Settings, and this still occurs.

EDIT: Here's the link. I used the Feedback Hub and added a video for them but not sure if you all can see the video.

@JacobDB

JacobDB commented Jul 8, 2020

Adding to this, I'd absolutely love a way to control brightness on external monitors. There's some software out there that does this, I think via a semi-transparent overlay.

@Jason-GitH

ColorControl incorporates some of these features and is a very handy tool: https://github.com/Maassoft/ColorControl

@Jason-GitH

I made this issue requesting more control over the SDR-to-HDR gamma transfer. I don't know if this would be a better location for it: #29949

@lanceyliao

For reference only:
https://github.com/xanderfrangos/twinkle-tray

@ziasquinn

Also for reference: https://sourceforge.net/projects/monitorswitcher/

I would love this feature in PowerToys. It would make my year.

@create-juicey-app

Any progress happening?

@0Chan-smc

0Chan-smc commented Apr 18, 2024

Any progress? It would be great to see this feature come to PowerToys.
