
Output width/height wrong with Nvidia borders/underscan #3708

dj-foxxy opened this issue May 27, 2019 · 10 comments

I'm submitting a…

[x] Bug
[ ] Feature Request
[ ] Documentation Request
[ ] Other (Please describe in detail)

Current Behavior

  1. Via nvidia-settings, configure the outputs so there are borders around the image (sometimes called underscan, or offset). E.g.:
DPY-1: 2880x1620 @2684x1509 +0+0 {
  ViewPortIn=2684x1509,
  ViewPortOut=2684x1509+98+55
}
  2. i3 does not detect the correct output size of 2684x1509; instead it has 2880x1620 (i.e., ignoring the underscan). As a result, windows and bars overflow to the right and bottom. I can't move my mouse outside the visible screen, indicating that X knows the right size.

Note: Borders are handled correctly by Awesome, Qtile, and XMonad.

peter@taffer ~ % i3-msg -t get_outputs | python -m json.tool
[
    {
        "name": "xroot-0",
        "active": false,
        "primary": false,
        "rect": {
            "x": 0,
            "y": 0,
            "width": 4800,
            "height": 1620
        },
        "current_workspace": null
    },
    {
        "name": "DP-0",
        "active": true,
        "primary": false,
        "rect": {
            "x": 0,
            "y": 0,
            "width": 2880,
            "height": 1620
        },
        "current_workspace": "1"
    },
    {
        "name": "DP-3",
        "active": false,
        "primary": false,
        "rect": {
            "x": 2880,
            "y": 0,
            "width": 1920,
            "height": 1080
        },
        "current_workspace": null
    }
]
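To make the mismatch concrete, here is a small Python sketch that picks out the size each active output reports. The JSON above is embedded as a string for illustration; normally you would pipe it from `i3-msg -t get_outputs`:

```python
import json

# The get_outputs reply from above, embedded for illustration.
reply = '''
[
  {"name": "xroot-0", "active": false, "rect": {"x": 0, "y": 0, "width": 4800, "height": 1620}},
  {"name": "DP-0",    "active": true,  "rect": {"x": 0, "y": 0, "width": 2880, "height": 1620}},
  {"name": "DP-3",    "active": false, "rect": {"x": 2880, "y": 0, "width": 1920, "height": 1080}}
]
'''

def active_rects(raw):
    """Return {name: (width, height)} for every active output."""
    return {o["name"]: (o["rect"]["width"], o["rect"]["height"])
            for o in json.loads(raw) if o["active"]}

# DP-0 reports the full 2880x1620 raster rather than the 2684x1509 ViewPortIn.
print(active_rects(reply))
```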

Expected Behavior

After configuring borders/underscan, i3 would detect the correct display size that takes into account the borders and all windows and bars would fit on screen.

Reproduction Instructions

  1. Nvidia card and drivers
  • Driver version: 430.14
  • Card: GeForce GTX 970M
  2. Configure some borders. E.g.:
$ nvidia-settings --assign CurrentMetaMode='2880x1620 @2684x1509 {ViewPortIn=2684x1509, ViewPortOut=2684x1509+98+55}'
  3. Look at the right and bottom of windows and bars
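For clarity, the geometry strings in that MetaMode can be unpacked with a short Python sketch (the regex and the `parse_viewport` helper are mine, purely for illustration, not part of any NVIDIA or i3 tool):

```python
import re

def parse_viewport(spec):
    """Parse an X geometry string like '2684x1509+98+55' into
    (width, height, x_offset, y_offset)."""
    m = re.fullmatch(r"(\d+)x(\d+)\+(\d+)\+(\d+)", spec)
    if m is None:
        raise ValueError(f"not a geometry spec: {spec!r}")
    return tuple(int(g) for g in m.groups())

# The ViewPortOut above: a 2684x1509 visible image, offset 98 px right and
# 55 px down inside the 2880x1620 raster, leaving a border on every side.
w, h, x, y = parse_viewport("2684x1509+98+55")
print(w, h, x, y)
```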

Environment

Output of i3 --moreversion 2>&-:

Binary i3 version:  4.16.1 (2019-01-27) © 2009 Michael Stapelberg and contributors
Running i3 version: 4.16.1 (2019-01-27) (pid 3792)
Loaded i3 config: /home/peter/.config/i3/config (Last modified: Mon 27 May 2019 13:42:56 BST, 665 seconds ago)

The i3 binary you just called: /usr/bin/i3
The i3 binary you are running: i3

Tested with no config, the Arch stock config, and a custom config; the behaviour is exactly the same.

Logfile URL: https://logs.i3wm.org/logs/5737248813219840.bz2
- Linux Distribution & Version: Arch Linux, updated
- Are you using a compositor (e.g., xcompmgr or compton): no

@i3bot added the bug and 4.16 labels May 27, 2019

@dj-foxxy (Author) commented May 27, 2019

I'm just poking around, so might be completely wrong.

i3 gets its output size from XCB monitor info: https://github.com/i3/i3/blob/next/src/randr.c#L549

Whereas Qtile uses CRTC info:
https://github.com/qtile/qtile/blob/master/libqtile/core/xcbq.py#L367

I read in a StackOverflow answer (https://stackoverflow.com/questions/36966900/xcb-get-all-monitors-ands-their-x-y-coordinates) that monitor info relates to the physical display, whereas CRTC info relates to the effective output size.

I don't know enough about X, i3, or Qtile to know if I'm on to something.

@Airblader (Member) commented May 27, 2019

We used to get it from the CRTC info as well, but this was changed with the move to support RandR 1.5. If this is the cause (I haven't verified, and also can't, since I have no nVidia card), I wouldn't be sure what to do about it.

@dj-foxxy (Author) commented May 27, 2019

@Airblader Just compiled and ran this example piece of software that uses CRTC info, and it detected the correct output size. I think this might be the issue.

I'm going to try to extend the example to show the monitor info side-by-side to see if there's a difference.

@dj-foxxy (Author) commented May 27, 2019

The issue does not occur if I use RandR 1.4 instead of RandR 1.5 by running i3 with the --disable-randr15 flag.
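For anyone else hitting this, the workaround is to pass that flag wherever your session starts i3, e.g. in ~/.xinitrc (the path is illustrative; adjust for your display manager):

```
# Fall back to RandR 1.4 CRTC queries, which report the underscanned size correctly
exec i3 --disable-randr15
```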

@Airblader (Member) commented May 27, 2019

Thanks for testing that. The way I see it, this is either a bug in RandR or intended behavior; in either case I don't think we can do much more than give you the flag to disable RandR 1.5.

It would be a good reason to keep this flag around, though. @stapelberg Do you have thoughts on this?

@stapelberg (Member) commented May 27, 2019

This should be reported to nVidia as a bug—their driver should populate the RandR 1.5 structures with the correct info, too.

We can hold on to the flag until the bug is fixed.

@dj-foxxy (Author) commented May 27, 2019

@Airblader No problem.

Please keep the flag around until the issue is resolved. We use i3's IPC extensively so I really don't want to port everything to another WM.

@stapelberg I don't know much about RandR, but I'm not sure this is a bug. CRTCs model the part of the total display being used, and monitors model the total display size (of which some might not be used: underscan). When I did my tests, the values for each were correct, but i3 should be sizing itself on the part of the monitor being used, not the total. I can't verify this because I can't find any documentation for the XCB API (well, not any with descriptions).

@Airblader (Member) commented May 27, 2019

Yeah, we won't be in a rush to remove the flag after this. I wouldn't mind keeping it altogether, we do also still have Xinerama support. ;-)

@stapelberg (Member) commented May 27, 2019

@stapelberg I don't know much about RandR, but I'm not sure this is a bug. CRTCs model the part of the total display being used, and monitors model the total display size (of which some might not be used: underscan). When I did my tests, the values for each were correct, but i3 should be sizing itself on the part of the monitor being used, not the total. I can't verify this because I can't find any documentation for the XCB API (well, not any with descriptions).

In my understanding, this is a bug. CRTCs (cathode ray tube controllers; consider how old that API is…) were usually synonymous with monitors, until 4K/8K/… monitors needed two panels in one chassis. The monitor API abstracts individual CRTCs away, so it’s only reasonable to assume that programs should only have to query that API, not both.

Please report this to nVidia and see what they say.

@dj-foxxy (Author) commented May 27, 2019

@stapelberg Fair enough, I don't really know how the API works :).

I've posted the issue in the Linux forum on the Nvidia site. I'm not sure that's where it's meant to be but it's a start:

https://devtalk.nvidia.com/default/topic/1052682/linux/randr-monitor_info-wrong-width-amp-height-when-using-underscan/
