Performance tip for vsync on proprietary nvidia drivers #227

Open · bucaneer opened this issue Aug 17, 2014 · 32 comments
@bucaneer

I struggled for a while trying to get a perfectly tear-free display with the help of compton (#168), but I've just found a way to do it with the nvidia drivers alone. It's a single line added to xorg.conf under Section "Screen":

Option "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"

This removes all tearing even when no compositor is running, at virtually no performance cost. It means that compton itself can go back to simply handling transparency and shadows on its lean --backend xrender, without wasting resources on vsync.

The source claims the option is only available on 600-series and newer Nvidia cards, but I can't see anything to that effect in the driver manual, so I assume it should work on all cards covered by drivers since version 319.23 (basically, the 8000 series and newer).
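For anyone unsure where the line goes, here is a minimal sketch of a complete Screen section (the Identifier/Device/Monitor names are placeholders; keep whatever your existing config already uses):

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    Option         "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
EndSection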

I did some benchmarking with gtkperf. The table shows the total time (in seconds) for a 100-round test:

                                     Old compton config   New compton config   No compton
ForceFullCompositionPipeline = Off   2.76                 1.55                 1.67
ForceFullCompositionPipeline = On    2.73                 1.57                 1.72
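(To reproduce this, the invocation should look roughly like the line below; -a runs GtkPerf in automatic mode and -c sets the repeat count, but double-check against your build's help output:)

# run all GtkPerf tests non-interactively, 100 rounds each
gtkperf -a -c 100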

where old compton config is this:

compton -cCzG -t-3 -l-5 -r4 \
 --config /dev/null --backend xr_glx_hybrid \
 --vsync opengl-swc --vsync-use-glfinish \
 --glx-no-stencil --paint-on-overlay \
 --glx-swap-method 3 --glx-no-rebind-pixmap \
 --xrender-sync --xrender-sync-fence \
 --unredir-if-possible

and new config is this:

compton -cCzG -t-3 -l-5 -r4 \
 --config /dev/null --backend xrender \
 --unredir-if-possible

Only the old config is tear-free when ForceFullCompositionPipeline is off, but all three are when it is on. ForceFullCompositionPipeline itself has no performance impact beyond the margin of error, but dropping the newly redundant compton options provides a significant speed-up. I think it would be good to mention this in the vsync and/or performance guide. (Yes, the vsync guide does say that compton can't perform better than the drivers, but this particular option is hidden from regular users, and the "Sync to VBlank" option in nvidia-settings is limited to OpenGL applications and actually doesn't perform well with compton --backend glx.)

@japanese-bird

I'm a new compton user who was also looking for a way to get rid of tearing with the proprietary nvidia drivers. I tried the option you mentioned, and it completely worked. I'm really glad I happened to find your post; I've never seen it mentioned anywhere in Google searches.

Compton fixed tearing for me; my only problem was that it made the color and detail of video slightly off, but that's another discussion. Thank you.

@richardgv
Collaborator

Hi, bucaneer and japanese-bird,

First of all, sorry for the late reply.

Thanks for the new tip! I've added it to the VSync guide; I imagine it will be greatly helpful for nvidia-driver users struggling with tearing issues. :-)

As a side note, it's been reported that the option causes a huge (~30%) performance loss in some OpenGL applications: https://devtalk.nvidia.com/default/topic/602831/linux/unrecognized-flatpanelproperties-property-quot-scaling-quot-/post/3941157/#3941157

(I've not tested it myself because I don't have any tearing issues.)

@japanese-bird:

Compton fixed tearing for me; my only problem was that it made the color and detail of video slightly off, but that's another discussion. Thank you.

Hmm...

  1. Does it happen with both backends, xrender and glx?
  2. You could check if you have opacity, dimming, or custom shaders applied on the window.
  3. Some drivers could apply special effects on application output. For example, forcing anti-aliasing in your driver may cause the output of compton to become blurry.
  4. Trying different video output in your video player might bring something interesting.
  5. Flipping --paint-on-overlay may change something.

@bucaneer
Author

@richardgv I can't replicate the massive OpenGL performance loss on my system. There is some loss, but it's largely negligible: Unigine Heaven loses 0.8% (20.16 fps vs. 20.00 fps), Unigine Valley 1.9% (30.08 fps vs. 29.49 fps), and Portal 4.7% (193.24 fps vs. 184.19 fps). All tests were run through the Phoronix Test Suite, with results averaged over three runs. In any case, this is still better than what I got with the previous compton-based solution (too lazy to benchmark for hard numbers, but it's intuitively obvious).
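(For reference, rerunning one of these through the Phoronix Test Suite would look something like the line below; the exact test profile name is an assumption, so check phoronix-test-suite list-available-tests first:)

# run the Unigine Heaven benchmark via the Phoronix Test Suite
phoronix-test-suite benchmark pts/unigine-heaven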

@tserhii-gh

@bucaneer thx for the tip, works perfectly on a GT 240 with 331.89, with compton glx and xrender (kwin, compiz, mutter)

@actionless

Confirming on a Dell E6400 (NVIDIA® Quadro® NVS 160M), driver version 340.32.

@bucaneer
Author

Update on performance testing: this clashes nastily with OpenGL vsync if it's enabled in the driver configuration or in a specific application's settings, causing heavy stuttering and low FPS. Perhaps that is what the poster on the nvidia forums saw?
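(One way to test for this clash per application is the proprietary driver's __GL_SYNC_TO_VBLANK environment variable, which overrides the OpenGL "Sync to VBlank" setting for a single process, e.g.:)

# run one OpenGL app with the driver's sync-to-vblank disabled
__GL_SYNC_TO_VBLANK=0 glxgears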

@richardgv
Collaborator

Well, I can't reproduce the ~30% drop in performance, either...

ForceFullCompositionPipeline   compton   gtkperf   glxgears FPS   Unigine Heaven FPS
Off                            (none)    1.43      35000          62.5
Off                            xrender   1.33      13000          61.7
Off                            glx       4.23      10000          61.4
On                             (none)    1.51      24000          62.2
On                             xrender   1.42      14000          61.8
On                             glx       2.62      24000          60.8

(My setup: Single monitor; GTX 670; nvidia-drivers-343.22; compton with compton.sample.conf, and --backend glx --paint-on-overlay --blur-background --glx-no-stencil --glx-no-rebind-pixmap --glx-swap-method buffer-age --xrender-sync-fence for GLX backend; compton-git-v0.1_beta2-47-g8c88b4d-dirty-2014-09-07; gtkperf-0.40; Unigine Heaven 4.0 Free, 1024x768 windowed, extreme, 8xMSAA)

The effect on the FPS of glxgears is... interesting.

And I didn't notice the clash between "Sync to VBlank" in nvidia-drivers and ForceFullCompositionPipeline.

@ubuntuaddicted

I'm curious how to use this in my xorg.conf, since my metamode doesn't look like your example. My entire "Device" section looks like this:

Section "Device"
Identifier "Card0"
Driver "nvidia"
VendorName "NVIDIA Corporation"

#refer to the link below for more information on each of the following options.
Option         "MetaModes"          "1920x1080, 1680x1050"
Option         "ConnectedMonitor"   "DFP-3, CRT-0"
Option         "MetaModeOrientation" "DFP-3 RightOf CRT-0"

EndSection

I have a dual-monitor setup using a GTX 760 and Nvidia binary 343.22. I've had a hell of a time getting my rightmost monitor registered as the primary monitor so that games launch on it instead of on the leftmost monitor, which is not what I want. I'm on XFCE, using Xubuntu 14.04 but with kernel 3.16.

On a side note, when I open nvidia-settings, the checkbox for "make this the primary monitor for the x screen" is always on the leftmost monitor after I boot into the system. Running nvidia-settings with gksudo or sudo, changing the checkbox to the rightmost monitor, and saving to my xorg.conf still results in the wrong monitor being labeled as primary after I reboot. It also changes the way my xorg.conf looks: the Device section ends up looking more like yours, BUT then games launch on the wrong monitor (this is the xorg.conf that launches games on the wrong monitor: http://pastebin.com/eHAcQnY4).

So to get games to launch on the rightmost monitor, I made my xorg.conf look like this: http://pastebin.com/uSA3La2h

To recap: I want a tear-free desktop experience, and I should be able to get one with a GTX 760 and two very capable monitors. I'm using compton as well, by the way; I launch it using:
#!/bin/bash
# launch one compton instance per X screen, :0.0 through :0.3
seq 0 3 | xargs -l1 -I@ compton --config /home/ubu/.config/compton.conf -b -d :0.@

and its .conf file is here: http://pastebin.com/AUv8FZ8Z

Any help to get a tear free desktop would be much appreciated.

@bucaneer
Author

@ubuntuaddicted You should consult the driver manual about this, but if I'm reading it right I think you can make it work by changing this line:

Option         "MetaModes"          "1920x1080, 1680x1050"

to this:

Option         "MetaModes"          "1920x1080 { ForceFullCompositionPipeline = On }, 1680x1050 { ForceFullCompositionPipeline = On }"
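(After restarting X you can confirm the driver actually picked it up; the same query comes up later in this thread:)

# show the metamodes the driver is currently using
nvidia-settings -q CurrentMetaMode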

@richardgv
Collaborator

@ubuntuaddicted:

The MetaModes question has already been answered by bucaneer.

Unfortunately, I don't understand what you were trying to express by describing how you managed to make the games run on the correct monitor.

If you still have VSync problems after enabling ForceFullCompositionPipeline with compton, additional advice is available in the vsync guide.

By the way, doesn't TwinView create a single X screen, so that it should no longer be necessary to launch multiple compton processes (one for each screen)?
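(A quick sanity check for how many X screens a setup actually has; if this reports 1, a single compton instance should be enough:)

# report the number of X screens on the current display
xdpyinfo | grep "number of screens"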

@mmortal03

I wish I could get this to work for me. I'm experiencing tearing, for instance, in Firefox when scrolling vertically (with smooth scrolling enabled). I have an Nvidia GeForce GT 640M (Optimus) in my laptop, and I have no VSync option in the NVIDIA X Server Settings. So I gave the above a shot, and it didn't work. Luckily, I can just run my Intel graphics with compton and solve the tearing issue, but I was hoping to get it working on both graphics chips.

@Nightbane112

@mmortal03 If you're using the Nvidia proprietary drivers, which software are you using: nvidia-prime or bumblebee?

If you're using nvidia-prime, you're out of luck: nvidia-prime causes tearing.
https://devtalk.nvidia.com/default/topic/775691/linux/vsync-issue-nvidia-prime-ux32vd-with-gt620-m-/
https://bugs.launchpad.net/ubuntu/+source/nvidia-prime/+bug/1260128

If you're using bumblebee, just use the primus backend, as it uses the Intel GPU to vsync everything the Nvidia GPU outputs.
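(Concretely, with bumblebee installed, that just means launching OpenGL applications through primus, e.g.:)

# run an app on the Nvidia GPU via the primus bridge
primusrun glxgears
# or force the primus backend explicitly through optirun
optirun -b primus glxgears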

azuwis added a commit to azuwis/ansible-pc that referenced this issue May 4, 2015
@ElTimablo

This does nothing for me on a GTX 770 in Debian Testing with the latest 352.30 driver. Tearing is still rampant and checking "Sync to VBlank" in nvidia-settings does nothing. Ideas?

@ioquatix

I'm using the latest version of compton (Arch Linux compton-git from the AUR), and everything is okay until I have a window with OpenGL content (e.g. glxgears) open. Moving windows around the screen, previously fine, becomes very slow and choppy. Any ideas what I should do to fix this?

                       system         Computer
/0                     bus            Motherboard
/0/0                   memory         15GiB System memory
/0/1                   processor      Intel(R) Core(TM) i7-4770 CPU @ 3.40GHz
/0/100                 bridge         4th Gen Core Processor DRAM Controller
/0/100/1               bridge         Xeon E3-1200 v3/4th Gen Core Processor PCI Express x16 Controller
/0/100/1/0             display        GK104 [GeForce GTX 770]
/0/100/1/0.1           multimedia     GK104 HDMI Audio Controller

@ioquatix

@ElTimablo I appear to have the same problem. Using XRender backend with a GTX 770 and this metamodes option still yields screen tearing.

@ElTimablo

@ioquatix Interesting. I'm getting this on one monitor out of two while running GNOME. The other monitor is fine, and the only real difference between them is that one is on HDMI and the other on DVI. I'm not getting it on my 970, which has both monitors on DVI, so I wonder if the different cables have anything to do with it.

@actionless

Is it possible to sync one video board to more than one display?

@ElTimablo

@actionless No, but I'm thinking that there might be some kind of sync between the two DVI ports that just isn't there between the DVI and HDMI ports. I'm getting another DVI cable in the mail on Monday, so I'll try it out then.

@ElTimablo

So I now have both screens of my dual-screen setup connected via DVI, and the tearing that was previously present on only one monitor is gone. Just an FYI for anyone out there using an HDMI cable and a DVI cable together. With this and the ForceFullCompositionPipeline option set to on, tearing is pretty much gone for me.

@actionless

Are the displays the same model?

@ElTimablo

@actionless They're the same manufacturer, but not the same model. One is an older Acer LCD, and the other is an Acer LED. They're both the same resolution and a similar size, however. It was also the case on my personal desktop (the one I was testing on is my girlfriend's), which has two identical displays.

TL;DR: If you have two displays, hook them both up the same way.

@eazar001

@bucaneer , you're an absolute champion for sharing this information. It worked perfectly with the Nvidia GT730 on my system. Thanks.

@CamilleScholtz

GT570 reporting in, works like a charm, thanks!

@sunnyps

sunnyps commented Apr 16, 2016

The ForceFullCompositionPipeline option regresses applications that synchronize to vsync on their own. E.g. try running a single Chrome/Chromium window on vsynctester.com with and without the option, and you'll see what I'm talking about. It's probably OK for games, benchmark applications, and non-hardware-accelerated media players, which usually try not to do vsync on their own.

@Brottweiler
Contributor

Brottweiler commented Aug 24, 2016

I've been messing around with screen tearing and will write some information.

compton with the glx backend and opengl-swc vsync fixes all screen tearing everywhere. The problem with this is that when an OpenGL window like a game (minecraft) is open, or a video is playing (with mpv, or on youtube, or twitch...), interacting with windows is laggy, stuttery, and slow. Alt+tabbing, moving, and resizing windows is very laggy and not smooth...

Using xrender fixes all that, but then you get screen tearing. Using ForceFullCompositionPipeline seems to fix screen tearing and works very well, except in games. In minecraft there's occasional stutter, as if the framerate dropped and became inconsistent, but the FPS counter never changes; it's always a stable 60 as normal.

Today, after upgrading the nvidia drivers (see below), ForceFullCompositionPipeline makes some windows lag horribly when moving... If all I have open is a terminal, it's smooth. As soon as I start mpv or a browser, moving windows is extremely laggy. With compton running, only my cursor moves, and when I stop, the window jumps to it. Without compton, I can see the window moving, but very slowly, lagging behind my cursor.

I'm not sure if the nvidia update is related, but when I remove ForceFullCompositionPipeline everything is fine again, although I have screen tearing like mad.

[2016-08-24 13:02] [ALPM] upgraded nvidia-libgl (367.35-1 -> 370.23-1)
[2016-08-24 13:02] [ALPM] upgraded libevdev (1.5.2-1 -> 1.5.3-1)
[2016-08-24 13:02] [ALPM] upgraded nvidia-utils (367.35-1 -> 370.23-1)
[2016-08-24 13:02] [ALPM] upgraded lib32-nvidia-utils (367.35-1 -> 370.23-1)
[2016-08-24 13:02] [ALPM] upgraded lib32-nvidia-libgl (367.35-1 -> 370.23-1)
[2016-08-24 13:02] [ALPM] upgraded libxnvctrl (367.35-1 -> 370.23-1)
[2016-08-24 13:02] [ALPM] upgraded nvidia (367.35-2 -> 370.23-1)
[2016-08-24 13:02] [ALPM] upgraded nvidia-ck-haswell (367.35-4 -> 370.23-1)
[2016-08-24 13:02] [ALPM] upgraded nvidia-settings (367.35-1 -> 370.23-1)

This is also what I use for 20-nvidia.conf.

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
    Option         "TripleBuffer" "on"
    Option         "AllowIndirectGLXProtocol" "off"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

So, to conclude:

  • compton, with glx and opengl-swc: lag
  • compton, with xrender: smooth, but with tearing
  • compton, with xrender and ForceFullCompositionPipeline:
    • then: smooth, no tearing, laggy games.
    • now: extreme lag after opening browser or mpv, probably no tearing.

Edit: Actually, the part where ForceFullCompositionPipeline makes windows lag seems to be inconsistent. After opening nvidia-settings, I had no issues. Then I was going to check whether redshift or compton was the issue and kill them, but this time I didn't have any issues to begin with...

Oh, by the way, Nvidia's "Sync to VBlank" in nvidia-settings does literally nothing to solve screen tearing in OpenGL windows. It does not work.

@Brottweiler
Contributor

Brottweiler commented Aug 26, 2016

Update on the above...

Using the xrender backend and --vsync none with compton, and using this X config, I had slowdowns in chrome and in video... and if I just held down a key to repeat in a terminal, it was stuttery.

However, this command works, and if I compare the metamodes using nvidia-settings -q CurrentMetaMode, the difference is shown below.

Starting X after X config file

Attribute 'CurrentMetaMode' (noname:0.0): id=50, switchable=yes, source=xconfig :: DPY-1: nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceCompositionPipeline=On, ForceFullCompositionPipeline=On}

After running command

Attribute 'CurrentMetaMode' (noname:0.0): id=50, switchable=no, source=nv-control :: DPY-1: nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceCompositionPipeline=On, ForceFullCompositionPipeline=On}

For some reason switchable is different; I have no idea why, or what difference it makes. The command I mentioned above is this one:

nvidia-settings --assign "CurrentMetaMode=DPY-1: nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceFullCompositionPipeline=On}"
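(The DPY-1 name is specific to this machine; you can list the display device names your driver knows about with a query like the one below, though check nvidia-settings --help on your driver version:)

# list the display device names the driver knows about
nvidia-settings -q dpys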

Now, what is the difference between the command and the X config? I made an image to show you: http://i.imgur.com/DqRjixY.png

I simply tried to copy paste the nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceFullCompositionPipeline=On} into the X config file, resulting in this:

Section "Screen"
    Identifier     "Screen0"
    Option         "metamodes" "nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceFullCompositionPipeline=On}"
    Option         "AllowIndirectGLXProtocol" "off"
    Option         "TripleBuffer" "on"
EndSection

For what it's worth, in nvidia-settings, I have Sync to Vblank enabled, and Flipping disabled.

Also, it appears that the issue where ForceFullCompositionPipeline would make games stutter doesn't happen on 370.23.

@Brottweiler
Contributor

Brottweiler commented Aug 27, 2016

It seems I celebrated a little too soon; today, after a reboot, the same issue (slowdowns, stuttery terminal) is happening again. It seems to be random: sometimes when I start X it's fine, and sometimes I get the glitchy behavior already described. The difference between the X config and the command is that switchable is yes with the X config and no with the command.

Does anyone have any idea what this switchable thing is, and how I can set it to no via the X config? Actually, it might not matter, since right now, when everything is fine, it's still set to yes.

@licaon-kter

switchable might be related to the Optimus Intel+nVidia laptop stuff.

@Brottweiler
Contributor

Yeah, but I can find no info on how to set that via 20-nvidia.conf. I tried using an Option and adding it to metamodes, but neither works. As I said, it might not even matter, since right now, when everything is fine, it's still set to yes.

@danilw

danilw commented Feb 14, 2017

Let me update this with my findings.
I recorded test videos: https://www.youtube.com/playlist?list=PLzDEnfuEGFHvqKPwXFUi_DPsDvSleldx6 (all at 1080p and 60 fps).

In every video with the option "On" you can see a "bad framerate" compared to the same FPS-counter framerate in the corresponding video with the option "Off".

My conclusion is that the option "drops" every frame that would tear. So whenever the FPS in your browser, game, or video player falls below the monitor's physical refresh rate, you get many tearing frames, and they all get dropped...

In games without vsync, the game renders every frame, tearing included. With the option set to "On" you can see 90 FPS reported in-game but only something like 10-20 "visual FPS". You can see it in this video (option On) https://www.youtube.com/watch?v=KiC2X1C1hZA&list=PLzDEnfuEGFHvqKPwXFUi_DPsDvSleldx6&index=9 compared to "Off" https://www.youtube.com/watch?v=8-Fy91tKHWY&list=PLzDEnfuEGFHvqKPwXFUi_DPsDvSleldx6&index=10

As a side effect, I also get input freezes (keyboard/mouse) during the frame drops (when the option is On).
I couldn't record the input-freeze effect itself, but I recorded an example of when it happens:
https://www.youtube.com/watch?v=svhBMcLx6-c&list=PLzDEnfuEGFHvqKPwXFUi_DPsDvSleldx6&index=6
These are frames from the fifth video in the playlist; at the timestamps shown under the video (on youtube), 00:02, 00:07, 00:32, 00:52, 1:49, you can see "screen tearing".
You might ask: why is there tearing in a recording made with the option "On"?
Because I never see it on the monitor. Those tearing frames are "dropped" by the option, and during each drop all input freezes (for about 0.1 s or less). But ffmpeg keeps recording during that ~0.1 s drop and grabs the frame from system memory, where the tearing frame exists even though it is never displayed on my monitor. I hope that makes sense.
This makes all fast-paced games very hard to play, because even at 60+ FPS with vsync these freezes still happen.

I have been testing this option for more than a year (I even used it full-time for a week; the very bad GUI/xorg responsiveness to input is the main problem for me), and the behavior stays the same. It is not a problem with my hardware, kernel, or xorg; I can see many users on the internet posting the same results I did, and even this github thread confirms it.

So I still play games with vsync and ForceCompositionPipeline = Off. Even with tearing (sometimes; it doesn't always happen, it's random), it is better and smoother than with the option "On".
And I accept tearing in the rest of the UI: a web browser, file manager, or text editor with tearing, even moving windows with tearing, is still better than "dropping tearing frames".

Yes, compositors like compton with vsync make it better... but tearing still exists.

I'll also add a "perfect test" for tearing that doesn't need games:
http://codepen.io/anon/pen/rjoGwB or http://liveweave.com/67C4N2 (both links contain the same javascript slider); save it locally or open it live from the web.
Open two or more browser windows (not tabs), maximize them, and launch the slider in each.
You'll see tearing when ForceCompositionPipeline = Off, or "tearing drops" when the option is "On".
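(For example, with firefox that could look like this; substitute your browser of choice:)

# open the slider test in two separate browser windows
firefox --new-window "http://codepen.io/anon/pen/rjoGwB" &
firefox --new-window "http://codepen.io/anon/pen/rjoGwB" &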

raku-cat referenced this issue in DelusionalLogic/compton Apr 11, 2018
I don't want to support XRender, so lets just cut all the shit. Welcome
to NeoComp
@mikkorantalainen

I ended up putting just

nvidia-settings --assign "CurrentMetaMode=nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"

at the end of my .xsessionrc. This way I don't need any system-wide configuration, and I don't need to hardcode my output or resolution.
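(A slightly more defensive variant of that .xsessionrc line, in case the session sometimes starts without the proprietary driver present:)

# apply the composition pipeline only when nvidia-settings exists
if command -v nvidia-settings >/dev/null 2>&1; then
    nvidia-settings --assign "CurrentMetaMode=nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
fi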

@ricardoscotia

FWIW, this xorg line addition resolved the issue for me: NVIDIA GeForce 1050 on Debian 9 using MATE/compton (with the GPU compositor). The symptoms were the browser being much slower, typing sometimes stalling in the browser, and the computer generally running hotter.
