feature request: possible to limit number of prerendered frames? #536

Closed
aufkrawall opened this issue Aug 7, 2018 · 10 comments

Comments

@aufkrawall

Hello,
I noticed that radv shows lower input latency than the proprietary drivers of both AMD and Nvidia in some games. I suspect radv limits the number of frames that are allowed to be rendered ahead, which would be a good thing: the AMD DX11/9 driver on Windows does the same, and the Nvidia DX11/9/OGL driver can be configured to do so.

Would it be possible for dxvk to limit the number of frames rendered ahead to one, to reduce input latency with the Nvidia driver as well?

@doitsujin
Owner

doitsujin commented Aug 7, 2018

This is already possible with the dxgi.maxFrameLatency knob, as explained on the Wiki page for the new configuration file. You'll need to run the latest master, though, as the changes only landed today.
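For reference, a minimal dxvk.conf sketch using that knob (placed in the game's folder next to the executable, as discussed later in this thread; the value 1 is just the example from this issue):

```
# dxvk.conf, next to the game executable
# Allow at most one frame to be queued ahead of the GPU
dxgi.maxFrameLatency = 1
```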

@aufkrawall
Author

Thanks for pointing me to it and sorry for the inconvenience.

@MartinPL
Contributor

MartinPL commented Aug 8, 2018

Have you observed any positive changes related to this?

@aufkrawall
Author

After trying out a value of 1 in Hitman, I don't think I noticed an effect. It feels quite laggy compared to Windows DX11 with a prerender limit of 1, or to DX12.

@doitsujin
Owner

doitsujin commented Aug 8, 2018

What's your GPU? I'm hearing a lot of complaints about exceptionally poor performance on Nvidia and Vega GPUs.

@aufkrawall
Author

I have a GTX 1070 Ti, and a few days ago I had an RX 560 installed for testing purposes.

With Nvidia I get, for example, around 70 fps in a representative scene in Hitman at 1440p. Once the shader compilation stutter is over it runs okay, but the input latency is quite high. I don't notice a difference between dxgi.maxFrameLatency = 1 and = 3, unlike on Windows with the driver's DX11 prerender limit. With radv and the RX 560, the input latency was always low without further adjustments.
With Nvidia, I tried storing dxvk.conf in the game's folder and also setting its path via a global environment variable; neither made a difference.
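A sketch of the second approach, assuming the environment variable meant here is DXVK_CONFIG_FILE, which is how DXVK picks up a config file outside the game directory (the config path and game executable below are placeholders):

```sh
# Assuming DXVK_CONFIG_FILE is the variable in question;
# path and executable name are placeholders.
export DXVK_CONFIG_FILE=/home/user/dxvk.conf
wine HITMAN.exe
```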

@aufkrawall
Author

The mouse jitter issue in wine-staging is really nasty; it clearly makes Hitman worse than regular wine. But subjectively I'd still say the latter doesn't feel as direct as Windows at similar fps (I have my window compositor disabled).
Still, it's very playable at least, and a much better choice than the OGL port.

@doitsujin
Owner

You can build wine-staging without the server-send_hardware_message patchset, which should solve the mouse issue.
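A rough sketch of how that exclusion might look with wine-staging's patch installer, assuming the usual patchinstall.sh workflow of that era (paths are placeholders; -W excludes a named patchset):

```sh
# Hypothetical sketch: apply all wine-staging patchsets except
# server-send_hardware_message, then build wine as usual.
git clone https://github.com/wine-staging/wine-staging.git
cd wine-staging
./patches/patchinstall.sh DESTDIR=/path/to/wine-source --all \
    -W server-send_hardware_message
```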

@aufkrawall
Author

Yeah, I saw that, but thanks anyway. :)

Is anything known about a problem with mouse sensitivity changing at high fps? I already noticed this a year or so ago in D3D9 games (e.g. Counter-Strike: Source or Global Offensive): when the fps gets too high, the mouse sensitivity suddenly increases a lot.
I could also provoke this in Hitman at 720p: when I look at the sky and the fps goes up, so does the sensitivity. I haven't yet worked out the magic fps threshold at which this happens, or whether there is any connection to the display refresh rate.

@aufkrawall
Author

My mouse input issues are gone with wine-staging 3.15; some other users have already confirmed this as well:
ValveSoftware/Proton#147 (comment)
