Discussion: Resampling of mouse event series to match framerate #1730

Open
MMulthaupt opened this issue Jul 11, 2020 · 1 comment

Comments

@MMulthaupt

The mouse driver reports movement deltas at set intervals of n milliseconds, leading to poll rates such as 125 Hz (8 ms interval) or 500 Hz (2 ms interval). (Source) Unless this poll rate perfectly coincides with the frame rate of the application (which practically never happens), frames won't always have the same amount of mouse movement information available. Depending on how the application uses the mouse deltas, this can lead to visual stuttering, e.g. when mouse deltas drive the pitch and yaw of the view in a first-person game. That is a problem.
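The mismatch is easy to see in numbers. A minimal sketch (rates taken from the figures above; the function is hypothetical, not part of any real API) counts how many 125 Hz mouse reports land inside each 120 Hz frame:

```python
# Hypothetical rates from the discussion: a 125 Hz mouse poll rate
# (8 ms interval) feeding a 120 Hz frame rate (~8.33 ms interval).
POLL_HZ = 125
FRAME_HZ = 120

def events_in_frame(n, poll_hz=POLL_HZ, frame_hz=FRAME_HZ):
    """Number of mouse reports whose timestamp falls inside frame n (1-based).

    Event k arrives at k/poll_hz and frame n ends at n/frame_hz, so the
    reports received by the end of frame n number floor(n*poll_hz/frame_hz).
    Integer arithmetic sidesteps floating-point boundary issues.
    """
    return n * poll_hz // frame_hz - (n - 1) * poll_hz // frame_hz

counts = [events_in_frame(n) for n in range(1, 121)]  # one second of frames
print(min(counts), max(counts), sum(counts))  # → 1 2 125
```

Most frames receive one report, but every 24th frame receives two — exactly the irregular extra delta that shows up as stutter when the deltas are applied raw.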

The idealized solution to this problem is to delay the processing/"consumption" of mouse deltas by the same amount of time as the poll interval, interpolating as required. (A real-world application would probably want to make this optional, given how heavily low response times are marketed.) E.g. with a poll rate of 125 Hz and a frame rate of 120 Hz, we'd want to consume 100% * 125/120 = 104.167% worth of mouse deltas per frame. The excess delta received once every 120/(125−120) = 24 frames would allow for this.
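A minimal sketch of that delayed-consumption idea (all names hypothetical; it also assumes each delta can be timestamped on arrival, which is not actually available, per the obstacles below): each reported delta is spread uniformly over its poll interval, and each frame consumes movement up to one poll interval in the past, linearly interpolating the event that straddles the cutoff:

```python
class DeltaResampler:
    def __init__(self, interval):
        self.interval = interval  # mouse poll interval, e.g. 1/125 s
        self.events = []          # (start, end, dx): dx spread over the interval

    def push(self, t, dx):
        """Record a delta reported at time t, covering the preceding poll interval."""
        self.events.append((t - self.interval, t, dx))

    def consume(self, now):
        """Return the movement due by (now - interval).

        The deliberate one-interval delay guarantees the event straddling
        the cutoff has already arrived, so we interpolate, never extrapolate.
        """
        end = now - self.interval
        total, remaining = 0.0, []
        for t0, t1, dx in self.events:
            if t1 <= end:
                total += dx                      # fully inside the window
            elif t0 >= end:
                remaining.append((t0, t1, dx))   # not yet due
            else:
                frac = (end - t0) / (t1 - t0)    # straddles the cutoff
                total += dx * frac
                remaining.append((end, t1, dx * (1.0 - frac)))
        self.events = remaining
        return total
```

With 125 Hz input and a 120 Hz consumer, each call returns roughly 125/120 of one report's worth of movement once the pipeline is primed, at the cost of one poll interval of added latency.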

There are, however, some obstacles to this approach:

  1. There is no information available on when a mouse event arrived. We simply receive zero or more events whenever we poll for events.
  2. Because no mouse move event is triggered when there is no mouse movement, neither the poll rate nor the phase offset of an ongoing mouse event series can be determined automatically.

I realize some might consider this the territory of over-engineering. I'd still be interested to hear any thoughts on what could be done to solve the problem, maybe even considering the approach presented above.


ezdiy commented Jul 15, 2020

> Because no mouse move event is triggered when there is no mouse movement, neither the poll rate, nor the phase offset of an ongoing mouse event series can be determined automatically.

The thing is that this is driver and OS dependent. When you sleep in vsync (or crunch because the renderer thread is actually busy), a mouse tick still comes in. You may - or may not - see it during the next poll. Edge trigger vs. level trigger. As far as I've seen, edge triggering in single-threaded renderers is generally what tends to generate stutter in practice, not the presence of phase desync as such.

Generally the solution to this is to do everything possible to consume input in a separate thread, so as to ensure the renderer sees the level, not just the edges. There you can also apply any relevant smoothing and phase lock if you really want to, but the jitter one is dealing with is so minuscule it seems of dubious benefit to me.
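A minimal sketch of that split (hypothetical names, with Python's threading lock standing in for whatever synchronization the real input thread would use): the input thread folds every delta into an accumulator, and the renderer drains it once per frame, so it always sees the integrated level rather than whichever edges happened to land in its poll window:

```python
import threading

class InputAccumulator:
    def __init__(self):
        self._lock = threading.Lock()
        self._dx = 0.0
        self._dy = 0.0

    def add(self, dx, dy):
        """Called from the input thread for every mouse event."""
        with self._lock:
            self._dx += dx
            self._dy += dy

    def take(self):
        """Called once per frame from the render thread; drains the level."""
        with self._lock:
            dx, dy = self._dx, self._dy
            self._dx = self._dy = 0.0
            return dx, dy
```

No movement is lost or double-counted: however late the renderer shows up, take() returns everything accumulated since the previous frame, which is exactly the level-trigger behaviour described above.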

As for the context of GLFW, this would mean investigating support for a split-threaded setup (input on one thread, GL/VK/DX on another), which is pretty messy stuff. Some approaches opt to keep the main thread dumb, just pulling a command list from a queue fed by a separate, CPU-heavy renderer thread. Or just scrap all that legacy and use VK, where the coupling to the context is not as ill-defined (separate threads are free to feed the I/O context directly).
