
Predicted Display/Vsync Timestamp #1211

Closed
Squareys opened this issue Jul 1, 2021 · 9 comments · Fixed by #1230

@Squareys
Contributor

Squareys commented Jul 1, 2021

There has been discussion about what the timestamp passed to various callbacks should represent (#347 and #943), from which the consensus was to keep it consistent with the window rAF, as the predicted display/vsync time would be better communicated through the frame (XRFrame.timing.predictedDisplayTime, for example).

@blairmacintyre pinged on this last, but the issue has since been closed, only half resolved: the consistent-rAF part was addressed, the predicted display time was not.

However, neither the timestamp passed to the rAF nor performance.now() is useful for the app to produce smooth animations. Here's an attempt at explaining why:

Assume a programmatically animated car driving along a straight road, moved by 1/90*speed every frame. If a frame drops, the car moves slower; the animation is frame-rate dependent. So we use a delta time to the last frame instead: dt*speed. But dt (computed from either the rAF start timestamp or performance.now()) can deviate from the vsync interval by up to one vsync interval, depending on how regularly the browser schedules the rAF and whether we are hitting performance targets. This causes noticeable judder, especially when performance is not perfect (which practically never happens; even then a frame will occasionally be dropped).
But effectively we are trying to produce the image for the next vsync, a point in time that VR drivers/SDKs already predict fairly well for the most part. We want a time in the future instead of playing catch-up with previous frames.
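
A minimal sketch of the three approaches, assuming a hypothetical predicted display time value that is not part of the WebXR spec under discussion (the names are illustrative only):

```ts
// Sketch of the judder problem described above; `hypotheticalPredictedDisplayTime`
// is an assumed value, not part of the WebXR spec discussed in this issue.
// WebXR DOM types (XRSession, XRFrame) via e.g. @types/webxr are assumed.
declare const session: XRSession; // an active immersive session

const SPEED = 5; // car speed in meters per second
let carPosition = 0;
let lastTime: number | undefined;

function onXRFrame(time: DOMHighResTimeStamp, frame: XRFrame): void {
  // (a) Fixed step, frame-rate dependent: a dropped frame visibly slows the car.
  //     carPosition += (1 / 90) * SPEED;

  // (b) Delta time from the rAF timestamp (or performance.now()): dt can
  //     deviate from the vsync interval by up to one interval depending on
  //     how the browser schedules the callback, which shows up as judder
  //     even when the app itself is hitting its frame budget.
  if (lastTime !== undefined) {
    const dt = (time - lastTime) / 1000;
    carPosition += dt * SPEED;
  }
  lastTime = time;

  // (c) What this issue asks for: evaluate the animation at the predicted
  //     display (vsync) time of *this* frame, so the image matches the
  //     moment it is actually shown:
  //     carPosition = SPEED * (hypotheticalPredictedDisplayTime / 1000);

  session.requestAnimationFrame(onXRFrame);
}

session.requestAnimationFrame(onXRFrame);
```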

Vsync on the web, if I understand correctly, has been a headache for web games and graphics developers for quite some time now (w3c/html#785); there are even tools to measure how this still does not work well (https://www.vsynctester.com/). I believe getting this right is probably even more important for WebXR, as the judder is so much more perceivable on VR headsets.

CC @cabanier

@cabanier
Member

cabanier commented Jul 2, 2021

Reading the GitHub discussion you linked to and looking at the Oculus API, it seems we should indeed provide a predicted display time, and almost everyone was on board.
@toji, do you remember why it didn't make it into the spec?
@thetuvix, I'm assuming OpenXR has something similar?

@toji
Member

toji commented Jul 20, 2021

It's primarily been concerns about making sure that whatever value we surface is: a) universal and b) useful. I think the latter is the larger sticking point. Syncing the headset-provided numbers up with the values that you can get today in JavaScript may be difficult, and ensuring developers recognize what those numbers actually mean is tricky. For example, it would be really easy to look at a predictedDisplayTime number and say, "Oh, so if I just do predictedDisplayTime - performance.now() that tells me how much time I can spend rendering." and that's definitely not the case.

That said, I have no qualms about adding such a value if we can demonstrate a clear way for developers to effectively utilize it.
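
To make that concern concrete, here is a small sketch of the anti-pattern, with a hypothetical predictedDisplayTime parameter; this is exactly the arithmetic that should not be relied on:

```ts
// The misuse described above, sketched with a hypothetical `predictedDisplayTime`.
// The two values may live in different clock domains/epochs, and the time between
// frame submission and display belongs to the compositor anyway, so this
// difference is NOT a usable render budget.
function misleadingRenderBudget(predictedDisplayTime: number): number {
  return predictedDisplayTime - performance.now(); // looks plausible, isn't
}
```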

@cabanier
Member

This value is definitely used in native code to provide smooth animations.
It seems it would be an error on the developer's part to assume they can spend that much time on rendering.

@Squareys
Contributor Author

It seems it would be an error on the developer's part to assume they can spend that much time on rendering.

Definitely agree. I hope that developers are roughly aware that quite a bit of time (an uncertain amount) is required between submit and vsync.

demonstrate a clear way for developers to effectively utilize it.

Smooth animations, as pointed out by @cabanier, or smoother physics updates are the most important use cases here. This timestamp gives "the point in time at which the frame will be shown", ergo, the time that the animation state should correspond to.

I wonder if timestamp - performance.now() could, while not allowing one to infer how much time is left for rendering, allow inferring that the current frame is going to be dropped, e.g. if it is negative or below some small value.
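
A rough, purely illustrative sketch of that heuristic; it assumes the hypothetical predictedDisplayTime shares an epoch with performance.now(), which later comments argue should not be guaranteed:

```ts
// Illustrative only: assumes a hypothetical `predictedDisplayTime` that is
// comparable to performance.now(), which later comments explicitly question.
const SAFETY_MARGIN_MS = 2; // arbitrary threshold, would need per-device tuning

function frameLikelyDropped(predictedDisplayTime: number): boolean {
  const remaining = predictedDisplayTime - performance.now();
  // If the predicted display time is already in the past (or nearly so),
  // this frame will almost certainly miss its vsync.
  return remaining < SAFETY_MARGIN_MS;
}
```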

Is there something I can do to move this forward? E.g. sketch out a possible example for immersive-web-samples?

@cabanier
Member

If you make a proposal, the Oculus Browser could ship it as an experimental API.
If you could then demonstrate how it improves the experience, you will have a strong argument to have it as part of the specification.

@thetuvix
Contributor

In OpenXR, there is an explicitly separate notion of "time" and "system time":

  • "Time" is represented by XrTime and is an XR-specific notion defined by a given runtime. This is explicitly decoupled from the system's clock to account for clock drift, especially when the XR display is a separate device connected, perhaps wirelessly, to the host running the app.
  • "System time" maps most closely to things like performance.now(). This is QueryPerformanceCounter time on Windows and timespec monotonic time on Unix. System time is never directly returned by any functions in OpenXR, except those linked here that explicitly convert from XRTime to/from a given system time representation.

I encourage any XR time representation we use here to allow for that same decoupling of the frame time for the XR display from the time on the host running the UA and its JavaScript threads.

We could then consider further APIs to convert those XR times to/from other web time domains (e.g. to match performance.now() or such), if we find we need them. However, I strongly discourage having the core WebXR frame timing API directly expose times that are already converted into familiar web time domains, as that would encourage web engines to just use that time domain for managing their WebXR frame timing, defeating the runtime's clock-drift adjustments and reducing rendering quality. For example, when ensuring smooth animations, it is correct for WebXR engines to diff a time derived from OpenXR's XrTime, but not a time converted to OpenXR's "system time".
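
A sketch of that recommendation, with hypothetical names: keep the predicted display time in its own XR time domain and only ever diff values from that domain, rather than mixing them with performance.now():

```ts
// Hypothetical `predictedDisplayTime` living in an XR-specific time domain
// (analogous to OpenXR's XrTime), deliberately not comparable to performance.now().
let lastPredictedDisplayTime: number | undefined;

function onPredictedDisplayTime(predictedDisplayTime: number): void {
  if (lastPredictedDisplayTime !== undefined) {
    // Correct: diff two values from the same XR time domain, preserving the
    // runtime's clock-drift adjustments.
    const frameDeltaMs = predictedDisplayTime - lastPredictedDisplayTime;
    advanceAnimation(frameDeltaMs);
  }
  lastPredictedDisplayTime = predictedDisplayTime;

  // Incorrect: mixing time domains defeats the runtime's drift handling.
  // const bogus = predictedDisplayTime - performance.now();
}

function advanceAnimation(dtMs: number): void {
  // Update scene/physics state by dtMs milliseconds.
}
```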

For example, it would be really easy to look at a predictedDisplayTime number and say, "Oh, so if I just do predictedDisplayTime - performance.now() that tells me how much time I can spend rendering." and that's definitely not the case.

I would consider not using DOMHighResTimeStamp to represent XR time, and to explicitly disallow a UA from defining XR time using the same time domain/epoch as things like performance.now(), specifically to discourage patterns like that which could sort of work on one device and then fail on another. At minimum, we should have a SHOULD NOT there.

@cabanier
Member

I would consider not using DOMHighResTimeStamp to represent XR time, and to explicitly disallow a UA from defining XR time using the same time domain/epoch as things like performance.now(), specifically to discourage patterns like that which could sort of work on one device and then fail on another. At minimum, we should have a SHOULD NOT there.

I think that's really all we can do here. If authors choose to abuse this API, there's not much we can do about it...
At least for Quest, if they assume that they can stay in the rAF until the display time, they are going to miss every frame.

@cabanier
Member

@Squareys if you want to discuss this during a meeting you can tag this issue with /agenda

@Manishearth added this to the Pre-CR milestone Jul 27, 2021
@Manishearth
Contributor

/agenda To discuss this at the next meeting

@Squareys would you be willing to drive the discussion for this? @AdaRoseCannon can send you an invite for the meeting. Otherwise I guess Rik or Brandon can drive this.

probot-label bot added the agenda (Request discussion in the next telecon/FTF) label Jul 27, 2021
@AdaRoseCannon removed the agenda (Request discussion in the next telecon/FTF) label Aug 10, 2021
@Manishearth linked a pull request Oct 5, 2021 that will close this issue