Allow session to specify refresh rate #1193
To recap some of the conversation that we had at the end of today's IW call: modern hardware can pick and choose from multiple framerates to run at, so "high frame rate" is not particularly meaningful anymore. (On a Quest 2, for example, it could mean 72Hz, 90Hz, or 120Hz.) It's hard for content to know what rate is optimal given the range of hardware a page may run on. It would be ideal if authors could request a framerate switch mid-session. Oculus hardware can do this, and it's not expensive; I'm not sure whether other hardware can as well. A critical case is smoothness of video playback: developers want to choose a framerate that is an even multiple of their media's frame rate. We could specify the desired rate as "I want a multiple of X" if we're concerned about exposing actual framerates. But Jeff Gilbert at least stated that he'd be okay exposing a list of supported rates once a session is started, given the user consent requirements to get to that point. Multiple people agreed with that sentiment. Then we ran out of time.
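The "even multiple of the media's frame rate" heuristic above can be sketched as a small helper. This is illustrative only, not a proposed API; the function name and the fallback behavior are assumptions.

```javascript
// Illustrative helper (not a spec API): given the refresh rates a headset
// supports and a media clip's frame rate, pick the lowest supported rate
// that is an even multiple of the media rate. Falls back to the current
// rate if no supported rate divides evenly.
function pickPlaybackRate(supportedRates, mediaFps, currentRate) {
  const multiples = supportedRates
    .filter((rate) => rate % mediaFps === 0)
    .sort((a, b) => a - b);
  return multiples.length > 0 ? multiples[0] : currentRate;
}

// On Quest 2-like hardware (72/90/120Hz), a 24fps film lands on 72Hz
// (72 = 3 x 24), while 30fps video lands on 90Hz (90 = 3 x 30).
console.log(pickPlaybackRate([72, 90, 120], 24, 90)); // → 72
console.log(pickPlaybackRate([72, 90, 120], 30, 90)); // → 90
```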
Thanks for the writeup!
These should likely be on an
We should definitely have some way to know what the duration of the current frame is - that's a simple oversight we should correct no matter what we do here.

I could see us adding explicit framerate enumeration and setting - however, I am concerned about compat in the logic apps write to make these selections. Note that for similar mode negotiations for camera sensors, there is a more at-a-distance request the app makes where it prioritizes what factors matter to the app - that feels more similar to the "prefer multiple of X (e.g. 24)" approach.

I am curious to hear what use cases folks have in mind beyond aligning to a multiple of the video framerate, which makes sense to me. If this is primarily about performance tuning, do we expect apps to smoothly move up and down the chain of framerates when they find themselves missing frames? Do we have enough performance timing APIs that apps can access in WebGL to know if they are well over or under their allotted frame times? Basically, I'd love to see an example outside of the video-matching example of how someone would use such an API in practice. I suspect in practice we would see:

If there are many other such good scenarios, we could decide to accept sites hardcoding device-specific framerates as the cost of doing business - however, if video playback is the only real-world scenario we're lighting up here, we may get a better result with a more targeted API.
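On the performance-tuning question above, one plausible app-side loop steps down the rate when measured frame times blow the budget, and steps up only with clear headroom. Everything here (names, the 0.8 headroom threshold) is a hypothetical sketch, not from any spec:

```javascript
// Hypothetical sketch: if recent average frame times exceed the budget for
// the current rate, step down to the next lower supported rate; step up
// only if we could also comfortably make the next rate's budget.
function chooseNextRate(supportedRates, currentRate, avgFrameTimeMs) {
  const sorted = [...supportedRates].sort((a, b) => a - b);
  const budgetMs = 1000 / currentRate;
  const index = sorted.indexOf(currentRate);
  if (avgFrameTimeMs > budgetMs && index > 0) {
    return sorted[index - 1]; // missing frames: back off
  }
  // The 0.8 margin is an arbitrary illustrative threshold.
  if (index < sorted.length - 1 && avgFrameTimeMs < (1000 / sorted[index + 1]) * 0.8) {
    return sorted[index + 1];
  }
  return currentRate;
}

console.log(chooseNextRate([72, 90, 120], 90, 13)); // over 11.1ms budget → 72
console.log(chooseNextRate([72, 90, 120], 72, 5));  // lots of headroom → 90
```

An app could run this against frame-time measurements; whether WebGL timing APIs give a reliable enough signal is exactly the open question in the comment above.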
For the purposes of media frame rate matching I generally prefer a "prefer multiple of X" approach, I think, but the immediate problem that comes to mind is developers saying "I prefer a multiple of 120" in order to provoke higher framerates, and then checking the reported or observed framerate to see if it worked. If we allow that, then we might as well simply let them pick from a list of available rates.
Looking at how this is used on the Oculus platform, afaik we do the dynamic scaling to match media framerate.
I agree that that's a risk, but we're there today anyway, where people just optimize for the most popular device :-(
I agree with this. So, let's provide a method to get the list of available refresh rates and then a setter.
If that is the case, it is a fingerprinting concern. Dynamic (during-session) switching definitely would be great, as there are many cases where it could be useful.
Yes, that would be the problem. If these devices exist, we would need another property that can be set at creation time. Maybe that could be the "framerate multiple" for media.
As a first pass, let's add a
In a second pass, we could add support to report what frequencies are supported.
What is the key scenario we're aiming to enable here? Is it aligning the framerate to match video? If so, setting a framerate directly with no enumeration seems like it will result in the wrong behavior:
Introducing a framerate setter without enumeration seems worse than doing nothing before we figure out framerate enumeration, since we will have lots of content start to hardcode framerates for divergent purposes without any means to do the right thing. To the fingerprinting concern, perhaps there are two separate features here?
We could start with the dynamic version for now if it solves more scenarios folks are actually chasing down, and see if there is a need on some hardware for the before-session version.
Aligning video frame rate is certainly important, but the more immediate problem is managing performance. Some users have very simple scenes and want to be able to show higher frame rates, while others have complex scenes and would rather lower (or dynamically lower) the frame rate to avoid falling behind, which will cause the compositor to duplicate frames.
I'm unsure if we should provide support for this. Authors can query/set the frame rate after the session is created.
Do you have a proposal? I guess it could be a simple list of values... Also, I believe the editors have mentioned that they want to go to CR, so they don't want changes to the spec anymore. I'm fine with putting this in a level 2 version.
Proposal:

`frameRate` -> set/return the current frame rate
Do you mean that it would be lower in case the system can't keep up? |
Also, framerate reflects the refresh rate of the compositor. That one will always make frame rate, even if the experience can't keep up.
One note here is that not all systems will be able to synchronously switch from one frame duration to another. Even if the system can switch instantly, it's likely too late for the frame that is current when you might set a new rate.

The model I'd imagined here is more like:

```webidl
partial interface XRFrame {
  attribute double duration; // displayInterval?
}

partial interface XRSession {
  undefined requestFrameRate(unsigned short? frameRate);
  readonly attribute Uint16Array? supportedFrameRates;
}
```

Expressing the effective frame interval as a per-frame `duration` attribute handles that asynchrony. Note that the UA is already allowed to adjust the frame interval as needed even without any new app control here. If the app wants to keep its animations correct in the face of those changes, it should already be subtracting successive frame timestamps rather than assuming a fixed interval.
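A per-frame duration also gives apps a robust way to step animations when the interval changes underneath them. A minimal sketch, with the duration values faked since the attribute is only a proposal; `advanceAngle` is an illustrative helper, not a real API:

```javascript
// Stepping an animation by each frame's actual duration keeps motion
// correct across a mid-session rate switch; stepping by a constant
// amount per frame would not.
function advanceAngle(angleDeg, degreesPerSecond, frameDurationMs) {
  return (angleDeg + degreesPerSecond * (frameDurationMs / 1000)) % 360;
}

// Two frames at 90Hz followed by two at 72Hz: total elapsed time is
// 2/90 + 2/72 = 0.05s, so a 180 deg/s spin advances 9 degrees in total,
// regardless of the rate switch happening mid-animation.
let angle = 0;
for (const ms of [1000 / 90, 1000 / 90, 1000 / 72, 1000 / 72]) {
  angle = advanceAngle(angle, 180, ms);
}
```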
I didn't write the prose yet, but I was envisioning that it'd be similar to
Wouldn't there be a danger that authors would assume that they are allowed to take that long to execute the frame? Don't we also need an attribute that returns the current frame rate, or do you think that's not needed?
Are there any UAs that dynamically adjust the framerate?
If there are two modes of frame rate, specific provided and UA driven, this might lead to a bad experience. The ability to specify a "preferred target framerate" would instruct the platform that the app wants, say, 120, but it is up to the platform to decide whether it can manage such a framerate or not.
I would assume that platforms currently decide by their own means what target frame rate is best for the browser. This spec can provide a way for developers to specify a "preferred target" from a list of available ones. But the way that frame rate is managed should still be dependent on the UA and their platform specifics.
I'm fine calling it
I'm unaware of any VR browsers that are doing this. Maybe you are thinking of the regular 2D browser? If so, that is a different use case and likely shouldn't be under the browser's control.
Regular WebXR is a single layer that is drawn at the device's refresh rate. Projection layers continue to have this requirement.
VR platforms such as Windows Mixed Reality can have logic in the compositor itself to fall back to a stable lower frame rate if the current app is not hitting a steady frame cadence. This can happen in any app, including WebXR browser backends, and pages need to accept whatever frame timing they are given.
That's an interesting point - is this session rate updating the device's actual refresh/composition rate, or just the rate at which the app is asked to produce projection layer frames for all layers? I believe it's the former, as that is what enables smoother video if something like 72Hz is chosen. This would affect the prose we write here.
Should this happen when the author sets a new rate?
For Oculus, we'll always apply the requested frame rate, but we'll drop to a lower one when the session is blurred.
It is the rate at which the author should produce frames. In most headsets that is the same as the device's, but for Magic Leap, for example, it is half of the actual device rate.
I'm unsure if we need to call out this special case. For example, animations or media won't be smooth if the author can request 72fps but the device ends up reprojecting to 120fps.
/agenda discuss the PR to query and control the device's frame rate
Setting the frame rate to an unsupported value currently doesn't throw or report an error; instead, it is silently ignored.
Can we change it so the spec guarantees that the framerate is updated at the next rAF?
Thinking about it some more, maybe it's better to make the framerate read-only and reflect the current frame rate.
Device platforms will have various reasons that the frame rate needs to change that are out of the app's control. For example, on HoloLens 2, the frame rate drops from 60 to 30 when the user asks the system to start a video recording. Apps cannot presume that they will always get the frame rate they requested, and so I wouldn't tie any caveat here to whether the target frame rate is still at the default.
Some devices may not be able to switch frame rates that quickly - that seems like a heavy requirement for every device. The idea of a target frame rate seems reasonable on the session.

```webidl
partial interface XRFrame {
  attribute double duration; // displayInterval?
}
```

It feels weird to talk about the "frame rate" of a given frame, so this may be a better representation that is useful even for apps that don't change the target frame rate, to better understand any changes imposed by the underlying system for a given frame.
Given this, maybe we need to change the proposal:

```webidl
partial interface XRSession {
  readonly attribute long? targetFrameRate;
  undefined setTargetFrameRate(long targetFrameRate);
  readonly attribute Uint16Array? supportedFrameRates;
  attribute EventHandler ontargetframeratechange;
}
```

This addresses the feedback I've gotten so far. A drawback is that it's more complicated.
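To illustrate why a change event matters independent of app requests: the system itself can move the rate (e.g. the HoloLens 2 capture case mentioned earlier), and the app just rescales its budgets when notified. The session object below is a mock of the proposed surface, not a real API; `_systemSetRate` stands in for a platform-initiated change:

```javascript
// Mock of the proposed event surface (names mirror the proposal above,
// but none of this is a shipped API).
class MockSession {
  constructor() {
    this.targetFrameRate = 60;
    this.ontargetframeratechange = null;
  }
  // Platform-initiated change the app never asked for.
  _systemSetRate(rate) {
    this.targetFrameRate = rate;
    if (this.ontargetframeratechange) this.ontargetframeratechange(rate);
  }
}

const session = new MockSession();
let frameBudgetMs = 1000 / session.targetFrameRate;
session.ontargetframeratechange = (rate) => {
  frameBudgetMs = 1000 / rate; // rescale the per-frame time budget
};
session._systemSetRate(30); // e.g. user starts a system video capture
```

The app-side pattern is the point: treat `targetFrameRate` as advisory state that can change at any time, and hang all derived budgets off the event.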
Can we do this as a separate issue? Measuring how much headroom there is so authors can tune their experiences should be its own thread. (I thought we already had an issue, but I can't seem to find it...)
To elaborate on #963, did we ever discuss whether the author can specify a refresh rate?
The Oculus Quest currently has a non-standard setting during session creation for low, mid, and high refresh rates, which is not ideal.
Could we make it so the author can state their preferred refresh rate, and then the UA can choose to honor that?
For instance, if the author specifies 50 but the UA can only go as low as 60, 60 would be picked.
I'm unsure how we would go about integrating this into WebXR. Should we allow changing the refresh rate at runtime, or at session creation time?
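The "author specifies 50, UA picks 60" behavior described above amounts to snapping the preference to the nearest supported rate. A hypothetical sketch (the helper name is assumed, not from any spec):

```javascript
// Snap a preferred rate to the nearest entry in the supported list;
// ties resolve to the earlier entry. Illustrative only.
function clampToSupported(preferred, supportedRates) {
  return supportedRates.reduce((best, rate) =>
    Math.abs(rate - preferred) < Math.abs(best - preferred) ? rate : best);
}

console.log(clampToSupported(50, [60, 72, 90, 120]));  // → 60
console.log(clampToSupported(100, [60, 72, 90, 120])); // → 90
```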