OpenXR's Controller Poses do not cover the necessary information #118

Open
nimsostudios opened this issue Mar 29, 2022 · 7 comments
Labels: synced to gitlab (A corresponding issue has been filed in the Khronos internal GitLab)

Comments

@nimsostudios

I had originally posted this on Reddit and a suggestion was made to create an issue for my recommendations here.
I've read the docs and have been working on converting some advanced VR avatar body systems to OpenXR.
I've found that the poses provided by the OpenXR spec (the Grip and Aim poses) aren't sufficient for what they are intended to do.

The Aim Pose is of course useful for its purpose, but the Grip Pose is intended to allow positioning the in-world object correctly, and that is not actually possible with the provided data. It is very much controller-specific, which is not viable for supporting future hardware, or hardware that the developer doesn't have access to.
I have a solution to this as described in my video explaining the issue.
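
For reference, these two poses are exposed through the action system. A minimal C sketch, assuming an instance, session, and action set created elsewhere (the handle names and the function itself are placeholders, and error handling is omitted):

```c
#include <string.h>
#include <openxr/openxr.h>

// Sketch: create one pose action each for the grip and aim poses, then
// create XrSpaces that can be located every frame. The suggested bindings
// (.../input/grip/pose and .../input/aim/pose per interaction profile)
// are omitted here.
void create_pose_spaces(XrInstance instance, XrSession session,
                        XrActionSet actionSet,
                        XrSpace *gripSpace, XrSpace *aimSpace)
{
    XrPath leftHand;
    xrStringToPath(instance, "/user/hand/left", &leftHand);

    XrActionCreateInfo actionInfo = {XR_TYPE_ACTION_CREATE_INFO};
    actionInfo.actionType = XR_ACTION_TYPE_POSE_INPUT;
    actionInfo.countSubactionPaths = 1;
    actionInfo.subactionPaths = &leftHand;

    XrAction gripAction, aimAction;
    strcpy(actionInfo.actionName, "grip_pose");
    strcpy(actionInfo.localizedActionName, "Grip Pose");
    xrCreateAction(actionSet, &actionInfo, &gripAction);

    strcpy(actionInfo.actionName, "aim_pose");
    strcpy(actionInfo.localizedActionName, "Aim Pose");
    xrCreateAction(actionSet, &actionInfo, &aimAction);

    XrActionSpaceCreateInfo spaceInfo = {XR_TYPE_ACTION_SPACE_CREATE_INFO};
    spaceInfo.subactionPath = leftHand;
    spaceInfo.poseInActionSpace.orientation.w = 1.0f; // identity offset

    spaceInfo.action = gripAction;
    xrCreateActionSpace(session, &spaceInfo, gripSpace);

    spaceInfo.action = aimAction;
    xrCreateActionSpace(session, &spaceInfo, aimSpace);
}
```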

Please see the following video that explains the issue in detail.
It is my understanding that this video may have already been discussed in a KhronosGroup call, but if not, here is the link.
OpenXR Discussion Video

An additional point to add to the described "Palm Pose" would be to place it parallel to the center of the palm, but instead of at the centroid, place it behind the index finger, where the index finger's first joint begins.

Please inform me if there is a better place to make these kinds of requests.

@rpavlik-bot added the "synced to gitlab" label Mar 31, 2022
@rpavlik-bot
Collaborator

An issue (number 1679) has been filed to correspond to this issue in the internal Khronos GitLab (Khronos members only: KHR:openxr/openxr#1679 ), to facilitate working group processes.

This GitHub issue will continue to be the main site of discussion.

@tangobravo

tangobravo commented Apr 6, 2022

If you're showing a virtual object like a sword with a grip size that is different from the controller, I guess there are two choices for how to line those up: either the centre of the controller (the current grip space), or aligned with the palm. I'm not convinced either would feel particularly "right".

Exposing some information about the size of the controller might be one potential avenue for improvement - developers could then offset the grip space to the palm using that information if they wanted, or alternatively adjust the asset to align the physical controller radius with the virtual representation somehow.

In your video it looks like you're keen to render virtual hands using this new space, and I don't think another static controller-relative pose is quite enough for that use case. Perhaps better to have an extension to allow devices to expose "virtual hands" with joint poses based on controller tracking. With Oculus Touch you'd get the additional states based on the capacitive touch on the inputs, but even a simple rigid controller could provide a fixed hand skeleton in the common "neutral hold" pose expected for that particular controller design?

@nimsostudios
Author

An object aligned to the current Grip Pose will not be positioned correctly unless you know the radius of the controller handle, as you mentioned, and even then it still won't be placed correctly in terms of rotation.

In the video I highlight that the grip is simply not usable in this manner, since you would hold different objects with different alignments in your hand (you don't hold a Vive wand with the same finger tilt as an Oculus controller), so the static Grip Pose doesn't pose the object correctly, and as you can see it definitely doesn't pose the hand either.

In your example of the sword with a grip size different from the controller, the software developer already knows the grip size of the sword (they added the sword), so they can easily offset by that amount from the palm point.

In the case of the Grip Pose, though, even if they knew the radius of the controller, they would then have to offset twice: once towards the palm by the controller radius, then again out of the palm by the sword handle's radius. That is counter-intuitive, as it means the actual Grip Pose has no meaning in the game-world context.
This is in contrast to the Aim Pose, which does have meaning in the game-world context in relation to menus, so it's actually useful, just not for object handling.

Yes, we do already have the skeleton, which will hopefully be correct on all devices, but that's far less likely to be properly implemented by a hardware developer than a single point on the controller. Additionally, the intended purpose of the Grip Pose is to position the in-game object, and that simply isn't accurate using the data we're given, even with additional data about the radius of the controller handle, because you hold different controllers with different orientations.

The software developer already knows the radius of the in-game object's handle, so if you know a point on the palm of the hand, you can position and rotate the object correctly. This is a workable solution, and one that I have been using for years, but only with dedicated profiles for known controllers.
I place the in-game hand according to the controller, and all game functionality is then based on that hand. It looks perfect, along with some advanced grab mechanics that allow the object to tilt in the hand depending on how the hand is tilted, again highlighting that a static object pose relative to the controller is not the way to go.
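
As a rough illustration of that workable solution, here is a hedged C sketch. It assumes, as discussed later in this thread, that the palm pose's +X axis is the palm normal (its sign differs between left and right hands); the helper function, the function name, and the handle radius are made up for illustration, not taken from any API.

```c
#include <openxr/openxr.h>

// Rotate a vector by a quaternion: v' = v + q.w*t + cross(q.xyz, t),
// where t = 2 * cross(q.xyz, v).
static XrVector3f rotate(XrQuaternionf q, XrVector3f v)
{
    XrVector3f t = {
        2.0f * (q.y * v.z - q.z * v.y),
        2.0f * (q.z * v.x - q.x * v.z),
        2.0f * (q.x * v.y - q.y * v.x),
    };
    XrVector3f r = {
        v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
        v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
        v.z + q.w * t.z + (q.x * t.y - q.y * t.x),
    };
    return r;
}

// Place a held object so its handle sits in the hand: offset from the palm
// point along the palm normal by the handle radius the game already knows,
// and reuse the palm orientation (or an authored offset from it).
// Assumes a left hand, where +X points away from the palm; flip the sign
// for the right hand.
XrPosef place_object_in_hand(XrPosef palmPose, float handleRadius)
{
    XrVector3f palmNormal = rotate(palmPose.orientation,
                                   (XrVector3f){1.0f, 0.0f, 0.0f});

    XrPosef object = palmPose;
    object.position.x += palmNormal.x * handleRadius;
    object.position.y += palmNormal.y * handleRadius;
    object.position.z += palmNormal.z * handleRadius;
    // The grip-pose route instead needs the (unknown) controller radius to
    // get from the grip centre to the palm before this same offset, and
    // still gives no controller-independent orientation to reuse.
    return object;
}
```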

If the hardware developer places a pose that gives us the position and logical orientation of a specific point on the palm, this starts us at the hand and we can do anything we need to in game.

You can see the problem in the following image (I copy-pasted the hand, posed for the Oculus controller, to highlight it).
The orientation of the sword (and other objects) should be authored by the game developer, but they don't know the hand orientation, making it impossible to orient consistently, since a different controller could be held differently too.
The developer doesn't actually need to know the orientation of the controller; they just need the hand, or at least a way to get to it.

[image]

@tangobravo

I'm very new to OpenXR btw, just on my first spec read-through before attempting an implementation for our Zapbox product. I just watched your video (cool system btw) and thought I could help distil the 20-minute rant into something slightly more actionable :)

Grip space as currently defined is, to me, the centre of an idealised rod / baton held in the hand in place of the controller. It does have physical meaning, and it seems to me its orientation should be the correct one to use for swords etc., as it should most closely approximate the hand configuration around the grip of the controller.

Your view is that a particular pose on the palm is a more useful reference. I assume you consider it preferable for users to ignore the hand configuration (where the fingers actually are) and the feel of the controller in the hand if the object would usually be held in a different way to the controller. User testing might be needed to confirm or deny that, I suppose.

For some objects (e.g. a pen) the palm reference doesn't seem right either; the natural assumption would be that it is attached more between the thumb and index finger, so to me a single fixed pose still doesn't quite seem flexible enough.

My suggestion on the skeleton is distinct from a fully hand-tracked implementation - it would be an exclusively controller-driven implementation, based on the specific fixed controller geometry and whatever the touch / input states were currently doing, and would provide a more reasonable "virtual hand" baseline for any object positioning.

@Rectus

Rectus commented Jun 24, 2022

That was surprisingly fast. The newest spec update added the XR_EXT_palm_pose extension, and it is already supported in the latest SteamVR beta runtime.

The documentation doesn't seem to have been updated yet, but the extension text is available here:
https://github.com/KhronosGroup/OpenXR-Docs/blob/main/specification/sources/chapters/extensions/ext/ext_palm_pose.adoc
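
Based on the extension text linked above, opting in looks much like the existing grip/aim setup: enable XR_EXT_palm_pose at instance creation and suggest bindings to the new .../input/palm_ext/pose path. A rough C sketch (the function name is a placeholder, error handling is omitted, and the interaction profile is just one example):

```c
#include <openxr/openxr.h>

// Sketch: bind an existing XR_ACTION_TYPE_POSE_INPUT action to the palm
// pose added by XR_EXT_palm_pose. Assumes the instance was created with
// XR_EXT_PALM_POSE_EXTENSION_NAME in enabledExtensionNames.
void suggest_palm_bindings(XrInstance instance, XrAction poseAction)
{
    XrPath interactionProfile, leftPalm, rightPalm;
    xrStringToPath(instance, "/interaction_profiles/oculus/touch_controller",
                   &interactionProfile);
    xrStringToPath(instance, "/user/hand/left/input/palm_ext/pose", &leftPalm);
    xrStringToPath(instance, "/user/hand/right/input/palm_ext/pose", &rightPalm);

    XrActionSuggestedBinding bindings[] = {
        {poseAction, leftPalm},
        {poseAction, rightPalm},
    };

    XrInteractionProfileSuggestedBinding suggested = {
        XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
    suggested.interactionProfile = interactionProfile;
    suggested.suggestedBindings = bindings;
    suggested.countSuggestedBindings = 2;
    xrSuggestInteractionProfileBindings(instance, &suggested);
}
```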

@rpavlik
Contributor

rpavlik commented Jun 24, 2022

:)

We might have been working on it for months already when the video came out... 😉

OK to close this as resolved now? Or is more clarity needed?

@amalon

amalon commented Sep 25, 2022

After a lengthy back and forth about SteamVR's palm pose (which is at least 40 degrees out IMO):
https://steamcommunity.com/app/250820/discussions/3/5234891274094710186

I think a little more clarity is needed about the palm orientation's -Z axis. There is some ambiguity left in straightening the index finger. Clearly the index finger must be perpendicular to the X axis (palm normal), so not stretched right back, but it can still point out at a range of angles in the plane of the palm without the palm itself moving or rotating.

I can think of a couple of interesting options to resolve it:

  1. Specify that the index finger must be relaxed in its rotation about the X axis. When I flex my straightened index finger there is a clear angle about the X axis at which it is more relaxed.
  2. Define it instead based on the (relaxed) straightened middle finger. When I stretch my fingers out away from one another, and back together while keeping them in the plane of my palm (e.g. back of hand & fingers against a flat surface), my middle finger barely moves or rotates at all relative to my palm, which may make it a less ambiguous basis for palm -Z (see the sketch after this list).
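
To make option 2 concrete: it amounts to projecting the middle finger's direction into the plane of the palm and taking that as -Z, while keeping +X as the palm normal. A small C sketch of that construction (the vector helpers and the function are hypothetical, and where the finger direction comes from is left to the runtime):

```c
#include <math.h>
#include <openxr/openxr.h>

static XrVector3f sub(XrVector3f a, XrVector3f b)
{
    return (XrVector3f){a.x - b.x, a.y - b.y, a.z - b.z};
}

static XrVector3f scale(XrVector3f v, float s)
{
    return (XrVector3f){v.x * s, v.y * s, v.z * s};
}

static float dot(XrVector3f a, XrVector3f b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

static XrVector3f normalize(XrVector3f v)
{
    return scale(v, 1.0f / sqrtf(dot(v, v)));
}

static XrVector3f cross(XrVector3f a, XrVector3f b)
{
    return (XrVector3f){a.y * b.z - a.z * b.y,
                        a.z * b.x - a.x * b.z,
                        a.x * b.y - a.y * b.x};
}

// palmNormal: the +X axis of the palm pose.
// middleFingerDir: direction of the relaxed, straightened middle finger.
// Produces a right-handed palm frame with -Z along the projected finger.
void palm_axes_from_middle_finger(XrVector3f palmNormal,
                                  XrVector3f middleFingerDir,
                                  XrVector3f *xAxis, XrVector3f *yAxis,
                                  XrVector3f *zAxis)
{
    *xAxis = normalize(palmNormal);
    // Project the finger direction into the plane of the palm, so any
    // flex toward or away from the palm normal is ignored.
    XrVector3f inPlane = sub(middleFingerDir,
                             scale(*xAxis, dot(middleFingerDir, *xAxis)));
    XrVector3f minusZ = normalize(inPlane);
    *zAxis = scale(minusZ, -1.0f);
    // Complete the right-handed frame: Y = Z x X.
    *yAxis = cross(*zAxis, *xAxis);
}
```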

Also, the diagrams (figure 7 in 1.0.25) have fingers roughly parallel (not relaxed about the X axis), and the one that shows the fingers not quite parallel (bottom middle) has -Z more parallel with the middle finger than with the index finger. Either way, they should be exactly consistent with any new wording, and preferably show fingers relaxed / not quite parallel if they aren't expected to necessarily be so in the definition of the axes.

I think I slightly prefer option 2 here, to reduce the chance of misinterpretation, but would happily accept option 1 if that's how runtimes other than SteamVR already implement it. Thoughts?
