didn't manage to get hand tracking working #394
hand-tracking-quest2.webm

Sure, the motion in this vid is done on a Quest 2; the recording is from desktop. URL: https://naf-examples.glitch.me/tracked-controllers.html. Did it just now. Would love to know what's going on with your setup, but Vincent and I have never been able to replicate your issues, unfortunately; we'd need new info.
The binary curling of the fingers in the video, the fact that there are not two hands but one hand with one controller, and finally the filename, tracked-controllers, make me believe there is a misunderstanding. Does this example require controllers? I have only tried without controllers, i.e. relying on Quest hand tracking: physical hands being tracked with its video cameras, not physical controllers.
Ah! This component is not compatible with the actual hand-tracking API. I realize the naming is confusing, but that's because it inherits the naming from the A-Frame component that predates the hand-tracking API. In other words, this is a networked version of A-Frame's hand-controls component. Specifically, the hand-controls component generates "hands" with gestures that match certain button-click configurations. This component networks those gestures, and further supports a 'controller' model as an option that shows the actual button and joystick clicks instead of the faux hand and faux gestures. In both cases, though, yes, controllers are required.

I've wanted to make a component that networks the skeleton pieces from the hand-tracking API (I think that would be magical and great), but I never got around to it. But I'm glad that clears it up. In this video demo, one hand is rendered as a controller and one is rendered as a hand, just to show both display options, but both are indeed controllers.

You noted the binary finger curling, btw. While that is indeed a feature of how this is designed, it is again an artifact of inheriting from the hand-controls component. A better-designed component would use the precision button inputs (which are not binary on the grip and trigger) to map more nuanced gestures, but that's significantly more complex, and I decided not to tackle it, as it's not my forte. This component just triggers built-in animations. Ideally there would at least be transition animations from every gesture to every gesture, but those are also not available, nor are the transitions especially smooth. Still, in spite of those limitations, I think this component is pretty great.

Now that you mention it, going back and building that is pretty tempting. I have always wanted that feature, and hand tracking on the Quest 2 has gotten much better than it used to be.
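To illustrate the "more nuanced gestures from precision inputs" idea, here is a minimal sketch. The function name, thresholds, and gesture labels are all illustrative assumptions, not the hand-controls API; hand-controls itself only does the binary mapping described above.

```javascript
// Hypothetical sketch: map analog trigger/grip values (each in [0, 1])
// to gesture names, instead of the binary curl hand-controls uses.
// Thresholds and gesture names are assumptions for illustration.
function classifyGesture(trigger, grip) {
  const pressed = (v) => v > 0.9;
  const partial = (v) => v > 0.1 && v <= 0.9;

  if (pressed(trigger) && pressed(grip)) return 'fist';
  if (pressed(grip) && !pressed(trigger) && !partial(trigger)) return 'point';
  if (pressed(trigger) && !pressed(grip) && !partial(grip)) return 'thumbup';
  // Intermediate values would need blended transition animations,
  // which (as noted above) hand-controls does not provide.
  if (partial(trigger) || partial(grip)) return 'partial-curl';
  return 'open';
}

console.log(classifyGesture(1.0, 1.0)); // fist
console.log(classifyGesture(0.0, 0.0)); // open
console.log(classifyGesture(0.5, 0.0)); // partial-curl
```

The hard part is not this classification but the animation side: driving smooth transitions between poses rather than snapping between built-in clips.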
How would one go about starting with just hand position? I'm thinking a sphere would again be a good start. It's not as precise as a hand model with position, rotation, and all joints, but I believe it's enough for some fun behaviors, including waving, pointing at objects, etc.
Working example https://twitter.com/utopiah/status/1711375479335227530 if others are interested. One can't "just" use a template with the hand-tracked entity, as it does not have a pose itself; instead one has to traverse through it to get the wrist (and optionally other joints), then make an invisible entity and network that one. An indirect solution with some overhead, but it seems to work.
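The traversal part of that workaround can be sketched in plain JavaScript. A-Frame/THREE specifics are stubbed out here so the logic stands alone; the joint name 'wrist' and the proxy-object shape are assumptions, and in practice the proxy would be an invisible `<a-entity>` carrying the `networked` component.

```javascript
// Depth-first search through a hand skeleton represented as nested
// objects with { name, children } (as a THREE.Object3D tree would be).
function findJoint(node, name) {
  if (node.name === name) return node;
  for (const child of node.children || []) {
    const hit = findJoint(child, name);
    if (hit) return hit;
  }
  return null;
}

// Copy the wrist position onto a proxy object; the proxy is the thing
// that would actually be networked, not the hand-tracked entity itself.
function syncProxyToWrist(skeletonRoot, proxy) {
  const wrist = findJoint(skeletonRoot, 'wrist');
  if (!wrist) return false;
  proxy.position = { ...wrist.position };
  return true;
}

// Minimal fake skeleton for illustration
const hand = {
  name: 'hand',
  children: [
    { name: 'wrist', position: { x: 0.1, y: 1.4, z: -0.3 }, children: [] },
  ],
};
const proxy = { position: { x: 0, y: 0, z: 0 } };
syncProxyToWrist(hand, proxy);
console.log(proxy.position); // { x: 0.1, y: 1.4, z: -0.3 }
```

Running this sync every tick and networking only the proxy is where the overhead mentioned above comes from, but it keeps the networked entity simple: just a position (and optionally rotation) like any other entity.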
disentangling from #381
Could someone please share a working example with both URL and video recording of a real session (no emulator)?