Hand gesture tracking controls #4345

Closed
wants to merge 4 commits into from

Conversation

@takahirox (Contributor) commented Jun 21, 2021

This is a project from the hack week. I think it is still at the proof-of-concept stage, but there may be devs who are interested in it, so I would like to share the code so far as a draft PR.

Screenshot

Room._.App.by.Company.-.Google.Chrome.2021-06-18.18-53-19.mp4

Features

  • AI recognizes your hands in the webcam feed and maps the hand gesture onto the avatar's hands
  • Users will be able to control the avatar's hands even without a VR headset

Implementation

  • Imports Google's MediaPipe Hands and its helpers
  • Downloads the WebAssembly binaries and models on demand from a CDN
  • MediaPipe runs locally and recognizes the hand gesture from the webcam stream every animation frame (or every Nth frame)
  • Hubs handles the MediaPipe gesture tracking as user input and applies it to the avatar's hands
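The landmark-to-input step could be sketched roughly as below. This is a heuristic for illustration, not the actual code in this PR: the landmark indices follow MediaPipe Hands' published 21-point layout, but `fingerCurl` and its curl ratio are assumptions.

```javascript
// MediaPipe Hands reports 21 landmarks per hand as normalized {x, y, z}
// points; the indices below follow its published hand-landmark layout
// (0 = wrist, then four joints per finger from knuckle to tip).
const FINGER_JOINTS = {
  thumb:  [1, 2, 3, 4],
  index:  [5, 6, 7, 8],
  middle: [9, 10, 11, 12],
  ring:   [13, 14, 15, 16],
  pinky:  [17, 18, 19, 20],
};

function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Curl heuristic (an assumption, not MediaPipe API): a finger is curled
// when its tip is close to the wrist relative to the distance of its base
// knuckle from the wrist. Returns a value in [0, 1].
function fingerCurl(landmarks, finger) {
  const wrist = landmarks[0];
  const [mcp, , , tip] = FINGER_JOINTS[finger].map((i) => landmarks[i]);
  const ratio = dist(tip, wrist) / dist(mcp, wrist);
  // ratio is roughly 2 for an extended finger and < 1 when fully curled.
  return Math.min(1, Math.max(0, (2 - ratio) / 1.5));
}

// Collapse one hand's landmarks into per-finger curl values that an input
// system could map onto avatar hand animation.
function handPose(landmarks) {
  const pose = {};
  for (const finger of Object.keys(FINGER_JOINTS)) {
    pose[finger] = fingerCurl(landmarks, finger);
  }
  return pose;
}
```

A per-finger curl scalar is deliberately coarse; it is enough to drive a blended hand animation without networking full joint transforms.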

Remaining tasks and challenges

  • Finger animation and network support. If I'm right, we don't yet support finger skeleton animation or networking it; instead we use an animation id. If we start to support the WebXR Hands API (Hand Tracking for supporting sign language #2812), we will need to send finger transforms. We could then reuse that finger animation networking system for these hand gesture tracking controls.
  • This feature should be moved behind a preference, although currently it's enabled on desktop by default
  • Evaluate the performance and accuracy
  • Where should the webcam stream preview be placed? Currently it's at the top right
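On the networking task above, one cheap interim option (a sketch under assumptions; the field order and byte layout here are hypothetical, not Hubs' actual wire format) would be to quantize per-finger curl values to one byte each rather than sending full finger transforms:

```javascript
// Hypothetical packing scheme: five per-finger curl floats in [0, 1],
// quantized to one byte each, so hand state could ride along with
// existing avatar network updates at 5 bytes per hand.
const FINGERS = ["thumb", "index", "middle", "ring", "pinky"];

function packHandPose(pose) {
  const buf = new Uint8Array(FINGERS.length);
  FINGERS.forEach((finger, i) => {
    // Clamp to [0, 1] before quantizing to 0..255.
    buf[i] = Math.round(Math.min(1, Math.max(0, pose[finger])) * 255);
  });
  return buf;
}

function unpackHandPose(buf) {
  const pose = {};
  FINGERS.forEach((finger, i) => {
    pose[finger] = buf[i] / 255;
  });
  return pose;
}
```

Full per-joint transforms (as the WebXR Hands API would provide) need a richer format, but a quantized-curl payload would let this proof of concept network hands today.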

Notes

  • This feature could easily be extended to other trackings, such as face angle and emotion tracking, as @keianhzo suggested

Issue is synchronized with this Jira Task

@netpro2k (Contributor) commented Nov 1, 2022

This was a hackweek project. Since then, hand tracking has started to become available in the WebXR API, so if we want to implement this we should probably do so through that. To get the effect of this PR, though, we would need some sort of polyfill or browser extension to take camera data and convert it to WebXR hand input. Could be interesting to try. Closing this PR, but keeping the branch around for reference.

@netpro2k netpro2k closed this Nov 1, 2022
@takahirox takahirox deleted the HandGesture branch November 1, 2022 01:18