
How to customize HPTK with a custom model-based hand tracking? #3

Closed
f6ra07nk14 opened this issue Dec 28, 2020 · 4 comments

@f6ra07nk14

I want to implement a simple AR demo with a physics engine and model-based hand tracking.
My model-based hand tracking comes from MediaPipe Hand Tracking, and I know I need to implement a basic runtime for it.
I have read the wiki, but I still lack some information; for example, which class needs to be inherited, or which functions I need to re-implement.

Thanks in advance.

@jorgejgnz
Owner

I have never used MediaPipe Hand Tracking, but I will try to explain in detail how to add support for new devices and hand-tracking data sources.

The HPTK Input module requires an array with fixed length and fixed order where each item contains the position and rotation of one bone. InputController will postprocess these values and apply them to the master hand according to the rig mapping specified in InputModel.

The class of the items of the aforementioned array is AbstractTsf, which contains position, rotation, scale and space of application (world or local). It's called AbstractTsf because it contains the spatial information of a Transform without being connected to any GameObject.
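A rough sketch of such a container (the field names here are inferred from the description above, not copied from the HPTK source):

```csharp
using UnityEngine;

// Illustrative sketch: the spatial data of a Transform, detached from any GameObject.
public class AbstractTsfSketch
{
    public Vector3 position;
    public Quaternion rotation;
    public Vector3 scale;
    public Space space; // Space.World or Space.Self (local)

    public AbstractTsfSketch(Vector3 position, Quaternion rotation, Vector3 scale, Space space)
    {
        this.position = position;
        this.rotation = rotation;
        this.scale = scale;
        this.space = space;
    }
}
```

Keeping the data detached from a GameObject means the input layer can be filled by any tracking source without touching the scene hierarchy.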

The class InputDataProvider contains and updates the array that is accessed by the Input module (InputDataProvider.bones). InputModel.inputDataProvider holds a reference to a component that inherits from the InputDataProvider class.

To add support for a new device, you need to create a new script that inherits from InputDataProvider and overrides its UpdateData function.

InputController will call the UpdateData function every frame, before accessing the array. InputController will update position and rotation for InputModel.wrist and will only update rotation for the rest of the bones.

This is the ordered list of bones:

0 - wrist
1 - forearm

2 - thumb0
3 - thumb1
4 - thumb2
5 - thumb3

6 - index1
7 - index2
8 - index3

9 - middle1
10 - middle2
11 - middle3

12 - ring1
13 - ring2
14 - ring3

15 - pinky0
16 - pinky1
17 - pinky2
18 - pinky3

19 - thumbTip
20 - indexTip
21 - middleTip
22 - ringTip
23 - pinkyTip
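The fixed order above could be captured as an enum for readability (an illustrative sketch, not part of HPTK itself):

```csharp
// Illustrative enum mirroring the fixed bone order listed above.
public enum HptkBoneIndex
{
    Wrist = 0, Forearm = 1,
    Thumb0 = 2, Thumb1 = 3, Thumb2 = 4, Thumb3 = 5,
    Index1 = 6, Index2 = 7, Index3 = 8,
    Middle1 = 9, Middle2 = 10, Middle3 = 11,
    Ring1 = 12, Ring2 = 13, Ring3 = 14,
    Pinky0 = 15, Pinky1 = 16, Pinky2 = 17, Pinky3 = 18,
    ThumbTip = 19, IndexTip = 20, MiddleTip = 21, RingTip = 22, PinkyTip = 23
}
```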

In your overridden UpdateData function you will have to go through InputDataProvider.bones and read or calculate, at least, the world position and rotation of the wrist (.bones[0]) and the world or local rotation of the rest of the bones.
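A minimal sketch of such a provider might look like this. Only InputDataProvider, bones and UpdateData come from the explanation above; MediaPipeHandSource and its methods are hypothetical stand-ins for however you expose MediaPipe landmarks in Unity:

```csharp
using UnityEngine;

// Sketch: feeding MediaPipe hand-tracking data into HPTK's fixed bone array.
public class MediaPipeInputDataProvider : InputDataProvider
{
    // Hypothetical source of MediaPipe landmarks, already converted to Unity space.
    public MediaPipeHandSource source;

    public override void UpdateData()
    {
        if (source == null || !source.handDetected) return;

        // The wrist (bones[0]) needs world position AND rotation.
        bones[0].position = source.WristWorldPosition();
        bones[0].rotation = source.WristWorldRotation();

        // The remaining bones only need a rotation (world or local).
        for (int i = 1; i < bones.Length; i++)
        {
            bones[i].rotation = source.BoneWorldRotation(i);
        }
    }
}
```

Since MediaPipe reports landmark positions rather than joint rotations, the BoneWorldRotation step would typically build each rotation from the directions between adjacent landmarks.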

What if your hand tracking data source provides data for fewer bones? You can skip those items so they keep their original values, or you can pass constant values for position and/or rotation.

What if your hand tracking data source provides offset rotations? Add a Quaternion variable to your InputDataProvider and apply it to every obtained rotation. Then tweak this variable from the Inspector until the bone rotations are correct on the master hand.
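The correction could be applied like this (the field and method names are chosen here for illustration, not taken from HPTK):

```csharp
using UnityEngine;

// Sketch: correcting a constant rotational offset coming from the tracking source.
public class OffsetCorrection : MonoBehaviour
{
    // Tweak this in the Inspector until the master-hand bone rotations look correct.
    [SerializeField] private Vector3 offsetEuler = Vector3.zero;

    public Quaternion Apply(Quaternion trackedRotation)
    {
        // Compose the tracked rotation with the constant correction.
        return trackedRotation * Quaternion.Euler(offsetEuler);
    }
}
```

Exposing the offset as Euler angles rather than a raw Quaternion makes it much easier to tweak interactively in the Inspector.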

Examples of InputDataProviders
OVRSkeletonTracker
RelativeSkeletonTracker

I hope this has been helpful to you :)

@f6ra07nk14
Author

Thank you for your prompt reply. :)

In brief, the simple way is that I just need to inherit the class InputDataProvider and re-implement the function UpdateData so that it provides the transform of each bone.
I already know how to convert the hand joints provided by MediaPipe Hand Tracking to the above ordered list of bones (Oculus version).

How does the camera (like MainCamera or ARCamera) work with HPTK? Or can I ignore this question?
In my case, my camera is fixed in one place, unlike the Oculus, which tracks the head-mounted display with 6 DoF.

@jorgejgnz
Owner

jorgejgnz commented Dec 29, 2020

The CoreModel component in the HPTKCore prefab contains a reference to the camera that represents the point of view (PoV) of the player. It is accessed by any AvatarController when its AvatarModel.followsCamera is enabled. AvatarController will move AvatarModel.headSight. This headSight transform and the transforms of both hands drive the rest of the parts of the body, including the shoulders. The shoulders are required to calculate the direction of HandModel.ray when ProxyHandModel.rays is enabled.

If you want to use hand rays or other body parts of the avatar in your app and the camera will represent the PoV of the player, then you should pass the camera to CoreModel.trackedCamera and make sure that AvatarModel.followsCamera is enabled.

If your camera won't follow the PoV of the player (mobile AR or desktop applications), leave CoreModel.trackedCamera empty and AvatarModel.followsCamera disabled.
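The two setups above could be wired up like this. The field paths (CoreModel.trackedCamera, AvatarModel.followsCamera) come from the explanation; the helper class and the assumption that trackedCamera is a Camera are illustrative:

```csharp
using UnityEngine;

// Sketch: choosing between head-tracked and fixed-camera setups for HPTK.
public class HptkCameraSetup : MonoBehaviour
{
    public CoreModel coreModel;         // from the HPTKCore prefab
    public AvatarModel avatarModel;
    public Camera playerCamera;         // e.g. MainCamera or ARCamera

    public bool cameraFollowsPlayerPoV; // false for fixed-camera mobile AR / desktop

    void Awake()
    {
        if (cameraFollowsPlayerPoV)
        {
            // Head-tracked PoV: the camera drives headSight, shoulders and hand rays.
            coreModel.trackedCamera = playerCamera;
            avatarModel.followsCamera = true;
        }
        else
        {
            // Fixed camera: leave trackedCamera empty and disable followsCamera.
            coreModel.trackedCamera = null;
            avatarModel.followsCamera = false;
        }
    }
}
```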

AvatarModel and AvatarController may receive changes in the future.

@f6ra07nk14
Author

Thanks! 👍
