Hand Physics Toolkit (HPTK) is a toolkit for implementing hand-driven interactions in a modular and scalable way. It is platform-independent, input-independent and scale-independent, and can be combined with MRTK-Quest for UI interactions.
You can clone a ready-to-go project at HPTK-Sample.
- Data model to access parts, components or calculated values with very little code (see the sketch after this list)
- Code architecture based on MVC-like modules. Support for custom modules
- Platform-independent. Tested on VR/AR/non-XR applications
- Input-independent. Use hand tracking or controllers
- Puppeteering for any avatar or body structure
- Scale-independent. Valid for any hand size
- Realistic configurable hand physics
- Define strategies to deal with tracking loss
- Physics-based hover/touch/grab detection
- Tracking noise smoothing
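As a rough illustration of the data model mentioned in the feature list, the sketch below shows how a script might read a calculated value such as pinch strength from one hand. The class and member names (`HPTKHand`, `index`, `pinchLerp`) are assumptions made for this example only; the actual model classes, fields and events are documented in the Wiki and the HPTK-Sample project.

```csharp
using UnityEngine;

// Illustrative sketch only. "HPTKHand" and its members are assumed names,
// not the verified HPTK API; see the Wiki for the real data model classes.
public class PinchLogger : MonoBehaviour
{
    // Reference to one hand's data model, assigned from the default setup in the scene.
    public HPTKHand hand;

    void Update()
    {
        if (hand == null) return;

        // Calculated values (e.g. pinch strength) are exposed directly by the model.
        float pinch = hand.index.pinchLerp;

        if (pinch > 0.9f)
            Debug.Log($"Index pinch: {pinch:0.00}");
    }
}
```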
- Unity 2020.x
- Unity 2019.x
- Oculus Quest 1/2 - Android
- Leap Motion - Standalone
- Hololens 2 - UWP
- Oculus Touch
- WMR
- Vive
- OpenVR
- Universal Render Pipeline (URP)
- Standard RP
- Obtain HPTK
- Change ProjectSettings & BuildSettings (see the sketch after these steps)
- Import the built-in integration package (if needed)
- Drag & drop the default setup into your scene
- Build and test
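For the ProjectSettings step above, physics-based hands generally need tighter physics settings than Unity's defaults. The snippet below is a sketch of applying such settings from code; the concrete values (fixed timestep, solver iterations) are placeholder assumptions here, and the values HPTK actually expects are listed in the Wiki.

```csharp
using UnityEngine;

// Sketch only: the exact values HPTK recommends are documented in the Wiki;
// the numbers below are placeholder assumptions.
public class PhysicsSettingsBootstrap : MonoBehaviour
{
    [SerializeField] float fixedTimestep = 1f / 72f;   // e.g. match the headset refresh rate
    [SerializeField] int solverIterations = 25;        // more iterations stabilize articulated joints
    [SerializeField] int solverVelocityIterations = 15;

    void Awake()
    {
        Time.fixedDeltaTime = fixedTimestep;
        Physics.defaultSolverIterations = solverIterations;
        Physics.defaultSolverVelocityIterations = solverVelocityIterations;
    }
}
```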
Check the Wiki for a detailed step-by-step guide.
Jorge Juan González - HCI Researcher at I3A (University of Castilla-La Mancha)
Oxters Wyzgowski - GitHub - Twitter
Michael Stevenson - GitHub
Nasim, K., Kim, Y. J. Physics-based assistive grasping for robust object manipulation in virtual reality. Comput Anim Virtual Worlds. 2018; 29:e1820. https://doi.org/10.1002/cav.1820
Linn, Allison. Talking with your hands: How Microsoft researchers are moving beyond keyboard and mouse. The AI Blog, Microsoft, 2016. https://blogs.microsoft.com/