
More hand visualization options #7

Closed
maluoi opened this issue Jul 31, 2019 · 4 comments
Labels
feature Related to new feature work

Comments

@maluoi
Collaborator

maluoi commented Jul 31, 2019

  • Toggle hand visualization on/off.
  • Let user swap to user defined materials.
  • User defined meshes too? (see the rough API sketch below)
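
A purely hypothetical sketch of what that surface could look like (these names and signatures are placeholders for discussion, not existing StereoKit calls):

```cpp
// Hypothetical API sketch for the three items above -- names and signatures
// are placeholders, not an existing StereoKit surface.
#include "stereokit.h"
using namespace sk;

void customize_hands() {
	// 1. Toggle the built-in hand visualization off entirely.
	input_hand_visible(handed_right, false);

	// 2. Or keep it, but swap in a user defined material.
	material_t hand_mat = material_find("app/hand_material"); // hypothetical asset id
	input_hand_material(handed_right, hand_mat);

	// 3. Maybe later: replace the generated geometry with a user defined mesh?
	// input_hand_mesh(handed_right, my_hand_mesh);
}
```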
maluoi added the feature label Jul 31, 2019
maluoi added this to Todo in StereoKit v0.1 via automation Jul 31, 2019
@DirkSonguer
Contributor

What I love about the MRTK default (editor) hands is that you can see the joints. The same is true for the Leap Motion standard visualisation. Maybe something simple like rendering spheres at the joint positions and connecting them with simple geometry?
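
Something along these lines, as a rough sketch (this assumes hand data laid out as 5 fingers × 5 joints, each with a position and radius; the exact names here are assumptions, not a guaranteed API):

```cpp
// Rough sketch of the sphere-per-joint idea. Assumes hand data laid out as
// 5 fingers x 5 joints, each with a position and radius; exact names are
// assumptions rather than a guaranteed API.
#include "stereokit.h"
using namespace sk;

mesh_t     joint_mesh;
material_t joint_mat;

void joint_vis_init() {
	joint_mesh = mesh_gen_sphere(1); // unit sphere, scaled per joint below
	joint_mat  = material_find("default/material");
}

void joint_vis_step() {
	const hand_t *hand = input_hand(handed_right);
	if (!(hand->tracked_state & button_state_active)) return;

	for (int f = 0; f < 5; f++) {
	for (int j = 0; j < 5; j++) {
		const hand_joint_t &joint = hand->fingers[f][j];
		// A small sphere at each joint, sized by the joint's radius...
		render_add_mesh(joint_mesh, joint_mat,
			matrix_trs(joint.position, quat_identity, vec3_one * joint.radius * 2));
		// ...and connecting geometry (lines/capsules) between joints would go here.
	}
	}
}
```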

@maluoi
Collaborator Author

maluoi commented Aug 2, 2019

I've never been a big fan of those debug visualizations; they always strike me as a little heavy on the 'developer placeholder' vibe. Maybe Cat Explorer style hands if we add anything in that direction?
http://blog.leapmotion.com/designing-cat-explorer/

maluoi added a commit that referenced this issue Aug 2, 2019
@DirkSonguer
Contributor

Well, currently the hand visualisation is exactly that: a visualisation. It isn't really tracking the hand or fingers per se, so at some point it either needs to be rewritten into a more general input system that understands (different types of) controllers vs. individually tracked hands and fingers, or it stays an artistic abstraction of that, in which case it's fine to take a more creative view on it.

@maluoi
Collaborator Author

maluoi commented Aug 2, 2019

The hand setup I have should work pretty well when hand tracking APIs are added in. The current hand poses are even recorded straight from my Leap Motion! What's present at the moment is a simulation of hand pose data that's blended based on controller events, so it's as close to real hands as you'll get without an actual hand sensor.
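
As a rough illustration of that blend (the recorded pose arrays here are hypothetical placeholders, not shipped data):

```cpp
// Illustrative sketch of the pose blend described above: two recorded joint
// poses (open hand and gripping hand) blended by the controller's trigger
// value. The recorded pose arrays are hypothetical placeholders.
#include "stereokit.h"
using namespace sk;

// Hypothetical poses captured ahead of time, e.g. from a Leap Motion:
// 5 fingers x 5 joints each.
extern hand_joint_t pose_open[5][5];
extern hand_joint_t pose_grip[5][5];

void blend_hand_pose(float trigger, hand_joint_t out_pose[5][5]) {
	// trigger runs 0 (released) .. 1 (fully squeezed)
	for (int f = 0; f < 5; f++) {
	for (int j = 0; j < 5; j++) {
		out_pose[f][j].position    = vec3_lerp (pose_open[f][j].position,    pose_grip[f][j].position,    trigger);
		out_pose[f][j].orientation = quat_slerp(pose_open[f][j].orientation, pose_grip[f][j].orientation, trigger);
		out_pose[f][j].radius      = pose_open[f][j].radius; // joint size doesn't change with grip
	}
	}
}
```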

Ultimately, I want to be able to swap in a skinned mesh, so people can do whatever they want for their hands! Skinned meshes are tricky though, so I'm not keen on doing that right away. But in the meantime, it's also super helpful to have a couple of really easy, good-looking options people can start customizing immediately! In many cases, people might not even need more than that for their applications :)

When it comes to hands vs. controllers, I'm still settling my thoughts, but right now I'm mostly focusing on controllers as a simulation of hands! I think hands are suuuper important for general MR input, so they're the bits to focus on and get right first. Controllers are more like specialized tools, so I think I'd want to treat them pretty differently.
