
Allow pushing user interface buttons without a hand controller button click #15

Open
tomgoddard opened this issue Oct 14, 2023 · 0 comments

Comments

@tomgoddard
Contributor

Currently the user touches a button so it highlights blue, then clicks the hand controller trigger to press it. I did it this way to avoid accidentally pressing buttons, but it is not very intuitive. It might be better if simply pressing and releasing the button were sufficient: the button would have to be poked and then unpoked, and requiring the release would help ensure that buttons are not pressed accidentally. This needs to be tried to see if it is easy, reliable, and more intuitive.
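A minimal sketch of that press-on-release logic, independent of any particular VR toolkit (the class, callback, and the fingertip-depth input here are hypothetical, just to illustrate the poke/unpoke state machine):

```python
# Sketch of press-on-release button logic (hypothetical names, not an existing API).
# The button fires only after the fingertip has poked past its surface and
# then withdrawn, which helps avoid accidental presses from a brief touch.

class PokeButton:
    def __init__(self, on_press, poke_depth=0.005):
        self.on_press = on_press      # callback invoked after poke + release
        self.poke_depth = poke_depth  # meters the fingertip must push past the surface
        self._poked = False

    def update(self, fingertip_depth):
        '''fingertip_depth: signed distance of the fingertip past the button
        surface in meters (positive = pushed in). Call once per frame.'''
        if not self._poked:
            if fingertip_depth > self.poke_depth:
                self._poked = True    # button is held in; highlight it here
        else:
            if fingertip_depth <= 0:
                self._poked = False   # finger withdrew: now trigger the press
                self.on_press()

# Example with simulated fingertip depths; prints "pressed" exactly once,
# when the finger pulls back out after pushing past the poke threshold.
button = PokeButton(on_press=lambda: print("pressed"))
for depth in (0.0, 0.003, 0.008, 0.006, -0.001):
    button.update(depth)
```

The same update loop would work whether the fingertip position comes from a tracked controller or from hand tracking, which matters for the next point.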

Besides making the user interface interaction more obvious, this method of pressing buttons could work the same way with hand gestures, without controllers. Both Meta Quest 3 and Apple Vision Pro are promoting hands-only interaction.

The push-and-release interaction also paves the way to having panels sit on the user's arm or on physical room surfaces found by depth mapping, which would give the user actual tactile feedback from touching a solid surface to activate a button.
