
Eye Roll/Pitch/Yaw from IrisTracking? #1030

Open
mattycorbett opened this issue Oct 3, 2023 · 1 comment
Labels
type:support Support issue

Comments

@mattycorbett

mattycorbett commented Oct 3, 2023

Plugin Version or Commit ID

0.12.0

Unity Version

2022.2.19

Your Host OS

Windows 11

Target Platform

Android

Description

I have the IrisTracking Solution running without issue. However, I'm trying to estimate gaze based on the output of the FaceLandmarks, specifically the landmarks that correspond to the detected iris. Does anyone have such an example? I cannot find one. I know it's possible, as Google's advertisements show this data being used to direct the eyes of avatars, as shown here. If BlendShapes would help, how do I enable them in the config? I can't seem to find where to do this (or where to set minimum detection confidences, etc.).
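
For context, here is roughly what I'm attempting: express each iris center as a normalized position between the eye corners and map that ratio to a yaw angle. This is only a sketch; the landmark indices (468/473 for the iris centers, 33/133 and 362/263 for the eye corners, as commonly cited for the 478-point refined Face Mesh) and the linear ratio-to-angle mapping are my own assumptions, not confirmed plugin API:

```csharp
using UnityEngine;

// Sketch only: estimate eye yaw from the iris center's position between the
// eye corners. Landmark indices are my assumption from the refined
// (478-point) Face Mesh layout:
//   468 = left iris center, 473 = right iris center,
//   33/133 = left eye corners, 362/263 = right eye corners.
public static class GazeFromIris
{
    public static float EstimateYawDegrees(Vector2[] lm, float maxYawDeg = 30f)
    {
        // Ratio is 0 at one corner, 1 at the other, 0.5 when centered.
        float left  = Ratio(lm[468], lm[33],  lm[133]);
        float right = Ratio(lm[473], lm[362], lm[263]);
        float centered = 0.5f * (left + right) - 0.5f; // -0.5 .. +0.5
        return centered * 2f * maxYawDeg;              // assumed linear mapping
    }

    static float Ratio(Vector2 iris, Vector2 cornerA, Vector2 cornerB)
    {
        Vector2 eye = cornerB - cornerA;
        // Project the iris onto the corner-to-corner axis and normalize.
        return Vector2.Dot(iris - cornerA, eye) / eye.sqrMagnitude;
    }
}
```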

Code to Reproduce the issue

No response

Additional Context

No response

@mattycorbett mattycorbett added the type:support Support issue label Oct 3, 2023
@HarryJCooper

Hi Matty!

I'm actually working on the same thing. Did you make much progress? I've tried a number of approaches:

  • Placing spheres directly behind the eyes, updating their position relative to nearby landmarks, and casting a ray out through the pupil that collides with a screen, then converting that collision point into UI space (rough sketch after this list).
  • Getting the pupil's distance from the outer eye point, rotating the pupil point relative to that, and casting a ray.
  • Taking the ratio between the distance of the left pupil from the left eye corner and the distance of the right pupil from the right eye corner.
  • Some variants on the above.
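
To make the first approach concrete, this is the shape of what I tried. All of the names here are placeholders, not plugin API, and the anchor offset and max ray distance were values I tuned by hand:

```csharp
using UnityEngine;

// Sketch of the sphere-behind-the-eye approach: an anchor sits behind the
// eye, a ray is cast from it through the pupil landmark, and the hit point
// on a screen collider becomes the gaze point.
public class EyeRaycastGaze : MonoBehaviour
{
    public Transform eyeAnchor; // sphere behind the eye, repositioned each frame
    public Transform pupil;     // game object driven by the iris landmark
    public Collider screen;     // collider representing the display plane

    public bool TryGetGazePoint(out Vector3 gazePoint)
    {
        var ray = new Ray(eyeAnchor.position, pupil.position - eyeAnchor.position);
        if (screen.Raycast(ray, out RaycastHit hit, 10f))
        {
            gazePoint = hit.point; // convert to UI space from here
            return true;
        }
        gazePoint = default;
        return false;
    }
}
```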

Ultimately, I have found that the output from the FaceLandmarks is too variable to be reliable with any calibration, and it also drifts over time within the same session. So I've landed on a gestural system for looking left and looking right, switching the 'gaze point' between different game objects, but this isn't anywhere near the fidelity I'd hoped for.
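
In case it helps, the gestural fallback is basically a debounced threshold on the horizontal iris ratio (e.g. the iris center projected onto the eye-corner axis, 0.5 = centered). The threshold and hold time below are values I hand-tuned, nothing principled:

```csharp
// Sketch of the gestural fallback: treat a sustained offset of the horizontal
// iris ratio as a discrete look-left / look-right gesture instead of a
// continuous gaze point.
public sealed class GazeGestureDetector
{
    readonly float _threshold;   // deviation from 0.5 that counts as a look
    readonly float _holdSeconds; // debounce against landmark jitter
    float _heldFor;
    int _direction;              // -1 = left, 0 = center, +1 = right

    public GazeGestureDetector(float threshold = 0.12f, float holdSeconds = 0.3f)
    {
        _threshold = threshold;
        _holdSeconds = holdSeconds;
    }

    // ratio: horizontal iris ratio in 0..1 (0.5 = centered); dt: frame delta.
    // Returns -1/+1 once the gaze has been held past the debounce window, else 0.
    public int Step(float ratio, float dt)
    {
        int dir = ratio < 0.5f - _threshold ? -1 : (ratio > 0.5f + _threshold ? 1 : 0);
        _heldFor = dir == _direction ? _heldFor + dt : 0f;
        _direction = dir;
        return _heldFor >= _holdSeconds ? dir : 0;
    }
}
```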

Would love to get your thoughts/know what you tried.

Thanks
Harry
