Gaze tracking is not accurate #2
Comments
Hey @begies-projects, so, if I understand it correctly, you have successfully calibrated your camera and now want to try out the demo.
Do you mean in the 3D visualization of the environment? Is your face plotted in the 3D visualization? The ray intersecting with the screen in the 3D visualization should correspond directly with the laser pointer. See gaze-tracking-pipeline/utils.py, line 9 in be33d7e.
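For context, the intersection of the gaze ray with the screen plane is essentially a standard ray–plane test. Below is a minimal, hypothetical sketch of that idea (not the repo's actual `utils.py` code; all names and the coordinate convention are assumptions for illustration):

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where a gaze ray hits the screen plane, or None.

    origin/direction describe the gaze ray (eye position and gaze vector);
    plane_point/plane_normal describe the screen plane.
    """
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # ray is (nearly) parallel to the screen plane
    t = np.dot(plane_normal, plane_point - origin) / denom
    if t < 0:
        return None  # intersection lies behind the eye
    return origin + t * direction

# Example: eye 40 cm in front of a screen at z = 0, looking straight ahead
hit = ray_plane_intersection(
    origin=np.array([0.0, 0.0, 0.4]),
    direction=np.array([0.0, 0.0, -1.0]),
    plane_point=np.zeros(3),
    plane_normal=np.array([0.0, 0.0, 1.0]),
)
```

The resulting 3D point can then be mapped into screen-pixel coordinates to drive the laser-pointer overlay.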
That sounds like the model always predicts the same values. Maybe log the predictions here and see if there is something wrong. You can also visualize the preprocessed images with the
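A quick way to check whether the model has collapsed to a constant output is to log the spread of its predictions over a batch of frames. This is a hypothetical sketch of that check, not code from the repository:

```python
import numpy as np

def log_prediction_stats(predictions):
    """Print per-axis mean and std of predicted screen points.

    A std near zero means the model is returning (almost) the same
    point for every input frame, i.e. it has collapsed.
    """
    preds = np.asarray(predictions, dtype=float)
    mean, std = preds.mean(axis=0), preds.std(axis=0)
    print(f"mean={mean}, std={std}")
    return std

# Simulated collapsed output: every frame maps to the screen center
collapsed = [[0.5, 0.5]] * 100
std = log_prediction_stats(collapsed)
if np.all(std < 1e-3):
    print("Warning: model predicts a near-constant value")
```

If the std is healthy but the on-screen pointer still sits in the center, the problem is more likely in the screen-size or coordinate mapping than in the model itself.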
It could be, but even without calibration the model should predict the rough direction. If you collect calibration data (images of the face + the actual point on the screen), you could also visualize this data to make sure that the setup is correct. Have a look at this repository, where I have already written the necessary code.
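One simple sanity check on collected calibration data is whether the target points actually cover the whole screen rather than clustering in the center. A minimal, hypothetical sketch (function name, grid size, and the 1920x1080 screen are assumptions, not part of the repo):

```python
from collections import Counter

def calibration_coverage(points, screen_w, screen_h, grid=3):
    """Bucket calibration targets into a grid x grid layout over the screen
    so sparsely covered regions (e.g. the corners) stand out."""
    counts = Counter()
    for x, y in points:
        col = min(int(x / screen_w * grid), grid - 1)
        row = min(int(y / screen_h * grid), grid - 1)
        counts[(row, col)] += 1
    return counts

# Three example targets on a hypothetical 1920x1080 screen:
# top-left, center, bottom-right
points = [(100, 100), (960, 540), (1800, 1000)]
coverage = calibration_coverage(points, 1920, 1080)
```

Cells with a count of zero indicate screen regions the calibration never sampled, which tends to hurt accuracy exactly there.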
Hello @pperle, I ran the
On the main screen with the red laser pointer I can only see a part of the screen, so I ran
Another thing is that, as you mentioned, the model mostly predicts only the central values, so the red laser is in the center almost all the time. I looked at the gaze calibration data collection and even implemented it, but it cannot complete the data collection: I looked at the black screen with the letter E and pressed the arrow keys for almost 10 minutes, and it still didn't finish. Is that normal? Are there any suggestions on how I can improve the model output/predictions so that I can get at least the relative points on the screen?
@begaiym-k did you figure this out?
@begaiym-k Could you solve this issue?
@emrecolak55, @hshahid no, unfortunately I couldn't make the software fully functional.
Hi! First of all, thanks for publishing your work, it is really helpful! I have been working on my graduation project, which is very similar to yours, and I wanted to run your program to get some idea of how it works.
I was able to calibrate and get the YAML file, then I ran main.py and manually entered the screen sizes. However, when I run it, the screen appears to be way smaller than the actual screen (playing with the numbers didn't help), the laser doesn't appear, and it shows the red line mostly in the center; only when I move my head does the line start moving towards the side my head moved.
Can it be because of the calibration? Or what else might cause it? How did it work for you? Can you please share more details?
Thank you!