
Visual Eyes

Powered by Azure Cognitive Services, ARKit2, Firebase and React

Eye-tracking and face analysis in AR for recording users' focal areas and predicting users' demographics and emotions when reading marketing materials. Analysis data is dynamically stored and visualized. Proof of concept built at BizHacks.

First Place, Best Buy Prize winner out of 400+ participants.

Demo video: Visual.Eyes.Demo.mp4

Web app screenshot

Architecture

The iOS app tracks the user's eyes, calculates the user's focal area, and displays it on screen. It also takes a facial snapshot of the user every 2 seconds and uses the Azure face analysis APIs from Cognitive Services to predict the user's age, gender, and emotion over a set period. All user data generated by the app is uploaded to Firebase in real time, grouped by unique usernames and upload times.
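The app implements this pipeline in Swift with ARKit; since the Face API is a plain REST endpoint and Firebase also has a web SDK, here is a minimal TypeScript sketch of the face-analysis call and the per-snapshot upload, kept in the same language as the web example further below. The endpoint, key, and the sessions/<username>/<timestamp> schema are placeholder assumptions, not the project's actual configuration.

```typescript
import { initializeApp } from "firebase/app";
import { getDatabase, ref, set } from "firebase/database";

// Placeholder Azure Face endpoint and key -- not the project's real values.
const FACE_ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com";
const FACE_KEY = "<subscription-key>";

// Ask the Face API for age, gender, and emotion for one facial snapshot.
async function analyzeSnapshot(image: ArrayBuffer) {
  const res = await fetch(
    `${FACE_ENDPOINT}/face/v1.0/detect?returnFaceAttributes=age,gender,emotion`,
    {
      method: "POST",
      headers: {
        "Ocp-Apim-Subscription-Key": FACE_KEY,
        "Content-Type": "application/octet-stream",
      },
      body: image,
    }
  );
  const faces = await res.json();
  // Each detected face carries faceAttributes: { age, gender, emotion: { happiness, ... } }
  return faces[0]?.faceAttributes;
}

// Upload one analyzed snapshot, grouped by username and upload time
// (hypothetical schema: /sessions/<username>/<timestamp>).
const app = initializeApp({ databaseURL: "https://<project>.firebaseio.com" });
const db = getDatabase(app);

async function uploadSnapshot(username: string, image: ArrayBuffer) {
  const attributes = await analyzeSnapshot(image);
  if (!attributes) return;
  await set(ref(db, `sessions/${username}/${Date.now()}`), attributes);
}
```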

In addition, a client website demo serves as a data visualization tool, displaying accumulated user emotions as a pie chart. It is built with React and JavaScript; it reads data from Firebase and renders it as graphs with Plotly.
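A minimal sketch of such a dashboard component, assuming the react-plotly.js wrapper and the hypothetical sessions/<username> schema from the sketch above; component and field names are illustrative rather than the project's actual code.

```tsx
import React, { useEffect, useState } from "react";
import Plot from "react-plotly.js";
import { getDatabase, ref, onValue } from "firebase/database";

// Accumulate emotion scores across all snapshots for one user and
// render them as a Plotly pie chart.
export function EmotionPie({ username }: { username: string }) {
  const [totals, setTotals] = useState<Record<string, number>>({});

  useEffect(() => {
    const db = getDatabase();
    // Listen for the user's snapshots (hypothetical /sessions/<username> path).
    return onValue(ref(db, `sessions/${username}`), (snap) => {
      const sums: Record<string, number> = {};
      snap.forEach((child) => {
        const emotion = child.val()?.emotion ?? {};
        for (const [name, score] of Object.entries(emotion)) {
          sums[name] = (sums[name] ?? 0) + (score as number);
        }
      });
      setTotals(sums);
    });
  }, [username]);

  return (
    <Plot
      data={[{ type: "pie", labels: Object.keys(totals), values: Object.values(totals) }]}
      layout={{ title: "Accumulated emotions" }}
    />
  );
}
```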

Team Photos

