Grounded in research on real-time physiological sensor data.
Helps a user recover from a panic attack by guiding them back to a state of calmness through protocols such as visual feedback and binaural beats.
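One of the calming protocols mentioned above is binaural beats: each ear receives a sine tone at a slightly different frequency, and the brain perceives the difference as a slow pulse. The sketch below is a minimal, generic illustration of generating such audio (the function names, carrier frequency, and 8 Hz beat are assumptions for illustration, not the project's actual parameters):

```python
import numpy as np
import wave

def binaural_beat(carrier_hz=220.0, beat_hz=8.0, seconds=5, rate=44100):
    """Generate a stereo binaural beat: the left ear hears the carrier,
    the right ear hears carrier + beat_hz; the listener perceives the
    difference (8 Hz here, in the alpha/theta relaxation range)."""
    t = np.linspace(0, seconds, int(rate * seconds), endpoint=False)
    left = np.sin(2 * np.pi * carrier_hz * t)
    right = np.sin(2 * np.pi * (carrier_hz + beat_hz) * t)
    stereo = np.stack([left, right], axis=1)        # shape (n_samples, 2)
    return (stereo * 32767).astype(np.int16)        # 16-bit PCM range

def write_wav(path, samples, rate=44100):
    """Write interleaved stereo int16 samples as a standard WAV file."""
    with wave.open(path, "wb") as f:
        f.setnchannels(2)
        f.setsampwidth(2)   # 2 bytes = 16-bit samples
        f.setframerate(rate)
        f.writeframes(samples.tobytes())

write_wav("calm.wav", binaural_beat())
```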
We used OpenBCI equipment to read EEG data from the user, a pulse sensor to capture PPG data, and a Raspberry Pi running computer vision to estimate how tense the user is. Combining these data points, we created a framework that determines the user's state in real time and whether or not they are panicking.
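The fusion step described above could be sketched as a simple weighted score over the three normalised signals. This is a minimal illustration only: the field names, weights, and thresholds below are hypothetical placeholders, not the project's tuned values:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    eeg_beta_alpha: float   # EEG beta/alpha band-power ratio (arousal proxy)
    heart_rate_bpm: float   # heart rate derived from the PPG pulse sensor
    tension: float          # 0..1 tension estimate from the CV pipeline

def panic_score(r: Reading) -> float:
    """Blend the three signals, each normalised to 0..1, into one score.
    Weights and normalisation ranges are illustrative assumptions."""
    eeg = min(r.eeg_beta_alpha / 3.0, 1.0)                   # ratio >= 3 treated as high arousal
    hr = min(max((r.heart_rate_bpm - 60) / 60, 0.0), 1.0)    # map 60..120 bpm onto 0..1
    return 0.4 * eeg + 0.35 * hr + 0.25 * r.tension

def is_panicked(r: Reading, threshold: float = 0.6) -> bool:
    """Binary panicked/calm decision from the fused score."""
    return panic_score(r) > threshold
```

A real system would replace the fixed weights with a model trained on labelled sessions, but the structure, several noisy sensors distilled into one decision, is the same.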
We were unable to stream OpenBCI data directly onto the Meta Quest, because OpenBCI requires a desktop application to run.
Distilling complex brain-signal, pulse, and video data into simple, easy-to-understand signals.
Expanding to more sensor data, learning which protocols work best, and personalising the experience for each user.