
salmanmkc/MIT_Hackathon_2025_EmotionalDistress_OpenBCI_Qualcomm_BrainComputerInterface


Inspiration

Research into using sensor data to detect emotional distress.

What it does

Helps a user recover from a panic attack by running calming protocols, such as visual feedback and binaural beats, that guide them back to a state of calm.
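The binaural-beats technique mentioned above can be sketched in a few lines: play a slightly different frequency into each ear, and the brain perceives a beat at the difference frequency. This is a minimal illustration, not the project's actual audio code; the frequencies and sample rate are assumed defaults.

```python
import numpy as np

def binaural_beat(base_hz=200.0, beat_hz=10.0, seconds=1.0, rate=44100):
    """Generate stereo samples: left ear at base_hz, right ear at
    base_hz + beat_hz, producing a perceived beat at beat_hz.
    Returns an array of shape (num_samples, 2) in [-1, 1]."""
    t = np.linspace(0.0, seconds, int(rate * seconds), endpoint=False)
    left = np.sin(2 * np.pi * base_hz * t)
    right = np.sin(2 * np.pi * (base_hz + beat_hz) * t)
    return np.stack([left, right], axis=1)
```

A 10 Hz difference sits in the alpha band, often associated with relaxation, which is why it is a common choice for calming audio.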

How we built it

We used OpenBCI equipment to read EEG data from the user, a pulse sensor to capture PPG data, and a Raspberry Pi running computer vision to estimate how tense the user is. Combining these data streams, we built a framework that determines the user's state in real time and whether they are panicked.
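The fusion step described above can be sketched as a weighted combination of the three sensor streams. The feature names, weights, and threshold below are illustrative assumptions, not the team's actual model.

```python
def panic_score(eeg_arousal, heart_rate_bpm, tension):
    """Combine normalised EEG arousal (0-1), PPG heart rate (bpm),
    and camera-derived tension (0-1) into a single 0-1 panic score.
    Weights are hypothetical placeholders."""
    # Map a 60-120 bpm heart rate onto 0-1, clamping outside that range.
    hr_norm = min(max((heart_rate_bpm - 60.0) / 60.0, 0.0), 1.0)
    return 0.4 * eeg_arousal + 0.35 * hr_norm + 0.25 * tension

def is_panicked(score, threshold=0.6):
    """Classify the fused score against a fixed (assumed) threshold."""
    return score >= threshold
```

In practice the weights and threshold would be tuned per user, which is one reason personalised experiences appear under "What's next".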

Challenges we ran into

We were unable to stream OpenBCI data directly onto the Meta Quest, because OpenBCI requires a desktop application to run.

Accomplishments that we're proud of

Distilling complex brain-signal, pulse, and video data into simple, easy-to-understand signals.

What we learned

What's next for Panic Button

Expanding with more sensor data, learning which protocols work best, and personalising the experience.

About

MIT RealityHack
