# Ocean Daze

*An immersive VR ocean exploration experience with biometric interaction using EEG and EMG signals.*

## Overview

Ocean Daze is a Unity-based virtual reality project that combines underwater exploration with real-time biometric feedback. Created during the 2025 Hack1Robo Hackathon, this project allows users to interact with a virtual ocean environment using brain (EEG) and muscle (EMG) signals through Lab Streaming Layer (LSL) integration.
## Features

- Immersive VR Environment: Explore a beautiful underwater world with realistic water shaders and effects
- Biometric Interaction: Control aspects of the experience using EEG (blink detection, jaw clench) and EMG signals
- Real-time Signal Processing: Python-based LSL streams for live physiological data integration
- Cross-Platform XR Support: Compatible with various XR devices including Android XR and OpenXR platforms
- Marine Life: Interact with underwater creatures and environments
- Advanced Water Effects: Featuring underwater post-processing and realistic water shaders
## Tech Stack

### Unity Packages

- Unity Version: 2023.x (Universal Render Pipeline)
- XR Interaction Toolkit: 3.2.1
- XR Hands: 1.6.1
- Lab Streaming Layer for Unity: LSL4Unity integration
- Universal Render Pipeline (URP): 17.0.3
- OpenXR: 1.15.1
- Android XR OpenXR: 1.0.1

### Python Dependencies

- pylsl: Lab Streaming Layer Python interface
- numpy: Numerical computing
- scipy: Signal processing (bandpass filtering, Welch's method)
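scipy's role in the pipeline is bandpass filtering and spectral estimation via Welch's method. The sketch below shows that style of preprocessing on a synthetic signal; the sampling rate, band edges, and filter order are illustrative assumptions, not values taken from the project scripts:

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 250.0              # assumed sampling rate in Hz (hardware-dependent)
LOW, HIGH = 1.0, 30.0   # illustrative EEG band edges in Hz

def bandpass(x, fs=FS, low=LOW, high=HIGH, order=4):
    """Zero-phase Butterworth bandpass to isolate the band of interest."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, x)

# Synthetic 4 s recording: a 10 Hz sine buried in broadband noise.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

filtered = bandpass(raw)
freqs, psd = welch(filtered, fs=FS, nperseg=256)
peak_hz = freqs[np.argmax(psd)]   # dominant frequency, near 10 Hz here
```

In the real scripts, the same kind of filter would run on live LSL chunks rather than a precomputed array.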
## Prerequisites

### Unity Development

- Unity 2023.x or later
- Universal Render Pipeline (URP) support
- XR-compatible VR headset (Meta Quest, Pico, or another OpenXR-compatible device)
- Windows, macOS, or Linux development environment

### Biometric Setup

- Python 3.7 or later
- Lab Streaming Layer (liblsl)
- EEG/EMG hardware compatible with LSL (e.g., OpenBCI, Muse, or similar)
- Required Python packages (see Installation)
## Installation

### Unity Project Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/OceanDaze.git
   cd OceanDaze
   ```

2. Open in Unity:
   - Open Unity Hub
   - Click "Add" and select the `OceanDaze` folder
   - Open the project with Unity 2023.x or later

3. Install dependencies:
   - Unity will automatically resolve package dependencies from `Packages/manifest.json`
   - Wait for package installation to complete

4. Open the main scene:
   - Navigate to `Assets/Scenes/`
   - Open `BasicScene.unity` or `SampleScene.unity`
### Python Environment Setup

1. Navigate to the Python scripts directory:

   ```bash
   cd LSL_Ocean_Daze
   ```

2. Create a virtual environment (recommended):

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install required packages:

   ```bash
   pip install pylsl numpy scipy
   ```
## Usage

### Running the VR Experience

1. Connect your VR headset:
   - Ensure your XR device is properly connected and recognized
   - Enable developer mode if required (e.g., for Meta Quest devices)

2. Configure XR settings:
   - Go to `Edit > Project Settings > XR Plug-in Management`
   - Enable your target platform's XR plugin

3. Run the scene:
   - Click the Play button in the Unity Editor, or
   - Build and deploy to your XR device
### Running the Biometric Interface

1. Set up your EEG/EMG hardware:
   - Connect your biometric sensors
   - Ensure they are streaming data via LSL

2. Run the appropriate Python script:

   ```bash
   python eeg_blink_jaw_updated.py    # EEG blink and jaw clench detection
   python Emg_detection_blinks.py     # EMG blink detection
   python emg_dur.py                  # EMG duration tracking
   python plotting_signal.py          # Signal visualization
   ```

3. Start the Unity application:
   - The Unity application will automatically detect and connect to LSL streams
   - Biometric events will trigger interactions in the VR environment
## Project Structure

```
OceanDaze/
├── Assets/
│   ├── Hack1robo/                    # Main project assets
│   │   ├── Audio/                    # Sound effects
│   │   ├── Creatures/                # Marine life models
│   │   ├── Models/                   # 3D environment models
│   │   ├── Prefabs/                  # Reusable game objects
│   │   ├── Scenes/                   # Unity scenes
│   │   ├── Scripts/                  # C# gameplay scripts
│   │   ├── Terrains/                 # Underwater terrain assets
│   │   └── Textures/                 # Visual textures
│   ├── IgniteCoders/                 # Water shader system
│   ├── Paro222/                      # Underwater effects
│   ├── Samples/                      # Sample scenes and tutorials
│   ├── Scenes/                       # Main application scenes
│   ├── VRTemplateAssets/             # VR interaction templates
│   └── XR/                           # XR configuration
├── LSL_Ocean_Daze/                   # Python LSL scripts
│   ├── eeg_blink_jaw_updated.py      # EEG processing
│   ├── Emg_detection_blinks.py       # EMG blink detection
│   ├── emg_dur.py                    # EMG duration tracking
│   └── plotting_signal.py            # Signal visualization
├── Packages/                         # Unity package dependencies
├── ProjectSettings/                  # Unity project configuration
└── README.md
```
## Biometric Signal Processing

### Detected Events

- EEG Blinks: Frontal-electrode blink detection (Fp1/Fp2)
- Jaw Clenches: EMG detection from the jaw muscles
- EMG Events: General muscle activation patterns

### Processing Pipeline

- Bandpass Filtering: Removes noise and isolates the relevant frequency bands
- Threshold Detection: Identifies events based on statistical analysis
- Refractory Periods: Prevents false positives from rapid successive events
- LSL Streaming: Sends processed events to Unity in real time
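The threshold-plus-refractory step can be sketched in plain numpy. Here the trigger rule (mean + k·std) and the window lengths are illustrative assumptions rather than the exact parameters used in the repository scripts:

```python
import numpy as np

def detect_events(signal, fs, k=3.0, refractory_s=0.3):
    """Return sample indices where `signal` crosses mean + k*std,
    suppressing re-triggers inside the refractory window."""
    threshold = signal.mean() + k * signal.std()
    refractory = int(refractory_s * fs)
    events, last = [], -refractory
    for i, x in enumerate(signal):
        if x > threshold and i - last >= refractory:
            events.append(i)
            last = i
    return events

fs = 250
rng = np.random.default_rng(1)
sig = 0.05 * rng.standard_normal(fs * 4)   # 4 s of low-level baseline noise
sig[100:105] += 1.0                        # first "blink"-like burst
sig[600:605] += 1.0                        # second burst, outside refractory
print(detect_events(sig, fs))              # one index per burst: [100, 600]
```

In the real pipeline, each detected index would then be pushed to Unity as an LSL event, per the last step above.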
## Contributing

We welcome contributions! Please feel free to submit pull requests or open issues for bugs and feature requests.
## Team

- Oliwia Jankowska
- Max Chalabi
- Massimo Ruben
- Margot Fita
- Karolina Piwko
- Alexia Ilie
- Ervan Achirou
## License

Ocean Daze is free software distributed under the GNU General Public License v3.0 (or later).

Copyright © 2025 Oliwia Jankowska, Max Chalabi, Massimo Ruben, Margot Fita, Karolina Piwko, Alexia Ilie, Ervan Achirou.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
## Acknowledgments

- Created during the 2025 Hack1Robo Hackathon
- Built with Lab Streaming Layer
- Water shaders by IgniteCoders
- Underwater effects by Paro222
- Unity XR Interaction Toolkit samples
## Contact

For questions, issues, or feedback:
- Open an issue on GitHub
- Contact the development team through the repository
## Roadmap

- Additional biometric interaction modes
- More marine environments and creatures
- Multiplayer support
- Enhanced visualization of biometric data
- Additional VR platform support
- Procedural ocean generation
*Dive into the depths. Control with your mind. Experience Ocean Daze.*