This project creates a real-time music visualization and instrumentation system combined with an EEG-based brain-computer interface that:
- Visualizes audio files in Unity using particle systems and VFX graphs
- Empowers players to become musicians through hand-tracking instrumentation
- Integrates real-time EEG brain activity (concentration and relaxation levels) to drive gameplay mechanics
- Shows how brains can synchronize during shared interaction and exposure to musical stimulation
Each brainwave and musical note is represented as a colored particle, with the particle size corresponding to the note's amplitude/velocity. Musicians can play bass, snare drum, xylophone, tom drums, and cymbal instruments. The brain data is converted to a Unity VFX graph particle system which represents each brain wave with a different color and interaction.
Key Features:
- Real-time EEG brain-computer interface via MindRove hardware
- Multiplayer networking using RiptideNetworking
- Visual feedback through Unity VFX graph
- Hand-tracking instrumentation for interactive music creation
- WAV file visualization with particle systems
Players wear EEG headsets that measure their brainwave activity. The system computes concentration and relaxation levels from raw EEG data, decomposing it into the standard brainwave bands (alpha, beta, theta, gamma, delta) in real time and translating mental states into complex interactive particle systems.
The project, which connects in-person and virtual experiences, began with the goal of using an Arduino Uno Q as a MIDI controller to generate music data. The idea was to create a hardware interface that could send MIDI signals to a visualization system.
Unfortunately, the Arduino Uno Q proved inadequate for this purpose: the platform lacked the capabilities to:
- Generate complex musical patterns in real-time
- Output MIDI data in a format suitable for visualization
- Handle the processing requirements for music generation
To overcome these limitations, the project pivoted to using Sonic Pi, a code-based music creation and performance tool. Sonic Pi uses the Ruby programming language to generate music programmatically, offering:
- Real-time music synthesis
- Built-in MIDI output capabilities
- Flexible pattern generation
- Rich sound synthesis options
Sonic Pi generates both audio output (so you can hear the music) and MIDI data (for visualization purposes).
To transmit the MIDI data from Sonic Pi to Unity, the project initially used Open Sound Control (OSC), a protocol designed for networking sound synthesizers, computers, and other multimedia devices.
The system was configured to send OSC messages on port 12345, transmitting:
- Note information (pitch, velocity)
- Amplitude/volume data
- Timing information
- Musical event markers (beats, bass, melody, hi-hats)
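For illustration, the wire format of such a message can be sketched in a few lines. This is a minimal hand-rolled OSC encoder, not the project's actual sender (Sonic Pi handles OSC encoding internally), and the `/note` address and integer argument layout are assumptions for the example:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad bytes with NULs to a multiple of 4, as OSC requires."""
    return b + b"\x00" * ((-len(b)) % 4)

def encode_osc(address: str, *args) -> bytes:
    """Encode an OSC message: padded address, type-tag string, big-endian args."""
    msg = osc_pad(address.encode() + b"\x00")
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
    return msg + osc_pad(tags.encode() + b"\x00") + payload

# A hypothetical note event: MIDI pitch 60 (middle C), velocity 100.
packet = encode_osc("/note", 60, 100)
# The packet could then be sent over UDP to the visualizer, e.g.:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 12345))
```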
The next step in the solution was to leverage WAV files directly in Unity. This approach simplified the system by:
- Eliminating the need for external applications (Sonic Pi)
- Using Unity's built-in audio analysis capabilities
- Processing audio files directly within the Unity environment
- Making the WAV file the direct source of particle generation
Unity's audio system analyzes the WAV file in real-time, extracting frequency and amplitude data to drive the particle visualization system.
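As an illustrative sketch of this kind of analysis (in plain Python, outside Unity), one block of samples can be reduced to an amplitude value and a dominant frequency. The block size and the naive DFT here are for demonstration only; inside Unity, `AudioSource.GetSpectrumData()` provides the spectrum natively:

```python
import math, cmath

def analyze_block(samples, sample_rate):
    """Return (rms_amplitude, dominant_frequency) for one audio block,
    mirroring the amplitude/frequency data the visualizer consumes."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Naive DFT magnitude scan (fine for a short illustrative block).
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):
        acc = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                  for t in range(n))
        if abs(acc) > best_mag:
            best_bin, best_mag = k, abs(acc)
    return rms, best_bin * sample_rate / n

# A 440 Hz test tone sampled at 8 kHz.
sr = 8000
block = [math.sin(2 * math.pi * 440 * t / sr) for t in range(512)]
rms, freq = analyze_block(block, sr)
```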
The final system merged the WAV file-based visualization with hand-tracking instrumentation. This integration allows:
- Interactive music creation through hand movements
- Real-time sound generation based on hand gestures and positions
- Procedural sound synthesis triggered by hand tracking
- Combined visualization of both pre-recorded audio (WAV) and live hand-generated sounds
The hand-tracking component uses VR/AR hand tracking (Meta Quest 3, OpenXR) to detect hand movements and generate sounds, which are then visualized alongside the WAV file audio as particles.
The system was further enhanced with EEG brain-computer interface capabilities:
- Real-time brainwave analysis using MindRove hardware
- Conversion of brain activity to visual particle systems
- Multiplayer synchronization showing brain synchronicity between users
- Each brain wave type represented with different colors and interactions
The hand-tracking instrumentation components were developed in a separate repository focused on music-brain-computer interface (BCI) research. The development work, including project settings, configuration, and initial implementation, can be found in the MusicBCI repository.
Key Development Areas:
- Project Settings Configuration: Unity project settings and build configurations for VR/AR hand tracking
- Hand Tracking Integration: Initial development of hand tracking interfaces for Meta Quest 3 and OpenXR
- Procedural Sound Synthesis: Development of algorithms for generating sounds from hand movements
- BCI Integration Research: Exploration of brain-computer interface applications for music creation
The instrumentation components developed in that repository were then integrated into this project, where they work alongside the WAV file visualization system to create a combined audio-visual experience. The hand-tracking scripts in this repository (Quest3HandSoundController.cs, ProceduralHandSound.cs, SimpleHandSound.cs) are based on and evolved from the work done in the MusicBCI repository.
For more details on the instrumentation development process, project settings, and configuration, see the MusicBCI ProjectSettings directory.
Unity processes audio data (from WAV files and hand-tracking) and visualizes it using particle systems. The visualization system includes:
- Multiple C# scripts that process audio data and create visual effects
- Particle systems that respond to musical events in real-time
- Color mapping: Each note is represented by a particle color (mapped by pitch)
- Size mapping: The louder the note (higher amplitude/velocity), the bigger the particle
```
Between-US_EEG/
├── concentration/           # EEG analysis & visualization project
├── Build/                   # Compiled executables
├── Build02/                 # Alternative build
├── build3/                  # Functional build with data extraction and normalisation
├── Client_Between-US_EEG/   # Unity multiplayer client (FOR TESTING PURPOSES ONLY)
└── Server_Between-US_EEG/   # Unity multiplayer server (FOR TESTING PURPOSES ONLY)
```
```mermaid
flowchart TB
A[WAV Audio File<br/>Unity Assets] -->|Audio Analysis| B[Audio Processing<br/>Unity C# Scripts]
C[Hand Tracking<br/>VR/AR Input] -->|Hand Movements| D[Procedural Sound<br/>Generation]
E[EEG Headset<br/>MindRove] -->|Brain Signals| F[BrainWaveAnalyzer<br/>Signal Processing]
D -->|Audio Data| B
F -->|Brain Wave Data| G[VFX Graph<br/>Brain Visualization]
B --> H[Visualization Scripts<br/>MidiVisualizer<br/>Audio Visualizer]
H --> I[Particle Systems<br/>Unity Engine]
G --> I
I --> J[Visual Display<br/>3D Particles]
A -.->|Playback| K[Speakers<br/>Sound]
D -.->|Live Audio| K
style A fill:#e1f5ff
style B fill:#fff4e1
style C fill:#fce4ec
style D fill:#fce4ec
style E fill:#e8f5e9
style F fill:#e8f5e9
style G fill:#e8f5e9
style H fill:#fff4e1
style I fill:#e8f5e9
style J fill:#e8f5e9
style K fill:#f3e5f5
```
The project initially used Sonic Pi with OSC communication:
```mermaid
flowchart LR
A[Sonic Pi<br/>Ruby Music Generation] -->|OSC Messages<br/>Port 12345| B[OSC Receiver<br/>Unity C# Script]
B --> C[Visualization Scripts<br/>MidiVisualizer<br/>HipHopVisualizer]
C --> D[Particle Systems<br/>Unity Engine]
A -.->|Audio Output| E[Speakers<br/>Sound]
D --> F[Visual Display<br/>3D Particles]
style A fill:#e1f5ff
style B fill:#fff4e1
style C fill:#fff4e1
style D fill:#e8f5e9
style E fill:#fce4ec
style F fill:#e8f5e9
```
The final system uses WAV audio files directly in Unity as the primary source of music data:
- Audio files are placed in Unity's `Assets/StreamingAssets/` folder
- Unity's `AudioSource` component plays the WAV files
- Real-time audio analysis extracts frequency and amplitude data
- This data drives the particle visualization system
Key Features:
- No external applications required
- Direct audio file processing
- Real-time frequency analysis
- Amplitude-based particle generation
The system integrates hand-tracking for interactive music creation:
- VR/AR Hand Tracking: Supports Meta Quest 3 and OpenXR
- Procedural Sound Generation: Hand movements trigger sound synthesis
- Movement-Based Audio: Speed and position of hands affect sound parameters
- Gesture Recognition: Different hand gestures produce different sounds
Hand-Tracking Scripts:
- `Quest3HandSoundController.cs` - Meta Quest 3 hand tracking integration
- `ProceduralHandSound.cs` - Procedural sound generation from hand movements
- `SimpleHandSound.cs` - Basic hand tracking with Unity XR
Features:
- Real-time hand position tracking
- Velocity-based sound triggering
- Gesture-based sound selection
- Position-based sound zones
- 3D spatial audio
The system integrates EEG-based brain-computer interface using MindRove hardware:
Standalone project for EEG signal processing and visualization with VFX integration.
Key Components:
- BrainWaveAnalyzer.cs - Core signal processing engine
- mindroveTest02.cs - Real-time EEG visualization
- VFX Effects - Visual feedback based on brain states
The main multiplayer client application that players run.
Key Components:
- NetworkManager.cs - Singleton client networking using RiptideNetworking
- TCP/UDP hybrid messaging
- Configurable IP/port connection parameters
- Connection state management (Connected, Failed, Disconnected)
- UiManager.cs - UI controller for connection interface
- Username input handling
- Connection button functionality
- MindRove Integration - EEG data collection scripts
- Real-time streaming from MindRove WiFi board
- CSV data logging at 450kHz sampling rate
The dedicated server managing game sessions and player synchronization.
Key Components:
- NetworkManager.cs - Singleton server networking
- Tick-based update loop
- Configurable max client count
- Client connection management
- Player.cs - Server-side player representation
- Dictionary-based player list
- Message handlers for player data
Unity receives and visualizes audio data through several C# components:
- `MidiVisualizer.cs` - Main MIDI/audio visualization system
  - Creates particle effects based on audio data
  - Color-codes notes by pitch (red = low, blue = high)
  - Particle size based on velocity/amplitude
  - Supports real-time playback from WAV files
  - Processes audio from both WAV files and hand-generated sounds
- `HipHopVisualizer.cs` - Genre-specific visualizer
  - Responds to audio frequency bands
  - Separate particle systems for different frequency ranges
  - Color-coded by frequency/instrument type
- `OSCReceiver.cs` - (Historical) Receives OSC messages on port 12345
  - Used in the earlier Sonic Pi implementation
  - Parses OSC protocol messages
  - Thread-safe message queue
  - Event system for message distribution
Hand-Tracking Components:
A comprehensive hand-tracking controller specifically designed for Meta Quest 3, using the Oculus Interaction SDK. This is the most feature-rich implementation.
Key Features:
- Dual Hand Support: Independent tracking and audio for left and right hands
- Movement-Based Sound: Triggers sounds based on hand velocity (configurable min/max velocity)
- Dynamic Audio Parameters:
- Pitch modulation (0.8x to 1.2x) based on movement speed
- Volume scaling (0.3 to 1.0) based on movement speed
- Random sound clip selection from an array
- Gesture Detection Framework: Infrastructure for recognizing hand gestures (fist, open hand, pointing, thumbs up)
- Position-Based Sound Zones: Define spatial zones that trigger specific sounds when hands enter them
- 3D Spatial Audio: Full 3D audio positioning for immersive sound experience
- Cooldown System: Prevents sound spam with configurable cooldown periods
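The velocity-to-audio mapping described above can be sketched as follows. This is an illustrative Python model, not the C# implementation; the velocity thresholds and cooldown value are assumed defaults, while the pitch and volume ranges come from the feature list:

```python
import time

MIN_V, MAX_V = 0.2, 2.0        # velocity thresholds in m/s (assumed defaults)
PITCH_LO, PITCH_HI = 0.8, 1.2  # pitch modulation range from the feature list
VOL_LO, VOL_HI = 0.3, 1.0      # volume scaling range from the feature list
COOLDOWN = 0.25                # seconds between triggers (assumed)

_last_trigger = 0.0

def clamp01(x):
    return max(0.0, min(1.0, x))

def map_hand_speed(speed, now=None):
    """Map hand speed to (pitch, volume), or return None when the speed
    is below threshold or the cooldown window has not elapsed."""
    global _last_trigger
    now = time.monotonic() if now is None else now
    if speed < MIN_V or now - _last_trigger < COOLDOWN:
        return None
    _last_trigger = now
    t = clamp01((speed - MIN_V) / (MAX_V - MIN_V))
    return (PITCH_LO + t * (PITCH_HI - PITCH_LO),
            VOL_LO + t * (VOL_HI - VOL_LO))
```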
Dependencies:
- `Oculus.Interaction` namespace
- `Oculus.Interaction.Input` namespace
- Oculus Integration package
Configuration:
- Assign `Hand` objects for left and right hands
- Configure separate `AudioSource` components for each hand
- Set velocity thresholds (`minVelocity`, `maxVelocity`)
- Define sound zones using colliders and the `SoundZone` class
- Enable/disable gesture detection and position zones
Use Case: Best for Quest 3 VR experiences requiring advanced hand tracking with multiple interaction modes.
Generates procedural sounds in real-time using mathematical synthesis, creating pure sine wave tones based on hand movement.
Key Features:
- Procedural Synthesis: Generates sounds algorithmically using sine waves
- Frequency Modulation: Base frequency (default 440Hz/A4) scales with hand speed
- Dynamic Frequency Range: Configurable frequency multiplier (default 2x range)
- Envelope Shaping: Sine wave envelope for smooth fade in/out
- Speed-Based Generation: Faster hand movements = higher frequency tones
- Minimal Dependencies: Works with simple Transform references (no VR SDK required)
How It Works:
- Tracks hand position changes over time
- Calculates velocity from position delta
- Generates sine wave tone with frequency proportional to speed
- Applies envelope for natural sound decay
- Plays generated audio clip
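The steps above can be sketched as a short synthesis routine. This is an illustrative Python version; the 44.1 kHz sample rate and 0.3 s tone duration are assumptions, not values taken from the script:

```python
import math

def synthesize_tone(speed, sample_rate=44100, duration=0.3,
                    base_freq=440.0, freq_range=2.0):
    """Generate sine-wave samples whose frequency scales with hand speed,
    shaped by a sine envelope for a smooth fade in and out."""
    # Faster movement raises the frequency, up to base_freq * freq_range.
    freq = base_freq * (1.0 + min(speed, 1.0) * (freq_range - 1.0))
    n = int(sample_rate * duration)
    samples = []
    for i in range(n):
        t = i / sample_rate
        envelope = math.sin(math.pi * i / n)   # rises 0 -> 1 -> falls to 0
        samples.append(envelope * math.sin(2 * math.pi * freq * t))
    return samples
```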
Configuration:
- Assign left/right hand `Transform` components
- Set the base frequency (Hz) - default 440 Hz (A4)
- Configure the frequency range multiplier
- Set the minimum speed threshold to trigger sounds
- Adjust the sound duration and cooldown period
Use Case: Ideal for projects requiring pure synthesized tones without pre-recorded audio clips, or when you want mathematical precision in sound generation.
A lightweight, cross-platform hand tracking solution using Unity's built-in XR system. Works with OpenXR and Oculus Integration.
Key Features:
- Cross-Platform XR: Uses Unity XR (`UnityEngine.XR`) for broad compatibility
- OpenXR Support: Works with the OpenXR standard (supports multiple VR platforms)
- Simple Configuration: Minimal setup required
- Pre-recorded Audio: Plays audio clips from an array
- Pitch Variation: Adjusts pitch based on hand speed (configurable range)
- Selective Tracking: Can enable/disable left or right hand tracking independently
How It Works:
- Uses `InputDevices.GetDeviceAtXRNode()` to get the hand position
- Calculates velocity from position changes
- Selects random audio clip from array
- Modulates pitch based on speed (lerps between min/max pitch)
- Plays sound with cooldown protection
Configuration:
- Assign an `AudioSource` component
- Add audio clips to the `soundClips` array
- Set the minimum speed threshold
- Configure pitch range (default 0.8 to 1.2)
- Enable/disable left/right hand tracking
Dependencies:
- Unity XR package
- OpenXR or Oculus Integration (depending on platform)
Use Case: Best for projects requiring cross-platform VR support or when you want a simple, lightweight solution without Quest 3-specific features.
Comparison Table:
| Feature | Quest3HandSoundController | ProceduralHandSound | SimpleHandSound |
|---|---|---|---|
| Platform | Quest 3 (Oculus SDK) | Any (Transform-based) | Cross-platform XR |
| Sound Source | Audio clips | Procedural synthesis | Audio clips |
| Gesture Support | Yes (framework) | No | No |
| Position Zones | Yes | No | No |
| Spatial Audio | Yes (3D) | Yes (3D) | Yes (3D) |
| Dependencies | Oculus Interaction | None | Unity XR |
| Complexity | High | Medium | Low |
The BrainWaveAnalyzer class provides comprehensive EEG signal analysis:
| Band | Frequency Range | Mental State |
|---|---|---|
| Delta | 0.5 - 4 Hz | Deep sleep |
| Theta | 4 - 8 Hz | Drowsiness, meditation |
| Alpha | 8 - 13 Hz | Relaxed, eyes closed |
| Beta | 13 - 30 Hz | Active thinking, focus |
| Gamma | 30 - 45 Hz | High-level cognitive processing |
- Relaxation Score = Alpha / Theta ratio
- Attention Score = Beta / Theta ratio
- Total Power = Sum of all band powers
- DC Offset Removal - Mean subtraction
- Windowing - Hanning window application (reduces spectral leakage)
- FFT Analysis - Cooley-Tukey radix-2 FFT implementation
- PSD Computation - Power Spectral Density calculation
- Band Power Extraction - Frequency band isolation
- Normalization - Relative power computation
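The pipeline above can be sketched end to end in a few dozen lines. This is an illustrative Python model of the same six steps, including the relaxation (alpha/theta) and attention (beta/theta) ratios; the actual `BrainWaveAnalyzer.cs` implementation may differ in windowing details and edge handling:

```python
import math, cmath

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def fft(x):
    """Recursive Cooley-Tukey radix-2 FFT; len(x) must be a power of 2."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    tw = [cmath.exp(-2j * math.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)] +
            [even[k] - tw[k] for k in range(n // 2)])

def band_powers(samples, fs):
    n = len(samples)
    mean = sum(samples) / n
    # Steps 1-2: DC offset removal, then a Hanning window.
    windowed = [(s - mean) * (0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)))
                for i, s in enumerate(samples)]
    # Steps 3-4: FFT, then power spectral density (positive frequencies).
    psd = [abs(c) ** 2 / n for c in fft(windowed)[: n // 2]]
    # Step 5: sum PSD bins falling inside each frequency band.
    powers = {band: sum(p for k, p in enumerate(psd) if lo <= k * fs / n < hi)
              for band, (lo, hi) in BANDS.items()}
    # Step 6: normalization, plus the relaxation/attention ratios.
    total = sum(powers.values()) or 1.0
    relative = {b: p / total for b, p in powers.items()}
    relaxation = powers["alpha"] / (powers["theta"] or 1.0)
    attention = powers["beta"] / (powers["theta"] or 1.0)
    return relative, relaxation, attention
```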
The system uses accelerometer data from the MindRove device to detect and filter out motion-corrupted EEG samples, ensuring clean brain signal analysis.
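A simple version of this filter, assuming the accelerometer reports in units of g and using a hypothetical deviation tolerance, can be sketched as:

```python
def is_motion_corrupted(accel_xyz, tolerance=0.25):
    """Flag an EEG sample as motion-corrupted when the accelerometer
    magnitude deviates from 1 g (gravity at rest) by more than tolerance."""
    ax, ay, az = accel_xyz
    magnitude = (ax * ax + ay * ay + az * az) ** 0.5
    return abs(magnitude - 1.0) > tolerance

# Pair each (hypothetical) EEG sample with its accelerometer reading and
# keep only the samples recorded while the head was still.
samples = [((0.0, 0.0, 1.02), 12.5), ((0.4, 0.9, 1.3), 40.1)]
clean = [eeg for accel, eeg in samples if not is_motion_corrupted(accel)]
```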
Each musical note triggers the creation of particles in Unity:
- Color Mapping: Notes are mapped to colors based on pitch
  - Low notes (C2-C4): Red to Yellow gradient
  - Mid notes (C4-C5): Yellow to Green gradient
  - High notes (C5-C7): Green to Blue gradient
- Size Mapping: Particle size corresponds to note amplitude/velocity
  - Louder notes = larger particles
  - Formula: `particleSize = baseSize + (velocity / 127) * multiplier`
- Particle Systems: Multiple particle systems handle different note ranges
  - Main particle system for mid-range notes
  - High-note particles (C5 and above)
  - Low-note particles (below C4)
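The two mappings can be sketched directly from the ranges and formula above. The RGB endpoints and note boundaries follow standard MIDI numbering (C2 = 36, C4 = 60, C5 = 72, C7 = 96); the `base_size` and `multiplier` values are placeholders, not the project's tuned settings:

```python
def lerp(a, b, t):
    """Linearly interpolate between two RGB tuples."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def note_color(midi_note):
    """Map MIDI pitch to an RGB gradient: red->yellow (C2-C4),
    yellow->green (C4-C5), green->blue (C5-C7)."""
    red, yellow = (1, 0, 0), (1, 1, 0)
    green, blue = (0, 1, 0), (0, 0, 1)
    if midi_note < 36:                 # below C2: clamp to red
        return (1.0, 0.0, 0.0)
    if midi_note < 60:                 # C2 (36) up to C4 (60)
        return lerp(red, yellow, (midi_note - 36) / 24)
    if midi_note < 72:                 # C4 up to C5 (72)
        return lerp(yellow, green, (midi_note - 60) / 12)
    return lerp(green, blue, min((midi_note - 72) / 24, 1.0))  # C5 to C7 (96)

def particle_size(velocity, base_size=0.1, multiplier=0.5):
    """particleSize = baseSize + (velocity / 127) * multiplier"""
    return base_size + (velocity / 127) * multiplier
```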
Each brain wave type is represented with different colors and interactions:
- Different particle colors for alpha, beta, theta, gamma, delta waves
- Particle system interactions show synchronicity between users
- VFX Graph integration for complex brain state visualization
- Real-time Response: Particles appear instantly as notes are played
- Color Gradients: Smooth color transitions based on musical pitch
- Dynamic Sizing: Particles scale with musical dynamics
- Multiple Systems: Different particle systems for different musical elements (melody, bass, beats, hi-hats)
- Brain Synchronicity Display: Two particle systems interact to show brain synchronization between users
- `/Assets/MidiVisualizer.cs` - Main audio/MIDI visualization
- `/Assets/HipHopVisualizer.cs` - Genre-specific music visualizer
- `/Assets/MidiParser.cs` - MIDI file parser
- `/Assets/MidiAudioPlayer.cs` - Audio playback from MIDI
- `Quest3HandSoundController.cs` - Advanced Quest 3 hand tracking with gestures and position zones (see Hand-Tracking Components for details)
- `ProceduralHandSound.cs` - Procedural tone generation from hand movements using sine wave synthesis (see Hand-Tracking Components for details)
- `SimpleHandSound.cs` - Cross-platform XR hand tracking with Unity XR (see Hand-Tracking Components for details)
- `BrainWaveAnalyzer.cs` - FFT and frequency analysis for brain waves
- `mindroveTest02.cs` - Real-time EEG visualization
- `ParticleSysthemes/VFX_Test01.vfx` - VFX assets
```
Client_Between-US_EEG/Assets/script/
├── multiplayer/
│   ├── NetworkManager.cs    # Client networking
│   ├── UiManager.cs         # Connection UI
│   ├── Player.cs            # Player representation
│   ├── GameLogic.cs         # Game mechanics
│   └── MessageExtension.cs  # Message utilities
└── mindRover/
    ├── NewBehaviour.cs      # EEG data collector
    └── mindroveTest02.cs    # EEG streaming test
```
```
Server_Between-US_EEG/Assets/script/
└── multiplayer/
    ├── NetworkManager.cs    # Server networking
    ├── Player.cs            # Server-side player
    ├── GameLogic.cs         # Server game logic
    └── MessageExtension.cs  # Message utilities
```
- `sonicpi_osc_music.rb` - Complete music example with OSC
- `sonicpi_osc_simple.rb` - Simple OSC test
- `sonicpi_osc_example.rb` - Basic OSC example
- `Assets/OSCReceiver.cs` - OSC message receiver (historical)
- Genre-specific Sonic Pi scripts in the root directory
- Model: MindRove WiFi Board
- SDK Version: 5.2.4
- Features:
- Multiple EEG channels
- Built-in accelerometer (motion detection)
- WiFi connectivity
- High sampling rate support
- Meta Quest 3 or compatible VR device for hand tracking
- OpenXR compatible headset for cross-platform support
| Component | Technology | Purpose |
|---|---|---|
| Game Engine | Unity 2022+ | Core platform |
| Networking | RiptideNetworking v5.x | Client-server communication |
| EEG SDK | MindRove 5.2.4 | Brain-computer interface |
| Signal Processing | Custom FFT + Accord.Math | Frequency analysis |
| UI | TextMesh Pro + UGUI | User interface |
| Graphics | Visual Effect Graph | VFX feedback |
| Rendering | DirectX 12 | Windows rendering |
| VR/AR | Meta Quest 3, OpenXR | Hand tracking |
| Audio | Unity AudioSource | WAV file playback and analysis |
Note: All network project components have been set aside to prioritize EEG data collection
Uses enum-based message IDs for type-safe communication:
```csharp
public enum ClientToServerId : ushort
{
    name = 1,
    // Additional message types...
}
```

Connection flow:
- Client initiates connection to the server (IP:Port)
- On successful connection, client sends username
- Server creates player instance and adds to player list
- Game state synchronization begins
The system generates several log files:
| File | Content |
|---|---|
| `mindrove_log.txt` | SDK library logs |
| `mindrove_data_log.txt` | Real-time EEG data stream |
| `mindrove_data.csv` | Raw EEG samples (CSV format) |
- Unity - Unity 2022 or later
- Audio Files - WAV files for visualization (optional)
- VR Headset (Optional) - Meta Quest 3 or compatible VR device for hand tracking
- MindRove EEG headset with SDK 5.2.4 (for brain-computer interface)
- Windows 10/11 (DirectX 12 support)
1. Open Unity Project
   - Open the `myproj6` Unity project
   - Ensure visualization scripts are in the Assets folder
2. Add Audio File
   - Place your WAV file in the `Assets/StreamingAssets/` folder
   - Create the folder if it doesn't exist
3. Configure Audio Source
   - Create an empty GameObject in your Unity scene
   - Add an `AudioSource` component
   - Assign your WAV file to the Audio Clip field
   - Enable "Play On Awake" if desired
4. Add Visualizer
   - Add a `MidiVisualizer` or `HipHopVisualizer` component to the GameObject
   - Configure particle settings as desired
   - Link the AudioSource to the visualizer
5. Run the Scene
   - Press Play in Unity
   - Watch the particles respond to the audio!
6. Configure Hand Tracking
   - Add a `Quest3HandSoundController` or `ProceduralHandSound` component
   - Assign hand-tracking references (left/right hand transforms)
   - Configure audio sources for hand-generated sounds
7. Set Up VR (if using Quest 3)
   - Import the Oculus Integration package
   - Configure OVRHand or the Oculus Interaction SDK
   - Assign hand references to the controller script
8. Combine with WAV Visualization
   - Both WAV file audio and hand-generated sounds will create particles
   - Particles are color-coded and sized based on audio characteristics
- Open the `Server_Between-US_EEG` project in Unity
- Configure the port in NetworkManager
- Build, or run in the editor
- Open the `Client_Between-US_EEG` project in Unity
- Set the server IP address in NetworkManager
- Connect MindRove device
- Build or run in editor
- Enter username and connect
- Open the `concentration` project in Unity
- Connect the MindRove device
- Run scene to see real-time brainwave visualization
- Check Unity Console for audio analysis data
- Verify particles are being created in the Unity scene
- Move your hands (if using hand tracking) to generate sounds and particles
- Adjust particle settings in the Inspector if needed
- Verify EEG data is being received (check console logs)
The system analyzes audio in real-time to extract:
- Frequency Data: Pitch information for color mapping
- Amplitude Data: Volume information for particle size
- Spectral Analysis: Frequency bands for different particle systems
Unity's audio system provides:
- Real-time spectrum analysis via `AudioSource.GetSpectrumData()`
- Amplitude data from audio samples
- Frequency domain analysis for pitch detection
Hand movements generate procedural sounds:
- Velocity-Based: Faster movements = higher pitch/volume
- Position-Based: Different hand positions trigger different sounds
- Gesture-Based: Specific hand shapes produce unique sounds
- 3D Spatial Audio: Sounds are positioned in 3D space
The visualization uses Unity's built-in Particle System with:
- Simulation Space: World space
- Max Particles: 1000 (configurable)
- Lifetime: 2 seconds (configurable)
- Velocity: Upward movement (Y = 3 units/second)
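As a sketch, the configuration above corresponds to a particle update loop like this (illustrative Python, not Unity's actual simulation; the spawn position is arbitrary):

```python
from dataclasses import dataclass

LIFETIME = 2.0       # seconds, matching the configured particle lifetime
VELOCITY_Y = 3.0     # world-space units/second, upward
MAX_PARTICLES = 1000

@dataclass
class Particle:
    pos: list         # world-space position [x, y, z]
    age: float = 0.0

def spawn(particles, pos):
    """Add a particle at pos, respecting the max particle count."""
    if len(particles) < MAX_PARTICLES:
        particles.append(Particle(pos=list(pos)))
    return particles

def update(particles, dt):
    """Advance particles upward and expire them after LIFETIME seconds."""
    for p in particles:
        p.age += dt
        p.pos[1] += VELOCITY_Y * dt
    return [p for p in particles if p.age < LIFETIME]
```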
- Basic client-server networking framework
- EEG signal processing pipeline
- Real-time brainwave analysis (all 5 bands)
- Motion artifact detection
- VFX integration
- Data logging system
- WAV file visualization
- Hand-tracking instrumentation
- Complete game mechanics
- Player synchronization
- Multiple game modes
- Check that audio is playing (verify AudioSource is active)
- Ensure the visualizer component is attached to a GameObject
- Check that particle systems are enabled and playing
- Verify audio file is loaded correctly (check AudioSource clip assignment)
- Enable debug logging to see audio analysis data
- Verify the WAV file is in the `StreamingAssets` folder
- Check AudioSource component settings
- Ensure audio file format is supported (WAV, MP3, OGG)
- Check Unity Console for audio loading errors
- Verify VR headset is connected and tracking
- Check hand tracking references are assigned in Inspector
- Ensure Oculus Integration or OpenXR is properly configured
- Check that hand tracking is enabled in VR settings
- Verify hand transforms are being updated (check Inspector during play)
- Check audio analysis is running (enable debug logging)
- Verify frequency/amplitude values are in expected ranges
- Ensure visualizer is receiving audio data
- Check particle system settings (emission rate, size, color)
- Verify MindRove device is connected via WiFi
- Check SDK version (5.2.4 required)
- Ensure correct IP/port configuration
- Check `mindrove_log.txt` for SDK errors
- Verify accelerometer motion filtering is not blocking all data
Potential improvements for the system:
- Advanced FFT analysis for more accurate frequency detection
- Machine learning-based gesture recognition
- Multi-user hand-tracking collaboration
- Advanced particle effects (trails, meshes, custom shapes)
- Real-time audio effects processing
- Network synchronization for multi-user experiences
- Complete game mechanics integration
- Multiple game modes
- Enhanced brain synchronicity visualization
- MindRove: https://mindrove.com/downloads/
This project is licensed under the MIT License.
Copyright (c) 2024
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Project Lead
- Simeone Scaramozzino
- Concept, system architecture, and integration
Team
- Barthelemy Martel
- Isabel Fowlkes
- Nicholas Fowlkes
- Sounak Ghosh
Quest Team
- Julian Camilo Mora Valbuena
- Juan Esteban Rodriguez Ospino
- Sergio Andres Arias Rodriguez
- Daniel Alejandro Ocampo Lozano
Data / Music
- Ria Baldevia
- Sonic Pi - Created by Sam Aaron for live coding music (used in early development)
- Unity Technologies - Game engine, particle system, and audio analysis
- Meta/Oculus - VR hand tracking technology
- OSC Protocol - Open Sound Control specification (used in early development)
- MindRove - EEG hardware and SDK
For detailed technical setup instructions, see myproj6/README.md