
🧠 StreamSense

Multi-Device Physiological Recording Platform

Synchronized recording and analysis of physiological signals from multiple devices

Features • Screenshots • Quick Start • Documentation • Architecture


📖 Overview

StreamSense is a professional multi-device physiological recording platform designed for social neuroscience research. It enables synchronized data acquisition from multiple participants and devices simultaneously, with microsecond-precision timestamps for accurate cross-device correlation analysis.

Key Capabilities

  • 🔗 Multi-Device Synchronization - Record from multiple people simultaneously with LSL timestamps
  • 🎯 Multi-Vendor Support - Muse (EEG), Empatica E4 (wrist), BITalino (multi-sensor)
  • 💻 Professional UI - Beautiful PyQt5 dashboard for easy device management
  • 📊 Real-Time Streaming - Live LSL stream monitoring and visualization
  • 🔄 Robust Architecture - Process-based isolation, automatic reconnection, crash recovery
  • 📈 Analysis-Ready Output - MNE format for EEG, pandas DataFrames for other signals

Use Cases

  • 👥 Social Neuroscience - Measure physiological synchrony between people (couples, teams, groups)
  • 🧘 Meditation Research - Multi-person brain synchronization during meditation
  • 🎵 Music Studies - Physiological coordination in musical ensembles
  • 💡 Relationship Research - Heart rate and brain wave synchronization in couples
  • 🏥 Clinical Applications - Multi-modal physiological monitoring

✨ Features

Device Support

Device         Sensors                      Sampling Rates           Connection
Muse 2/S       EEG (4ch), PPG, ACC, GYRO    256 Hz EEG, 64 Hz PPG    Bluetooth LE
Empatica E4    BVP, GSR, TEMP, ACC          64 Hz BVP, 4 Hz GSR      WiFi/BLE Server
BITalino       ECG, EDA, EMG, EEG, ACC      Up to 1000 Hz            Bluetooth/Serial

Core Features

✅ Professional UI Dashboard

  • Device discovery and management
  • One-click connect/disconnect
  • Real-time status monitoring
  • Signal quality indicators
  • Recording controls with live timer

✅ Multi-Device Recording

  • Synchronized timestamps across all devices (LSL)
  • Individual device processes for crash isolation
  • Automatic reconnection with exponential backoff (see the sketch after this list)
  • Intelligent data interpolation for brief disconnections
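
A minimal sketch of the reconnection idea, assuming a hypothetical connect() callable that raises ConnectionError on failure; the actual retry logic lives inside each streamer process:

import time

def connect_with_backoff(connect, max_retries=5, base_delay=1.0):
    # Retry a failing connect() with exponentially growing delays
    for attempt in range(max_retries):
        try:
            return connect()
        except ConnectionError:
            time.sleep(base_delay * (2 ** attempt))   # 1s, 2s, 4s, 8s, ...
    raise ConnectionError("Device unreachable after retries")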

✅ Lab Streaming Layer (LSL) Integration

  • Industry-standard protocol for physiological data (example below)
  • Network time protocol synchronization (microsecond precision)
  • Compatible with all major analysis tools (MNE, EEGLAB, etc.)
  • XDF format for multi-stream recordings
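
For orientation, this is roughly how a streamer publishes timestamped samples over LSL with pylsl; the stream name, channel count, and rate below are illustrative, not StreamSense's actual metadata:

from pylsl import StreamInfo, StreamOutlet, local_clock

# Illustrative stream metadata (real streamers advertise their own names/rates)
info = StreamInfo(name='Muse-A_EEG', type='EEG', channel_count=4,
                  nominal_srate=256, channel_format='float32',
                  source_id='muse-a')
outlet = StreamOutlet(info)

sample = [0.0, 0.0, 0.0, 0.0]               # one reading from the device
outlet.push_sample(sample, local_clock())   # LSL timestamp enables cross-device alignment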

✅ Extensible Architecture

  • BaseStreamer abstract class for easy device addition
  • Process-based isolation for reliability
  • Clean MVC pattern (UI ↔ Controller ↔ Core)
  • Comprehensive documentation and guides

📸 Screenshots

Professional UI Dashboard

Beautiful dark-themed interface for device management and recording

Initial State: Clean initial state ready for device discovery

Devices Discovered: Multiple devices discovered and ready to connect

Device Connected: Muse headband connected with signal quality indicator

Multiple Devices: Multiple devices streaming simultaneously

LSL Streams: Live LSL streams from all connected devices

Recording Active: Recording session in progress with live duration timer

Full Overview: Complete UI showing all features in action


🚀 Quick Start

Installation

  1. Clone the repository

    git clone https://github.com/siddhant61/StreamSense.git
    cd StreamSense
  2. Create Python environment (Python 3.8+ required)

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install dependencies

    pip install -r requirements.txt

Launch the UI

python ui/streamsense_ui.py

Basic Workflow

  1. Discover Devices - Click "🔍 Discover Devices"
  2. Connect - Click "Connect" on any device card
  3. Monitor Streams - Watch live streams appear in the right panel
  4. Record - Click "● Start Recording" to begin
  5. Stop - Click "■ Stop Recording" when finished

Output Location: Documents/StreamSense/[timestamp]/

  • RawData/ - HDF5 files with raw sensor data
  • Dataset/ - Processed data (MNE format for EEG, pandas for others); see the loading sketch below
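
A rough sketch for inspecting the recorded output with standard tooling; the exact file names and dataset keys depend on the devices recorded, so the paths below are placeholders:

import h5py
import pandas as pd

# List every group/dataset inside one raw HDF5 file (file name is a placeholder)
with h5py.File('RawData/Muse-A_EEG.h5', 'r') as f:
    f.visit(print)

# Non-EEG datasets are pandas objects; if they are stored as pickled DataFrames
# (as the "Pickle Datasets" box in the architecture diagram below suggests),
# read_pickle recovers them:
# df = pd.read_pickle('Dataset/E4-1234_GSR.pkl')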

📚 Documentation

Comprehensive guides are available in the docs/ directory:

  • UI_QUICK_START.md - UI usage guide
  • MULTI_DEVICE_SYNCHRONIZATION_GUIDE.md - Multi-device synchronization guide
  • DEVICE_SUPPORT_ROADMAP.md - Device support roadmap

πŸ—οΈ Architecture

System Overview

┌──────────────────────────────────────────────────────────────┐
│                    StreamSense UI (PyQt5)                     │
│  ┌──────────────┐  ┌──────────────┐  ┌──────────────┐        │
│  │   Device     │  │   Recording  │  │   Stream     │        │
│  │   Controls   │  │   Controls   │  │   Monitor    │        │
│  └──────────────┘  └──────────────┘  └──────────────┘        │
└──────────────────────────────┬───────────────────────────────┘
                               │ Qt Signals/Slots
┌──────────────────────────────▼───────────────────────────────┐
│            StreamSenseController (Business Logic)             │
│  • Device Discovery  • Connection Management  • Recording     │
└──────────────────────────────┬───────────────────────────────┘
                               │ Direct API Calls
┌──────────────────────────────▼───────────────────────────────┐
│                        Core Components                        │
│  ┌──────────────┐  ┌──────────────┐  ┌──────────────┐        │
│  │  FindDevices │  │ BaseStreamer │  │StreamRecorder│        │
│  │   (Scanner)  │  │  (Abstract)  │  │  (LSL→HDF5)  │        │
│  └──────────────┘  └──────────────┘  └──────────────┘        │
│         │                 │                 │                │
│         │       ┌─────────▼─────────┐       │                │
│         │       │     Streamers     │       │                │
│         │       ├───────────────────┤       │                │
│         │       │  StreamMuse       │       │                │
│         │       │  StreamE4         │       │                │
│         │       │  StreamBioTalino  │       │                │
│         │       └─────────┬─────────┘       │                │
└─────────┼─────────────────┼─────────────────┼────────────────┘
          │                 │                 │
          │                 ▼                 │
          │     Lab Streaming Layer (LSL)     │
          │    Network Time Synchronization   │
          │                 │                 │
          │                 ▼                 ▼
          ▼         ┌────────────────┐  ┌────────────────┐
  ┌──────────────┐  │  Muse Devices  │  │ XDF Recordings │
  │   Empatica   │  │  E4 Devices    │  │ HDF5 Raw Data  │
  │  BLE Server  │  │  BITalino      │  │ Pickle Datasets│
  └──────────────┘  └────────────────┘  └────────────────┘

Key Design Patterns

1. BaseStreamer Architecture

  • Abstract base class for all device streamers
  • Process-based isolation (multiprocessing.Process)
  • Standardized lifecycle: start → stream → stop
  • Event-based synchronization (multiprocessing.Event), as sketched below
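
Stripped to its essentials, the lifecycle looks roughly like this; an illustrative skeleton of the pattern, not the actual BaseStreamer code:

import multiprocessing as mp
import time

def _stream_loop(device_name, stop_event):
    # Runs in its own process: read samples and push them to LSL until stopped
    while not stop_event.is_set():
        time.sleep(0.01)            # placeholder for "read one sample, push to outlet"

class StreamerSkeleton:
    def __init__(self, device_name):
        self.device_name = device_name
        self.stop_event = mp.Event()                    # shared shutdown flag
        self.process = mp.Process(target=_stream_loop,
                                  args=(device_name, self.stop_event))

    def start(self):
        self.process.start()        # a crash here cannot take down other devices

    def stop(self):
        self.stop_event.set()       # ask the loop to exit
        self.process.join(timeout=5)

if __name__ == "__main__":
    streamer = StreamerSkeleton("Muse-A")
    streamer.start()
    time.sleep(1)
    streamer.stop()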

2. MVC Pattern

  • View: PyQt5 UI (streamsense_ui.py)
  • Controller: Business logic (streamsense_controller.py)
  • Model: Core components (streamers, recorder, finder)

3. Thread Safety

  • Qt signals for cross-thread communication (see the sketch below)
  • Background threads for blocking operations (discovery, connection)
  • Main thread reserved for UI updates
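
A minimal, self-contained illustration of that pattern: a worker object on a background QThread emits a signal, and a receiver living in the main thread gets the result through a queued connection. The worker and the blocking scan are placeholders, not StreamSense's actual classes:

import time
from PyQt5.QtCore import QCoreApplication, QObject, QThread, pyqtSignal

class DiscoveryWorker(QObject):
    devices_found = pyqtSignal(list)              # emitted from the worker thread

    def run(self):
        time.sleep(1)                             # stand-in for a blocking scan
        self.devices_found.emit(["Muse-A", "E4-1234"])

class Receiver(QObject):
    def on_devices(self, devices):                # executes in the main (UI) thread
        print("Discovered:", devices)
        QCoreApplication.quit()

app = QCoreApplication([])
thread = QThread()
worker = DiscoveryWorker()
worker.moveToThread(thread)                       # blocking work leaves the UI thread
receiver = Receiver()                             # lives in the main thread
thread.started.connect(worker.run)
worker.devices_found.connect(receiver.on_devices) # queued, thread-safe delivery
thread.start()
app.exec_()
thread.quit()
thread.wait()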

4. Process Isolation

  • Each device runs in separate Process
  • Crash in one device doesn't affect others
  • True parallelism for CPU-intensive operations

💻 Command-Line Interface

For advanced users, StreamSense also provides a powerful CLI:

python main.py

CLI Commands

> menu          # Interactive menu mode
> stream --dev muse   # Stream from Muse devices
> stream --dev e4     # Stream from E4 devices
> view --data eeg     # View EEG streams
> record              # Start recording
> stop                # Stop all streams

Menu Options:

  1. Connect and stream Muse devices
  2. View all active LSL streams
  3. Connect and stream E4 devices
  4. Start recording all streams
  5. Run visual oddball paradigm
  6. Start event logger console
  7. Stop all active LSL streams

🔬 Research Applications

Example: Measuring a Couple's Heart Synchrony

import pyxdf
import numpy as np
from scipy import signal

# Load synchronized recording
streams, header = pyxdf.load_xdf('recording.xdf')

# Extract the PPG stream for each participant
stream_a = [s for s in streams if 'Muse-A_PPG' in s['info']['name'][0]][0]
stream_b = [s for s in streams if 'Muse-B_PPG' in s['info']['name'][0]][0]

# Use the first PPG channel from each stream
ppg_a = stream_a['time_series'][:, 0]
ppg_b = stream_b['time_series'][:, 0]

# Compute cross-correlation
correlation = signal.correlate(ppg_a, ppg_b, mode='full')
lags = signal.correlation_lags(len(ppg_a), len(ppg_b), mode='full')

# Find peak synchrony (PPG is sampled at 64 Hz)
peak_lag = lags[np.argmax(correlation)]
print(f"Peak synchrony at lag: {peak_lag} samples ({peak_lag/64:.2f} seconds)")

See docs/MULTI_DEVICE_SYNCHRONIZATION_GUIDE.md for complete examples.


🛠️ Platform Support

Platform        Status            Notes
Windows 10/11   ✅ Full Support   All features available
macOS           ⚠️ Partial        UI works, some device drivers limited
Linux           ⚠️ Partial        UI works, E4 server not available

Platform-Specific Requirements

Windows:

  • Empatica BLE Server (for E4 devices)
  • BLED112 dongle drivers (for Muse devices)

macOS/Linux:

  • Core UI and recording features work
  • Muse support via native Bluetooth LE
  • E4 requires Windows or virtual machine

📦 Repository Structure

StreamSense/
├── ui/                          # Professional UI dashboard
│   ├── streamsense_ui.py        # PyQt5 interface
│   └── streamsense_controller.py  # Backend controller
├── streamer/                    # Device streamers
│   ├── base_streamer.py         # Abstract base class
│   ├── stream_muse.py           # Muse headband streamer
│   ├── stream_e4.py             # Empatica E4 streamer
│   └── stream_bitalino.py       # BITalino streamer
├── recorder/                    # Recording logic
│   └── stream_recorder.py       # LSL → HDF5 recorder
├── helper/                      # Utilities
│   ├── find_devices.py          # Device discovery
│   └── e4_helper.py             # E4-specific helpers
├── viewer/                      # Stream visualization
│   └── view_streams.py          # Vispy-based viewer
├── experiments/                 # Experimental protocols
│   └── visual_oddball.py        # Visual oddball paradigm
├── docs/                        # Documentation
│   ├── UI_QUICK_START.md        # UI usage guide
│   ├── MULTI_DEVICE_SYNCHRONIZATION_GUIDE.md
│   ├── DEVICE_SUPPORT_ROADMAP.md
│   └── screenshots/             # UI screenshots
├── tests/                       # Test suite
├── audit/                       # Architecture analysis
├── main.py                      # CLI entry point
└── requirements.txt             # Dependencies

🧪 Development

Running Tests

pytest

Current test coverage:

  • ✅ BaseStreamer lifecycle (21 tests)
  • ✅ Muse streaming (8 tests)
  • ✅ E4 streaming (19 tests)
  • ✅ Data processing utilities

Adding New Devices

StreamSense makes it easy to add new devices:

  1. Inherit from BaseStreamer

    from streamer.base_streamer import BaseStreamer
    
    class StreamMyDevice(BaseStreamer):
        def __init__(self, device_id, synchronized_start_time, root_output_folder):
            super().__init__(
                device_name=f"MyDevice_{device_id}",
                synchronized_start_time=synchronized_start_time,
                root_output_folder=root_output_folder
            )
    
        def _stream_wrapper(self):
            # Main streaming logic
            pass
    
        def _setup_lsl_outlets(self):
            # Create LSL outlets
            pass
  2. Implement streaming logic - Connect to device, read samples, push to LSL

  3. Add to controller - Update streamsense_controller.py discovery and connection

  4. Write tests - Ensure reliability

See streamer/README.md for a detailed guide.


🤝 Contributing

Contributions are welcome! Areas of interest:

  • 🔌 New device support (Polar H10, Emotiv, OpenBCI, etc.)
  • 📊 Real-time visualization (signal plots, synchrony graphs)
  • 🧪 Additional tests (integration tests, UI tests)
  • 📖 Documentation (tutorials, examples, translations)
  • 🐛 Bug fixes (especially cross-platform issues)

How to contribute:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📝 Citation

If you use StreamSense in your research, please cite:

@software{streamsense2025,
  title = {StreamSense: Multi-Device Physiological Recording Platform},
  author = {StreamSense Team},
  year = {2025},
  url = {https://github.com/siddhant61/StreamSense}
}

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


🙏 Acknowledgments

  • Lab Streaming Layer (LSL) - Foundation for synchronized streaming
  • MNE-Python - EEG analysis tools
  • PyQt5 - Professional UI framework
  • Muse LSL - Muse device integration
  • Empatica - E4 device support
  • BITalino - Open-source biosignal platform

📬 Contact & Support


Built with ❤️ for social neuroscience research

⭐ Star this repository if you find it useful! ⭐
