
Virtual Immersive Behavioral Sciences (VIBES) Lab



📖 Overview

The Virtual Immersive Behavioral Sciences (VIBES) Lab is a cutting-edge collaborative research initiative at Baldwin Wallace University, bringing together students from:

  • 🧠 Psychology Department
  • 🔬 Neuroscience Department
  • 💻 Computer Science Department

Mission

We develop high-fidelity VR experimental environments that enable researchers to investigate psychological and neuroscientific phenomena in controlled, immersive settings with precise measurement and analysis of behavioral and visual responses.


✨ Key Features

🏞️ Realistic 3D Environments

Immersive virtual worlds built in Unity using professional asset packs for maximum ecological validity.


👁️ Advanced Eye & Camera Tracking

  • HTC VIVE Pro Eye with Tobii integration
  • High-resolution gaze and head tracking via custom scripts, SRanipal, and SimpleOmnia
  • Real-time data collection and synchronization

⚙️ SimpleOmnia Integration

A powerful suite by Justin Kasowski that streamlines:

  • Data collection workflows
  • Event timing precision
  • VR interaction logging

🧠 Project 1 Behavioral Paradigm

For Project 1, participants engage in a single integrated experimental environment that allows researchers to examine:

  • Development of emotional responses to aversive stimuli
  • Reduction or modulation of emotional responses over time
  • Patterns of visual attention and behavioral reactions during these experiences
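
The habituation analysis described above can be summarized offline as a mean response per trial, for example with pandas. This is only an illustrative sketch: the column names `trial` and `response` are hypothetical placeholders, not the project's actual data schema.

```python
import pandas as pd

def habituation_curve(df, trial_col="trial", response_col="response"):
    """Mean response per trial; a downward trend suggests habituation."""
    return df.groupby(trial_col)[response_col].mean()

# Synthetic example: responses to the aversive stimulus shrink across trials.
df = pd.DataFrame({
    "trial":    [1, 1, 2, 2, 3, 3],
    "response": [0.9, 0.8, 0.6, 0.5, 0.3, 0.2],
})
curve = habituation_curve(df)
```

Plotting `curve` with Matplotlib (already part of the analysis stack) gives the per-trial habituation trend.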

📊 Comprehensive Data Analysis

Python-based toolkit for:

  • Gaze data mapping onto virtual environments
  • Fixation and saccade pattern extraction
  • Behavioral response visualization across trials
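
As a sketch of the fixation-extraction step, here is a minimal dispersion-based (I-DT) detector. The thresholds and the plain-list input format are illustrative assumptions, not the toolkit's actual interface.

```python
def detect_fixations(x, y, t, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-based (I-DT) fixation detection.

    x, y : gaze coordinates (e.g. degrees of visual angle)
    t    : timestamps in seconds
    Returns a list of (start_time, end_time) fixation windows.
    """
    fixations = []
    start = 0
    n = len(t)
    while start < n:
        end = start
        # Grow the window while its dispersion stays under the threshold.
        while end + 1 < n:
            wx = x[start:end + 2]
            wy = y[start:end + 2]
            dispersion = (max(wx) - min(wx)) + (max(wy) - min(wy))
            if dispersion > max_dispersion:
                break
            end += 1
        if t[end] - t[start] >= min_duration:
            fixations.append((t[start], t[end]))
            start = end + 1
        else:
            start += 1
    return fixations
```

Samples that stay within `max_dispersion` for at least `min_duration` seconds are grouped into one fixation; the gaps between fixations can then be treated as saccade candidates.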

🌐 Web-Based Data Processing

Access our online tool: VIBES Lab CSV Formatter


👁️ Eye-Tracking Systems

The VIBES Lab supports two major eye-tracking configurations:

🔷 HTC VIVE Pro Eye (Primary System)

Tobii-powered eye tracking integrated into the HMD, accessed through:

  • SRanipal SDK for real-time gaze data
  • SimpleOmnia for synchronized event tracking

Captured Data:

  • Gaze origin & direction vectors
  • Combined & per-eye tracking
  • Eye openness metrics
  • Pupil diameter
  • Blink detection
  • Game Object being looked at
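
Offline, logged gaze origin and direction vectors like these can be mapped back into the scene by ray casting. Below is a minimal sketch that intersects the gaze ray with an axis-aligned plane; the coordinate convention and plane placement are assumptions for illustration only.

```python
def gaze_point_on_plane(origin, direction, plane_z):
    """Intersect a gaze ray with the plane z = plane_z (e.g. a virtual wall).

    origin, direction : (x, y, z) tuples; direction need not be normalized.
    Returns the (x, y) hit point, or None if the ray is parallel to the
    plane or points away from it.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None
    s = (plane_z - oz) / dz
    if s < 0:  # plane is behind the viewer
        return None
    return (ox + s * dx, oy + s * dy)
```

Inside Unity itself, `Physics.Raycast` performs this lookup against arbitrary colliders, which is typically how a "Game Object being looked at" field is resolved at runtime.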

🔶 Tobii External Eye Trackers (Latest Stable Release)

Support for standalone Tobii devices:

  • All Tobii Eye Trackers
  • Verified on Tobii Nano

Captured Data:

  • Gaze origin & direction vectors
  • Combined & per-eye tracking
  • Pupil diameter
  • Game Object being looked at

Note: Support for Tobii external eye trackers is available only in the latest stable release. Please contact the team for access to the code.



🛠️ Technology Stack

Component        Technology
---------------  -------------------------------
Game Engine      Unity 2023.1.5f1
VR Hardware      HTC VIVE Pro Eye
Flat Panel       Tobii Nano
Data Collection  SimpleOmnia
SDKs             SteamVR, SRanipal, Tobii
Programming      C#, Python
Analysis         Python (Pandas, Matplotlib)

📥 Installation & Setup

Prerequisites

  • Unity 2023.1.5f1 or newer
  • HTC VIVE Pro Eye or compatible Tobii device
  • SteamVR installed and configured
  • Git installed on your system

Step 1: Clone the Repository

git clone https://github.com/JohnBacho/VIBES-LAB-Project1.git
cd VIBES-LAB-Project1

Step 2: Unity Configuration

  1. Open Unity Hub and add the project
  2. Ensure Unity 2023.1.5f1 is installed
  3. Open the project and wait for initial import
  4. Download and install SteamVR
  5. Follow SimpleOmnia installation instructions (included in project)

Step 3: Eye-Tracking Setup

For HTC VIVE Pro Eye:

  1. Install SRanipal SDK (included in project)
  2. Install SRanipal runtime to calibrate VR eye tracking

For Tobii Devices:

  1. Download the latest stable release from the Releases page
  2. Ensure the Tobii device is connected and recognized by the system

Step 4: Verify Installation

  1. Run the demo scene in Unity
  2. Check console for successful SDK initialization
  3. Verify eye-tracking data is being recorded

📚 Documentation

For detailed documentation, please visit our Wiki.


🤝 Contributing

We welcome contributions from the research community! Please see our Contributing Guidelines for more information.


👥 Core Team

Name              Role
----------------  -------------------------------
Dr. Brian Thomas  Professor of Psychology
John Bacho        Computer Science
Lauren Dunlap     Computer Science
Albert Selby      Computer Science / Data Science
Marissa Brigger   Neuroscience
Alexa Gossett     Neuroscience / Psychology
Jace Lander       Software Engineer

🙏 Acknowledgments

  • Justin Kasowski – Creator of SimpleOmnia
  • Unity Asset Store – For high-quality 3D environmental assets
  • Baldwin Wallace University – For institutional support and resources
  • HTC Vive & Tobii – For technical documentation and SDK support

📄 License

This project is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

Includes SimpleOmnia by Justin Kasowski, licensed under CC BY-NC 4.0.

For more details, see: https://creativecommons.org/licenses/by-nc/4.0/


📧 Contact

For questions, collaborations, or support:


🔗 Related Projects


Made with ❤️ by the VIBES Lab Team

About

A Unity VR project made for academia, developed for neuroscience and psychology research, focusing on fear conditioning in VR.
