Thanks for checking out our software! OWLET is designed to process infant gaze and looking behavior using webcam videos recorded on laptops or smartphones. Instructions for downloading and running the OWLET source code are below. In addition, a user guide describing options for processing gaze data with OWLET in more detail can be found at: https://denisewerchan.com/owlet. If you use this software in your research, please cite:
Werchan, D. M., Thomason, M. E., & Brito, N. H. (2022). OWLET: An Automated, Open-Source Method for Infant Gaze Tracking using Smartphone and Webcam Recordings. *Behavior Research Methods*.

OWLET analyzes pre-recorded webcam or smartphone videos to estimate where an infant was looking during a task. Here's what it does:
- Calibrates gaze
  - Uses default settings based on prior data (Werchan et al., 2023)
  - Or uses a custom calibration video of the infant looking left, right, up, and down (if provided)
- Estimates gaze for each frame
  - Determines where the infant was looking (x/y coordinates on the screen)
- Generates output
  - Saves a CSV file with gaze data for every frame
  - Includes which part of the screen the infant looked at: `left`, `right`, or `away`
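Once OWLET has written the per-frame CSV, summary measures like the proportion of looking time to each screen region can be computed in a few lines of Python. This is a hedged sketch, not part of OWLET: the name of the label column (`Label` below) is an assumption, so check the header of the CSV your OWLET version actually writes.

```python
import csv
from collections import Counter

def summarize_gaze(csv_path, label_column="Label"):
    """Return the proportion of frames spent in each screen region.

    `label_column` is an assumption -- inspect the header of the CSV
    that OWLET produces and adjust the name if needed.
    """
    with open(csv_path, newline="") as f:
        counts = Counter(row[label_column] for row in csv.DictReader(f))
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}
```

For example, a recording with four frames labeled `left`, `left`, `right`, `away` would yield proportions of 0.5, 0.25, and 0.25.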
OWLET also includes optional features that allow you to:
- Auto-detect task start time
  - Matches the audio in the infant’s video with the task video to find where the task begins
- Integrate trial info
  - Links frame-by-frame gaze with trial start times (if a `trials.csv` file is provided)
- Use custom AOIs
  - Tags gaze data using custom areas of interest (if an `AOIs.csv` file is provided)
- Create an overlaid video to visualize gaze patterns
  - Combines the infant’s video with the task video and overlays gaze points on it
- Open Terminal (Press ⌘ + Space, type “Terminal”, hit Enter).
- Copy and paste this line and press Enter:
```
git clone https://github.com/denisemw/OWLET.git
```
Homebrew is a package manager for macOS. Install it by running:

```
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
```

When it's done, run:

```
brew doctor
```

Make sure it says "Your system is ready to brew."
These are tools that the Python packages need under the hood:

```
brew install cmake ffmpeg pkg-config
brew install libomp  # needed for numba and scikit-learn on some Macs
```

We recommend installing Python via Homebrew to avoid interfering with the macOS system Python:

```
brew install python
```

Create and activate a virtual environment to keep dependencies isolated:

```
python -m venv owlet_env
source owlet_env/bin/activate
```

We recommend using the `requirements.txt` file to install dependencies:
```
pip install --upgrade pip setuptools wheel
pip install -r requirements.txt
```

If you run into errors with `dlib`, run:

```
brew install boost boost-python3
```

and then retry the install with:

```
pip install dlib
```

You can use OWLET in two ways: with a simple graphical interface (GUI) or through the command line.
Just run this command:

```
python OWLET.py
```

A window will pop up where you can select the folder with your subject videos. You can also optionally add task info like:
- The task video that was shown to the subject
- A file with trial start times
- A file with Areas of Interest (AOIs)
Basic usage:

```
python OWLET.py --subject_video /path/to/subject/videos
```

If you also want to include task files (like a task video or AOIs), use:

```
python OWLET.py --subject_video /path/to/subject/videos --experiment_info /path/to/experiment/folder
```

Make sure the experiment folder contains your task video and/or CSV files (`trials.csv`, `AOIs.csv`).
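Before a long batch run, it can be handy to confirm that the experiment folder actually contains the files described below. This is a hypothetical pre-flight helper, not part of OWLET — OWLET does its own file discovery; the sketch just mirrors the file names the documentation describes.

```python
from pathlib import Path

def check_experiment_folder(folder):
    """Report which optional task files are present in the folder.

    Hypothetical helper: checks for the task video and CSV file
    names that the OWLET documentation describes.
    """
    folder = Path(folder)
    return {
        "task video": any(folder.glob("*.mov")) or any(folder.glob("*.mp4")),
        "trials.csv": (folder / "trials.csv").is_file(),
        "AOIs.csv": (folder / "AOIs.csv").is_file(),
    }
```

Running it on your experiment folder returns a small dict telling you which optional inputs OWLET will be able to find.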
OWLET lets you include extra files to help connect the gaze data with what was shown during the task. You can add the following files to an optional "task" folder:
- **Task Video** (.mov or .mp4, max 30 fps): If you include a video of the task, OWLET will overlay the infant’s gaze onto this video in the final annotated output.
- **Trial Info CSV** (`trials.csv`): This file should have a `Time` column (start time of each trial or condition) and a `Labels` column (name of each trial/condition). OWLET uses this to organize gaze data by trial or condition.
- **AOIs CSV** (`AOIs.csv`): Use this to define custom Areas of Interest (AOIs) on the task video. The file should have columns for `AOI`, `x1`, `y1`, `x2`, and `y2`, assuming a 960x540 resolution. If you don’t include this, OWLET will use default AOIs: Left, Right, and Away.
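As a concrete illustration of those two CSV layouts, the snippet below writes a minimal `trials.csv` and `AOIs.csv`. The column names come from the description above; the trial labels, the timestamp format (plain seconds here), and the AOI names are illustrative assumptions — check the user guide for the exact formats your task requires.

```python
import csv

# Example trials.csv: a Time column (trial start) and a Labels column.
# Trial names and the seconds-based timestamps are illustrative.
with open("trials.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Time", "Labels"])
    writer.writerow([5, "familiarization"])
    writer.writerow([25, "test_trial_1"])

# Example AOIs.csv: rectangles in 960x540 task-video coordinates,
# here splitting the screen into hypothetical left/right halves.
with open("AOIs.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["AOI", "x1", "y1", "x2", "y2"])
    writer.writerow(["left_half", 0, 0, 480, 540])
    writer.writerow(["right_half", 480, 0, 960, 540])
```

Drop both files into the experiment folder alongside the task video and pass that folder via `--experiment_info`.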
If you include a task video, OWLET will try to automatically match the audio in the task video to the audio in the subject video. This trims the start of the subject video to sync with the task, so you don’t have to edit it manually.
However, if OWLET can’t find an audio match (e.g., the task start is missing, there’s no sound, or background noise is too loud), it will skip processing that video.
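The idea behind audio matching can be sketched as finding the lag at which the task audio best lines up with the subject recording. The toy function below does this by maximizing a dot product over candidate lags — a minimal illustration of the concept, not OWLET's actual implementation, which operates on real audio tracks.

```python
def best_offset(task_audio, subject_audio):
    """Return the lag (in samples) where task_audio best matches
    a slice of subject_audio, scored by the dot product.

    Toy cross-correlation sketch; assumes both inputs are plain
    lists of samples and the task audio fits inside the subject audio.
    """
    n = len(task_audio)
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(subject_audio) - n + 1):
        score = sum(t * s for t, s in zip(task_audio,
                                          subject_audio[lag:lag + n]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

The returned lag corresponds to how much of the subject video is trimmed so that it starts in sync with the task.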
To skip audio matching, use the `--override_audio_matching` flag:

```
python OWLET.py --subject_video /path/to/subject/videos --experiment_info /path/to/experiment/folder --override_audio_matching
```

Below is an example of a Zoom video processed using OWLET:
OWLET works best with high-quality videos, and some tips are shown below. In addition, you can alter videos in editing software (e.g., iMovie) to change the contrast/brightness or crop in on the subject’s face, which can improve performance for poor-quality videos.
Distributed under the GNU General Public License v3.0. See LICENSE for more information.
Denise Werchan - denisewerchan.com – @DeniseWerchan – denise.werchan@nyulangone.org
Project Link: https://github.com/denisemw/OWLET