- Published at WACV 2023!
- Authors: Swati Jindal, Xin Eric Wang
- Link to paper
We used Python 3.7.10 and torch 1.8.1 for our experiments. The codebase was tested on Ubuntu 20.04.
To install all the packages:
pip install -r requirements.txt
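After installing, a quick sanity check (a sketch, not part of the repo) can confirm the interpreter and torch versions and whether a GPU is visible:

```python
# Sanity-check sketch: verify Python/torch versions and GPU visibility.
import sys
import torch

print("Python:", sys.version.split()[0])      # expected: 3.7.10
print("torch:", torch.__version__)            # expected: 1.8.1
print("CUDA available:", torch.cuda.is_available())
```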
Download the three datasets: GazeCapture, MPIIFaceGaze, and Columbia.
To pre-process the datasets, please use this repository and follow the instructions provided to generate eye-strip images for FAZE. Put the resulting .h5 files in the data folder.
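Before training, it can help to confirm that the generated files are readable. The sketch below assumes only that h5py is installed; the file name is illustrative, and the exact dataset layout is defined by the pre-processing repository:

```python
# Inspection sketch: list the datasets inside a pre-processed .h5 file.
# The file name below is illustrative; use one of your generated files.
import h5py

def describe(name, obj):
    # Print shape and dtype for every dataset (e.g., eye-strip image arrays).
    if isinstance(obj, h5py.Dataset):
        print(f"{name}: shape={obj.shape}, dtype={obj.dtype}")

with h5py.File("data/MPIIFaceGaze.h5", "r") as f:
    f.visititems(describe)
```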
Create a config JSON file similar to configs/config_gc_to_mpii.json that describes all the training parameters and paths to the input files.
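To double-check a config before launching a run, a minimal sketch like the following prints its top-level parameters (the key names are whatever configs/config_gc_to_mpii.json defines; nothing here assumes specific keys):

```python
# Config inspection sketch: load a training config and list its parameters.
import json

with open("configs/config_gc_to_mpii.json") as f:
    config = json.load(f)

# Confirm that paths to the .h5 files and the training hyperparameters
# are set as intended before starting a run.
for key, value in sorted(config.items()):
    print(f"{key}: {value}")
```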
To train the task network, run this command:
python train_tasknet.py --config-json configs/config_tasknet.json
To train and evaluate the CUDA-GHR model from the paper, run one of the following commands:
GazeCapture → MPIIGaze:
python train_cudaghr.py --config_json configs/config_gc_to_mpii.json
GazeCapture → Columbia:
python train_cudaghr.py --config_json configs/config_gc_to_col.json --columbia
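For reference, a launcher like train_cudaghr.py typically parses the flags used above along the lines of this sketch (illustrative only; the repo's actual argument handling may differ):

```python
# Argument-parsing sketch matching the flags used above (illustrative).
import argparse
import json

parser = argparse.ArgumentParser()
parser.add_argument("--config_json", type=str, required=True,
                    help="Path to the JSON file of training parameters.")
parser.add_argument("--columbia", action="store_true",
                    help="Use Columbia dataset settings instead of MPII.")
args = parser.parse_args()

with open(args.config_json) as f:
    config = json.load(f)
print(f"Loaded {len(config)} parameters from {args.config_json}")
```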
The training images, losses, and evaluation metrics are logged to TensorBoard. Generated images are also saved in the save folder.
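To view them, point TensorBoard at the log directory from your config (the placeholder below follows the same convention as the evaluation command):

tensorboard --logdir <path to log directory>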
To evaluate the CUDA-GHR model, run this command:
python eval_cudaghr.py --model_path <path to model> --config_json <path to config file> --test_people <subset to test>
Add the `--columbia` flag to test on the Columbia dataset.
You can download pretrained models here:
The code is adapted from FAZE and STED-Gaze. We thank the authors for their awesome work!