This is the official implementation of the FG 2023 paper "CoNFies: Controllable Neural Face Avatars".
The codebase is based on CoNeRF, implemented in JAX and building on JaxNeRF.
The code uses the same environment as CoNeRF. We tested it using Python 3.8.
Set up an environment using Miniconda:
conda create --name XXX python=3.8
Install the required packages:
pip install -r requirements.txt
For more details, please refer to CoNeRF.
The dataset uses the same format as Nerfies for the image extraction and camera estimation.
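For reference, a Nerfies-style dataset directory typically looks like the following (layout based on the public Nerfies release; verify against your own data):

```
dataset/
├── camera/         # per-frame camera parameter JSON files
├── rgb/
│   ├── 1x/         # full-resolution images
│   └── 2x/         # downsampled copies (4x, 8x, ... as needed)
├── dataset.json    # train/validation frame split
├── metadata.json   # per-frame appearance/warp IDs
└── scene.json      # scene scale and near/far planes
```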
The annotation format is the same as CoNeRF's. Annotation files include annotations.yml, [frame_id].json, and mapping.yml; please refer to CoNeRF for more details. We use OpenFace to generate the facial keypoints and Facial Action Units (AUs).
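OpenFace writes its AU estimates to a CSV file with one row per frame. As a minimal sketch of extracting the AU intensity columns, assuming the OpenFace 2.x column conventions (the sample data and helper function below are illustrative, not part of this codebase):

```python
import csv
import io

# Hypothetical excerpt of an OpenFace output CSV. Real OpenFace 2.x files
# contain many more columns (gaze, head pose, landmarks); AU*_r columns hold
# action-unit intensities and AU*_c columns hold binary activations.
sample_csv = """frame,timestamp,confidence,AU01_r,AU04_r,AU12_r,AU01_c,AU04_c,AU12_c
1,0.000,0.98,0.12,1.50,2.30,0,1,1
2,0.033,0.97,0.00,1.20,2.80,0,1,1
"""

def read_au_intensities(text):
    """Return, per frame, a dict of AU intensity values (columns ending in _r)."""
    reader = csv.DictReader(io.StringIO(text))
    frames = []
    for row in reader:
        # OpenFace pads column names with spaces, so strip the headers.
        row = {k.strip(): v for k, v in row.items()}
        frames.append({k: float(v) for k, v in row.items() if k.endswith("_r")})
    return frames

frames = read_au_intensities(sample_csv)
print(frames[0])  # {'AU01_r': 0.12, 'AU04_r': 1.5, 'AU12_r': 2.3}
```

When reading a real OpenFace file, replace the inline string with `open(path)` and keep the header stripping, since the padded column names otherwise break lookups.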
After preparing a dataset, you can train with a command similar to CoNeRF's:
export DATASET_PATH=/path/to/dataset
export EXPERIMENT_PATH=/path/to/save/experiment/to
python train.py --base_folder $EXPERIMENT_PATH --gin_bindings="data_dir='$DATASET_PATH'" --gin_configs configs/baselines/ours.gin
After training the model, you can render with:
python render_changing_attributes.py --base_folder $EXPERIMENT_PATH --gin_bindings="data_dir='$DATASET_PATH'" --gin_configs /path/to/experiment/config.gin
Please modify the 'attribute and mask' part in configs/baselines/ours.gin and the mask_select variable at line #366 of conerf/training.py according to the number of attributes and masks in your dataset.
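The exact form of mask_select depends on the repository code; as a rough, hypothetical sketch of the idea (all names, shapes, and constants below are assumptions for illustration, not the actual conerf/training.py code):

```python
import numpy as np

# Hypothetical setup: with N_ATTRIBUTES annotated attributes and N_MASKS mask
# channels per frame, a boolean table can record which mask channels belong to
# which attribute. Both constants must match your dataset's annotations.
N_ATTRIBUTES = 2   # e.g. eyes, mouth
N_MASKS = 2

# One row per attribute, one column per mask channel (here a 1:1 pairing).
mask_select = np.eye(N_ATTRIBUTES, N_MASKS, dtype=bool)

def masks_for_attribute(masks, attribute_id):
    """Keep only the mask channels associated with one attribute.

    masks: (H, W, N_MASKS) array of per-pixel mask values.
    """
    return masks[..., mask_select[attribute_id]]

masks = np.random.rand(4, 4, N_MASKS)
selected = masks_for_attribute(masks, attribute_id=0)
print(selected.shape)  # (4, 4, 1)
```

Whatever form the repository uses, the point is the same: the attribute count in the gin config and the mask selection in training.py must agree with how many attributes and masks your annotations actually contain.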
If you find our work useful, please consider citing:
@article{yu2022confies,
title={CoNFies: Controllable Neural Face Avatars},
author={Yu, Heng and Niinuma, Koichiro and Jeni, Laszlo A},
journal={arXiv preprint arXiv:2211.08610},
year={2022}
}