
EyelashNet: A Dataset and A Baseline Method for Eyelash Matting

Official repository for EyelashNet: A Dataset and A Baseline Method for Eyelash Matting, published at SIGGRAPH Asia 2021.

Eyelash matting results on daily-captured images from Unsplash.

Requirements

Packages:

  • pytorch>=1.2.0
  • torchvision>=0.4.0
  • tensorboardX
  • numpy
  • opencv-python
  • toml
  • easydict
  • pprint
  • rmi-pytorch
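
As a rough guide, the packages can be installed with pip, assuming they are all available on PyPI under the names below (pytorch is installed as torch, and pprint already ships with the Python standard library, so it needs no separate install):

pip install "torch>=1.2.0" "torchvision>=0.4.0" tensorboardX numpy opencv-python toml easydict rmi-pytorch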

Models

Model Name          | Training Data      | File Size | MSE | SAD | Grad | Conn
--------------------|--------------------|-----------|-----|-----|------|-----
ResNet34_En_nomixup | ILSVRC 2012        | 166 MB    | N/A | N/A | N/A  | N/A
RenderEyelashNet    | Rendered eyelashes | 288 MB    | -   | -   | -    | -
EyelashNet          | Captured eyelashes | 288 MB    | -   | -   | -    | -
  • ResNet34_En_nomixup: the customized ResNet-34 backbone pretrained on ImageNet. Save it to ./pretrain/. Please refer to GCA-Matting for more details.
  • RenderEyelashNet: model trained on the rendered eyelash data. Save it to ./checkpoints/RenderEyelashNet/.
  • EyelashNet: model trained on the captured eyelash data. Save it to ./checkpoints/EyelashNet/.
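
To quickly verify that a downloaded checkpoint is readable from the expected location, a minimal sketch (the internal key layout of the .pth file is not assumed here; the training and testing scripts take care of loading it into the network):

import torch

# Load the checkpoint on CPU just to confirm the file is intact and in the expected place.
ckpt = torch.load("./checkpoints/EyelashNet/best_model.pth", map_location="cpu")
print(type(ckpt))
if isinstance(ckpt, dict):
    print(list(ckpt.keys()))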

Train and Evaluate on EyelashNet

Data Preparation

Download BaselineTestDataset, EyelashNet, and pupil_bg into the ./data/ folder.

Download the COCO dataset into the ./data/coco_bg/ folder.
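
After downloading, the ./data/ folder should look roughly like this (the image/ and mask/ subfolders of BaselineTestDataset are the ones referenced by the evaluation configuration below; the internal layout of the other downloads follows whatever their archives unpack to):

data/
├── BaselineTestDataset/
│   ├── image/
│   └── mask/
├── EyelashNet/
├── pupil_bg/
└── coco_bg/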

Configuration

TOML files in ./config/ are used for configuration. You can find the definitions of all options in ./utils/config.py.

Set ROOT_PATH = "<path to code folder>/" in ./root_path.py.
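
For example, if the repository was cloned to /home/user/EyelashNet (an illustrative path, not a requirement):

# ./root_path.py
ROOT_PATH = "/home/user/EyelashNet/"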

Run a demo

python eyelash_test.py --config=./config/EyelashNet.toml --checkpoint=checkpoints/EyelashNet/best_model.pth --image-dir=<path to eyelash image folder> --output=<path to output folder>
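
For instance, to matte the eyelash images in ./demo/input/ and write the results to ./demo/output/ (both folder names are placeholders for your own paths):

python eyelash_test.py --config=./config/EyelashNet.toml --checkpoint=checkpoints/EyelashNet/best_model.pth --image-dir=./demo/input --output=./demo/output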

Training

We train the network on a desktop PC with a single NVIDIA RTX 2080 GPU (8 GB memory), an Intel Xeon 3.6 GHz CPU, and 16 GB RAM. First, set your training and validation data paths in the configuration file (e.g., ./config/EyelashNet.toml):

[data]
train_fg = ""
train_alpha = ""
train_bg = ""
test_merged = ""
test_alpha = ""
test_trimap = ""

You can then train the model by running:

./train.sh

Evaluation

To evaluate the trained model on BaselineTestDataset, set the BaselineTestDataset test paths and the checkpoint name in the configuration file (*.toml):

[test]
merged = "./data/BaselineTestDataset/image"
alpha = "./data/BaselineTestDataset/mask"
# this will load ./checkpoints/*/best_model.pth
checkpoint = "best_model" 

and run the command:

./test.sh

Agreement

The code, pretrained models and dataset are available for non-commercial research purposes only.

Acknowledgments

This code borrows heavily from GCA-Matting.

@inproceedings{li2020natural,
  title={Natural image matting via guided contextual attention},
  author={Li, Yaoyi and Lu, Hongtao},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={34},
  pages={11450--11457},
  year={2020}
}

Citation

If you find this work, code or dataset useful for your research, please cite:

@article{10.1145/3478513.3480540,
author = {Xiao, Qinjie and Zhang, Hanyuan and Zhang, Zhaorui and Wu, Yiqian and Wang, Luyuan and Jin, Xiaogang and Jiang, Xinwei and Yang, Yong-Liang and Shao, Tianjia and Zhou, Kun},
title = {EyelashNet: A Dataset and a Baseline Method for Eyelash Matting},
year = {2021},
issue_date = {December 2021},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
volume = {40},
number = {6},
issn = {0730-0301},
url = {https://doi.org/10.1145/3478513.3480540},
doi = {10.1145/3478513.3480540},
journal = {ACM Trans. Graph.},
month = {dec},
articleno = {217},
numpages = {17},
keywords = {dataset, deep learning, eyelash matting}
}

Contact

qinjie_xiao@zju.edu.cn

jin@cad.zju.edu.cn
