
CariMe-pytorch

The official PyTorch implementation of our TMM paper "CariMe: Unpaired Caricature Generation with Multiple Exaggerations".


CariMe: Unpaired Caricature Generation with Multiple Exaggerations

Zheng Gu, Chuanqi Dong, Jing Huo, Wenbin Li, and Yang Gao

Paper: https://ieeexplore.ieee.org/abstract/document/9454341/

Prerequisites

  • Python 3.6
  • PyTorch 1.5.1
  • scikit-image 0.17.2
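
These can be installed with pip, for example (a sketch; choose a PyTorch build that matches your CUDA setup):

pip install torch==1.5.1 scikit-image==0.17.2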

Preparing Dataset

  • Download the WebCaricature dataset, unzip it into the data folder, and align the face images by running the following script:
python alignment.py
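
For example (a sketch; the archive name below is an assumption and should match the actual WebCaricature download):

unzip WebCaricature.zip -d data/
python alignment.py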

Training

Train the Warper:

python train_warper.py

Train the Styler:

python train_styler.py
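
If you need to pin training to a specific GPU, the standard CUDA_VISIBLE_DEVICES environment variable can be used (an illustration, not a flag of these scripts):

CUDA_VISIBLE_DEVICES=0 python train_warper.py
CUDA_VISIBLE_DEVICES=0 python train_styler.py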

Testing

  • Test the Warper only:
python test_warper.py --scale 1.0
  • Test the Styler only:
python test_styler.py 
  • Generate caricatures with both exaggeration and style transfer:
python main_generate.py --model_path_warper pretrained/warper.pt --model_path_styler pretrained/styler.pt
  • Generate caricatures with both exaggeration and style transfer for a single image:
python main_generate_single_image.py --model_path_warper pretrained/warper.pt --model_path_styler pretrained/styler.pt --input_path "images/Meg Ryan/P00015.jpg" --generate_num 5 --scale 1.0

The above command translates the input photo into 5 caricatures with different exaggerations and styles.

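Both test_warper.py and main_generate_single_image.py expose a --scale argument; assuming it controls the degree of exaggeration as described in the paper, values other than 1.0 can also be tried, e.g.:

python test_warper.py --scale 0.5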

Pretrained Models

The pre-trained models are shared here.
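
To peek inside a downloaded checkpoint before running the scripts, a minimal sketch (assuming the .pt files are ordinary torch-serialized objects; their exact layout is defined by the training scripts):

import torch

# Load on the CPU so no GPU is needed just to inspect the file.
ckpt = torch.load('pretrained/warper.pt', map_location='cpu')

# The checkpoint is expected to be a dict (e.g. of sub-module state dicts);
# printing its top-level keys shows what the generation scripts will look for.
if isinstance(ckpt, dict):
    print(list(ckpt.keys()))
else:
    print(type(ckpt))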

Citation

If you use this code for your research, please cite our paper:

@article{gu2021carime,
  title={CariMe: Unpaired Caricature Generation with Multiple Exaggerations},
  author={Gu, Zheng and Dong, Chuanqi and Huo, Jing and Li, Wenbin and Gao, Yang},
  journal={IEEE Transactions on Multimedia},
  year={2021},
  publisher={IEEE}
}

Reference

Some of our code is based on FUNIT and UGATIT.
