GFRNet

Torch implementation of Learning Warped Guidance for Blind Face Restoration (ECCV 2018).

GFRNet framework

Overview of our GFRNet. The WarpNet takes the degraded observation and the guidance image as input and predicts a dense flow field, which is used to deform the guidance image into the warped guidance. The warped guidance is expected to be spatially well aligned with the ground truth. The RecNet then takes the warped guidance and the degraded observation as input to produce the restoration result.
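The warping step above deforms the guidance image with a per-pixel flow field. As a rough illustration (not the repository's Torch code), here is a minimal NumPy sketch of bilinear warping with a dense flow field; the function name and the (dy, dx) offset convention are assumptions for this example:

```python
import numpy as np

def warp_with_flow(guided, flow):
    """Deform a guided image with a dense flow field via bilinear sampling.

    guided: (H, W) grayscale image.
    flow:   (H, W, 2) per-pixel offsets (dy, dx); the output at (y, x)
            samples the guided image at (y + dy, x + dx).
    """
    h, w = guided.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Sampling coordinates, clamped to the image border.
    sy = np.clip(ys + flow[..., 0], 0, h - 1)
    sx = np.clip(xs + flow[..., 1], 0, w - 1)
    # Integer corners and fractional weights for bilinear interpolation.
    y0 = np.floor(sy).astype(int); x0 = np.floor(sx).astype(int)
    y1 = np.clip(y0 + 1, 0, h - 1); x1 = np.clip(x0 + 1, 0, w - 1)
    wy = sy - y0; wx = sx - x0
    top = guided[y0, x0] * (1 - wx) + guided[y0, x1] * wx
    bot = guided[y1, x0] * (1 - wx) + guided[y1, x1] * wx
    return top * (1 - wy) + bot * wy
```

A zero flow field leaves the guidance image unchanged; in GFRNet the WarpNet learns the flow so that the warped guidance aligns with the degraded observation's pose and expression.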

Training

A PyTorch implementation of WarpNet (the first subnet of GFRNet) can be found here.

Testing

th test.lua

Models

Download the pre-trained model from the following URL and put it into ./checkpoints/FaceRestoration/.

Results

Restoration on real low quality images

The first row shows real low-quality images (the close-up at the bottom right of each is the guidance image). The second row shows the GFRNet results.

Warped guidance

IMDB results

The content marked with green boxes shows the restoration results of our GFRNet. All of these images are collected from the Internet Movie Database (IMDb).

Input | Guided Image | Bicubic | GFRNet Results

Requirements and Dependencies

Acknowledgments

Code borrows heavily from pix2pix. Thanks for their excellent work!

Citation

@InProceedings{Li_2018_ECCV,
author = {Li, Xiaoming and Liu, Ming and Ye, Yuting and Zuo, Wangmeng and Lin, Liang and Yang, Ruigang},
title = {Learning Warped Guidance for Blind Face Restoration},
booktitle = {The European Conference on Computer Vision (ECCV)},
month = {September},
year = {2018}
}