Context Encoders: Feature Learning by Inpainting

Accepted at CVPR 2016
Project Website

If you find Context-Encoder useful in your research, please cite:

@inproceedings{pathakCVPR16context,
    Author = {Pathak, Deepak and Kr\"ahenb\"uhl, Philipp and Donahue, Jeff and Darrell, Trevor and Efros, Alexei},
    Title = {Context Encoders: Feature Learning by Inpainting},
    Booktitle = {Computer Vision and Pattern Recognition ({CVPR})},
    Year = {2016}
}

Contents

  1. Semantic Inpainting Demo
  2. Features Caffemodel

1) Semantic Inpainting Demo

Inpainting using a context encoder trained jointly with reconstruction and adversarial loss. Currently, I have released only the center-region inpainting demo; the arbitrary-region semantic inpainting models will be released soon.
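The generator is trained on a weighted sum of an L2 reconstruction loss on the missing center region and an adversarial loss from a discriminator. The snippet below is a hedged sketch of how the two terms combine, not the released training code; `netG`, `netD`, `lambda_rec` and `lambda_adv` are assumed names, and the 0.999/0.001 weighting follows the paper.

-- Hedged sketch of the joint objective; not the released training code.
require 'nn'

local recCriterion = nn.MSECriterion()       -- pixel-wise L2 reconstruction loss
local advCriterion = nn.BCECriterion()       -- real/fake adversarial loss
local lambda_rec, lambda_adv = 0.999, 0.001  -- weighting as reported in the paper

-- ctx: image with its center region removed; target: ground-truth center patch
-- netG: context encoder (generator); netD: discriminator (assumed names)
local function generatorLoss(netG, netD, ctx, target)
  local pred = netG:forward(ctx)                      -- predicted center patch
  local recLoss = recCriterion:forward(pred, target)  -- L2 term on the hole
  -- adversarial term: the generator is rewarded when netD scores its output as real
  local realLabel = torch.Tensor(pred:size(1)):fill(1)
  local advLoss = advCriterion:forward(netD:forward(pred), realLabel)
  return lambda_rec * recLoss + lambda_adv * advLoss
end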

  1. Install Torch: http://torch.ch/docs/getting-started.html#_

  2. Clone the repository

git clone https://github.com/pathak22/context-encoder.git

  3. Demo

cd context-encoder
bash ./models/scripts/download_inpaintCenter_models.sh
# This will populate the `./models/` folder with trained models.

net=models/inpaintCenter/paris_inpaintCenter.t7 name=paris_result imDir=images/paris overlapPred=4 manualSeed=222 batchSize=21 gpu=1 th demo.lua
net=models/inpaintCenter/imagenet_inpaintCenter.t7 name=imagenet_result imDir=images/imagenet overlapPred=0 manualSeed=222 batchSize=21 gpu=1 th demo.lua
net=models/inpaintCenter/paris_inpaintCenter.t7 name=ucberkeley_result imDir=images/ucberkeley overlapPred=4 manualSeed=222 batchSize=4 gpu=1 th demo.lua
# Note: if you are running on CPU, use gpu=0
# Note: the samples in ./images/* are held-out images
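The key=value pairs placed before th demo.lua are read by the script as environment variables (net, name, imDir, overlapPred, manualSeed, batchSize, gpu). Below is a minimal sketch of that option-parsing pattern, with illustrative defaults rather than the exact ones in demo.lua.

-- Sketch of the environment-variable option pattern used by the commands above;
-- defaults are illustrative, see demo.lua for the real ones.
local opt = {
  net = 'models/inpaintCenter/paris_inpaintCenter.t7',
  name = 'demo_result',
  imDir = 'images/paris',
  overlapPred = 4,    -- pixel overlap between context and predicted region
  manualSeed = 222,
  batchSize = 21,
  gpu = 1,            -- set gpu=0 on the command line to run on CPU
}
-- Each environment variable of the same name overrides its default.
for k, v in pairs(opt) do
  opt[k] = tonumber(os.getenv(k)) or os.getenv(k) or v
end
print(opt)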

Sample results on held-out images:

[teaser image: inpainting results on held-out images]

2) Features Caffemodel

Features for a context encoder trained with reconstruction loss.
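These weights are released in Caffe format. One way to use them from Torch for feature extraction is the loadcaffe package, sketched below; the prototxt/caffemodel file names, the example image path, and the 227x227 input size are assumptions, not part of the release.

-- Hedged sketch: load the released Caffe weights into Torch and extract features.
-- File names, the image path, and the 227x227 input size are assumptions.
require 'loadcaffe'
require 'image'

local net = loadcaffe.load('context_encoder.prototxt',
                           'context_encoder.caffemodel', 'nn')
net:evaluate()

local img = image.load('images/paris/example.png', 3, 'double')  -- placeholder image
img = image.scale(img, 227, 227)
local features = net:forward(img:view(1, 3, 227, 227))
print(features:size())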
