Workshop and Challenge on Perceptual Image Restoration and Manipulation

The PIRM Challenge on Perceptual Super-Resolution

The PIRM-SR Challenge will compare and rank methods for perceptual single-image super-resolution. State-of-the-art methods in terms of perceptual quality (e.g. SRGAN) are rated poorly by "simple" distortion measures such as PSNR and SSIM. Therefore, in contrast to previous challenges, the evaluation and ranking will be done in a perceptual-quality-aware manner based on [Blau and Michaeli, CVPR'18]. This unified approach quantifies the accuracy and perceptual quality of algorithms jointly, and will enable perceptually driven methods to compete alongside algorithms that target PSNR maximization.
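For intuition, the challenge's perceptual-quality-aware ranking combines a no-reference perceptual index with RMSE. The perceptual index is commonly defined as PI = ½((10 − Ma) + NIQE), where Ma is the score of Ma et al. (higher is better) and NIQE is the no-reference quality measure (lower is better), so a lower PI indicates better perceptual quality. A minimal sketch of this combination (in Python for illustration; the repository's actual evaluation is the Matlab script below, and the score values here are placeholders):

```python
def perceptual_index(ma_score, niqe_score):
    """Perceptual index as used in PIRM-SR: lower is better.

    ma_score:   Ma et al. no-reference score (roughly 0..10, higher = better)
    niqe_score: NIQE score (lower = better)
    """
    return 0.5 * ((10.0 - ma_score) + niqe_score)

# Example with made-up scores: a method with Ma = 8.0 and NIQE = 4.0
# gets PI = 0.5 * ((10 - 8) + 4) = 3.0.
print(perceptual_index(8.0, 4.0))
```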

For further details see the challenge website.

Self-validation Code

This Matlab code computes the RMSE and perceptual scores for your method's outputs on the self-validation set.
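The distortion side of the evaluation is a plain RMSE between each output and its ground-truth HR image. A rough numpy sketch of that computation (for illustration only; the repository's Matlab script is authoritative and may, e.g., crop borders or convert color spaces before measuring):

```python
import numpy as np

def rmse(output, ground_truth):
    """Root-mean-square error between two images of identical shape."""
    diff = output.astype(np.float64) - ground_truth.astype(np.float64)
    return np.sqrt(np.mean(diff ** 2))

# Example: two constant 2x2 "images" differing by 3 everywhere -> RMSE = 3.0
print(rmse(np.zeros((2, 2)), np.full((2, 2), 3.0)))
```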

Quick Start

  1. Copy your outputs into the your_results folder in the base directory.
  2. Copy the validation/test set (HR images only) into the self_validation_HR folder.
  3. Download the Ma et al. code, and extract it into the utils/sr-metric-master folder.
  4. Run the evaluate_results.m script.
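After the steps above, the working directory should look roughly like this (a sketch; the image filenames are placeholders):

```
.
├── evaluate_results.m
├── your_results/          % your method's outputs (step 1)
│   ├── 001.png
│   └── ...
├── self_validation_HR/    % ground-truth HR images (step 2)
│   ├── 001.png
│   └── ...
└── utils/
    └── sr-metric-master/  % Ma et al. code (step 3)
```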


Depending on your operating system, you may need to recompile the MEX files in the matlabPyrTools toolbox. If so:

  1. Run utils/sr-metric-master/external/matlabPyrTools/MEX/compilePyrTools.m
  2. Copy the generated MEX files into the parent directory utils/sr-metric-master/external/matlabPyrTools

Note: on Linux or macOS you should also change line 82 of mex_regressionRF_predict.cpp to: plhs[0] = mxCreateNumericMatrix(n_size, 1, mxDOUBLE_CLASS, mxREAL); Pre-compiled MEX files (for macOS, Linux and Windows) are also available at this link (thanks to Muhammad Haris for the solution).


This code is distributed for academic research purposes only.
For other purposes, please contact Roey Mechrez: roey (at)