Implementation of Learning Proximal Operators: Using Denoising Networks for Regularizing Inverse Imaging Problems

Learning Proximal Operators

This repository provides the implementation of our paper Learning Proximal Operators: Using Denoising Networks for Regularizing Inverse Imaging Problems (Tim Meinhardt, Michael Möller, Caner Hazirbas, Daniel Cremers, ICCV 2017). All results presented in our work were produced with this code.

Additionally, we provide a TensorFlow implementation of the denoising convolutional neural network (DNCNN) introduced in Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising.
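The core idea of the paper, replacing the proximal operator of the regularizer in a proximal splitting algorithm with a denoiser, can be illustrated with a small numpy toy. Everything here is a sketch, not the repository's actual code: the moving-average "denoiser" stands in for the learned DNCNN, and the diagonal masking operator mimics a demosaicking-style linear forward model.

```python
import numpy as np

def denoise(x):
    # Stand-in for a learned denoiser: a simple 3-tap moving average.
    return np.convolve(x, np.ones(3) / 3, mode="same")

def pnp_proximal_gradient(A, b, x0, tau=0.5, iters=50):
    """Plug-and-play proximal gradient: the proximal step of the
    regularizer is replaced by a denoising step (here `denoise`)."""
    x = x0.copy()
    for _ in range(iters):
        grad = A.T @ (A @ x - b)      # gradient of the data term ||Ax - b||^2 / 2
        x = denoise(x - tau * grad)   # the denoiser acts as the prox
    return x

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 2 * np.pi, 64))
mask = rng.random(64) < 0.5           # keep roughly half of the samples
A = np.diag(mask.astype(float))       # demosaicking-like masking operator
b = A @ truth                         # observed, incomplete measurements
x = pnp_proximal_gradient(A, b, x0=b)
print("relative error:", np.linalg.norm(x - truth) / np.linalg.norm(truth))
```

In the paper the denoising step is the trained network inside a full proximal splitting method; this sketch only shows where the denoiser enters the iteration.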


Installation

  1. Install git-lfs to pull the model and data files together with the repository
  2. git clone
  3. Install the following packages for Python 3.6:
    1. pip3 install -r requirements.txt
    2. ProxImaL: pip3 install git+
    3. PyBM3D:
      1. with CUDA: pip3 install git+
      2. without CUDA: pip3 install git+
    4. TensorFlow:
      1. with CUDA: pip3 install tensorflow-gpu==1.3.0
      2. without CUDA: pip3 install tensorflow==1.3.0
    5. OpenCV:
      1. pip3 install opencv-python==
      2. or, for faster NLM denoising, compile OpenCV 3.3.0 manually with CUDA support and Python 3.6 bindings
  4. Download the demosaicking (McMaster and Kodak) and the grayscale deblurring datasets with data/
  5. (Optional, for faster computation and training DNCNN models) Install CUDA and set the CUDA_HOME environment variable.
  6. (Optional, for optimal results and faster computation) Install Halide and set the HALIDE_PATH environment variable.
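Steps 5 and 6 can be sanity-checked with a few lines of Python. This is a minimal sketch; only the two environment variable names are taken from the list above.

```python
import os

def check_env(environ):
    """Return which optional components are configured: True if the
    variable is set and points to an existing directory."""
    status = {}
    for var in ("CUDA_HOME", "HALIDE_PATH"):  # steps 5 and 6 above
        path = environ.get(var)
        status[var] = bool(path) and os.path.isdir(path)
    return status

if __name__ == "__main__":
    for var, ok in check_env(os.environ).items():
        print(f"{var}: {'found' if ok else 'not set (optional)'}")
```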

Run an Experiment

The evaluation of our method includes two exemplary linear inverse problems: Bayer color demosaicking and grayscale deblurring. To configure, organize, log, and reproduce our computational experiments, we structured the problems with the Sacred framework.

For a detailed explanation of a typical Sacred interface, please refer to its documentation. We implemented two Sacred ingredients (elemental_ingredient, grid_ingredient), which are both injected into our experiments. Among other things, each experiment consists of multiple Sacred commands that are executable from the command line.
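The "with key=value" updates used in the commands below modify the configuration scope before a command runs; the mechanism can be sketched in plain Python. This is an illustration, not Sacred's actual implementation; dotted keys address nested ingredient configurations such as elemental.denoising_prior.

```python
def apply_overrides(config, overrides):
    """Apply Sacred-style 'key=value' updates to a config dict,
    supporting dotted paths such as 'elemental.denoising_prior=BM3D'."""
    for item in overrides:
        key, _, raw = item.partition("=")
        node = config
        *parents, leaf = key.split(".")
        for part in parents:
            node = node.setdefault(part, {})  # descend into nested scope
        node[leaf] = raw

config = {"experiment_name": "default",
          "elemental": {"denoising_prior": "DNCNN"}}
apply_overrides(config, ["experiment_name=experiment_a",
                         "elemental.denoising_prior=BM3D"])
print(config)
```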

If everything is set up correctly, the print_config command, for example, prints the current configuration scope when executed via:

python src/ print_config

A typical run with a preset configuration scope for the optimal DNCNN parameters is started via the automain command:

python src/ with experiment_name=experiment_a image_name=barbara elemental.optimal_DNCNN_experiment_a

Hyperparameter Grid Search

We conducted multiple exhaustive grid searches to establish the optimal hyperparameters for both experiments. The set of searchable grid_params has to be defined in the respective experiment file. A search for the optimal demosaicking parameters for all images with the BM3D denoising prior is started by executing:

python src/ grid_search_all_images with elemental.denoising_prior=BM3D

The grid_search.param_dicts_file_path configuration parameter can be used to continue a previous search.
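The continuation behaviour can be sketched as follows: enumerate the Cartesian product of grid_params and skip combinations already recorded by a previous search. The JSON state file and all names here are illustrative assumptions, not the repository's actual format.

```python
import itertools
import json
import os
import tempfile

def grid_search(grid_params, evaluate, state_path):
    """Exhaustively evaluate the Cartesian product of grid_params,
    skipping combinations already stored in state_path so that an
    interrupted search can be continued."""
    done = []
    if os.path.exists(state_path):
        with open(state_path) as f:
            done = json.load(f)
    keys = sorted(grid_params)
    for values in itertools.product(*(grid_params[k] for k in keys)):
        params = dict(zip(keys, values))
        if params in done:
            continue                      # evaluated in a previous run
        evaluate(params)
        done.append(params)
        with open(state_path, "w") as f:  # checkpoint after every evaluation
            json.dump(done, f)
    return done

state_path = os.path.join(tempfile.mkdtemp(), "param_dicts.json")
results = grid_search({"sigma": [0.01, 0.02], "lambda": [0.1, 1.0]},
                      evaluate=lambda params: None,
                      state_path=state_path)
print(len(results))  # prints 4
```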

Training a DNCNN

The denoising convolutional neural network that we applied as a learned denoising prior was trained with TensorFlow. Command line arguments provide full control over the training procedure. The single-channel model provided with this repository was trained by executing:

python src/ --sigma_noise 0.02 --batch_size 128 --network DNCNN --channels 1 --pipeline bsds500 --device_name /gpu:0 --train_epochs 100
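The flags above map onto a standard argparse interface; a minimal sketch with illustrative defaults (not the repository's actual parser):

```python
import argparse

def build_parser():
    # Mirrors the command line flags shown above; defaults are illustrative.
    p = argparse.ArgumentParser(description="Train a denoising network.")
    p.add_argument("--sigma_noise", type=float, default=0.02,
                   help="standard deviation of the Gaussian training noise")
    p.add_argument("--batch_size", type=int, default=128)
    p.add_argument("--network", default="DNCNN")
    p.add_argument("--channels", type=int, default=1,
                   help="1 for grayscale models, 3 for color models")
    p.add_argument("--pipeline", default="bsds500",
                   help="training data pipeline")
    p.add_argument("--device_name", default="/gpu:0")
    p.add_argument("--train_epochs", type=int, default=100)
    return p

args = build_parser().parse_args(
    "--sigma_noise 0.02 --batch_size 128 --network DNCNN "
    "--channels 1 --pipeline bsds500 --device_name /gpu:0 "
    "--train_epochs 100".split())
print(args.network, args.sigma_noise, args.train_epochs)
```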


If you use this software in your research, please cite our publication:

@article{meinhardt2017learning,
    author    = {Tim Meinhardt and
                 Michael Moeller and
                 Caner Hazirbas and
                 Daniel Cremers},
    title     = {Learning Proximal Operators: Using Denoising Networks for Regularizing Inverse Imaging Problems},
    journal   = {CoRR},
    volume    = {abs/1704.03488},
    year      = {2017},
    url       = {},
    timestamp = {Wed, 07 Jun 2017 14:40:59 +0200},
    biburl    = {},
    bibsource = {dblp computer science bibliography,}
}