Learning Proximal Operators

This repository provides the implementation of our paper Learning Proximal Operators: Using Denoising Networks for Regularizing Inverse Imaging Problems (Tim Meinhardt, Michael Möller, Caner Hazirbas, Daniel Cremers, ICCV 2017). All results presented in our work were produced with this code.

Additionally, we provide a TensorFlow implementation of the denoising convolutional neural network (DNCNN) introduced in Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising.


Installation

  1. Install git-lfs for pulling the model and data files with the repository.
  2. git clone
  3. Install the following packages for Python 3.6:
    1. pip3 install -r requirements.txt
    2. ProxImaL: pip3 install git+
    3. PyBM3D:
      1. with CUDA: pip3 install git+
      2. without CUDA: pip3 install git+
    4. TensorFlow:
      1. with CUDA: pip3 install tensorflow-gpu==1.3.0
      2. without CUDA: pip3 install tensorflow==1.3.0
    5. OpenCV:
      1. pip3 install opencv-python==
      2. or, for faster NLM denoising, compile OpenCV 3.3.0 manually with CUDA support and Python 3.6 bindings
  4. Download the demosaicking (McMaster and Kodak) and the grayscale deblurring datasets with data/
  5. (Optional, for faster computation and for training DNCNN models) Install CUDA and set the CUDA_HOME environment variable.
  6. (Optional, for reproducing paper results and faster computation) Install Halide (Version: 2016/04/27) and set the HALIDE_PATH environment variable.
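After completing the optional steps, it can be useful to verify that both environment variables are visible to the shell. A minimal sketch, assuming hypothetical install locations (the paths below are examples, not required locations):

```shell
# Example locations only -- adjust to wherever CUDA and Halide were installed.
export CUDA_HOME=/usr/local/cuda
export HALIDE_PATH=$HOME/halide

# Report which of the optional variables are set.
for var in CUDA_HOME HALIDE_PATH; do
  eval "val=\${$var}"
  if [ -n "$val" ]; then
    echo "$var is set to $val"
  else
    echo "$var is not set (optional)"
  fi
done
```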

Run an Experiment

The evaluation of our method included two exemplary linear inverse problems, namely Bayer color demosaicking and grayscale deblurring. To configure, organize, log, and reproduce our computational experiments, we structured the problems with the Sacred framework.

For a detailed explanation of a typical Sacred interface, please read its documentation. We implemented two Sacred ingredients (elemental_ingredient, grid_ingredient), which are both injected into our experiments. Among other things, each experiment consists of multiple Sacred commands that are executable from the command line.

If everything is set up correctly, the print_config command, for example, prints the current configuration scope when executing:

python src/ print_config

A typical run with a preset configuration scope for optimal DNCNN parameters is started with the automain command:

python src/ with experiment_name=experiment_a image_name=barbara elemental.optimal_DNCNN_experiment_a
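Sacred's `with key=value` arguments update entries in the experiment's configuration scope, with a dotted prefix addressing a nested ingredient config. A minimal stdlib sketch of that override mechanism; the config keys below are illustrative, not the experiment's actual scope:

```python
def apply_overrides(config, overrides):
    """Apply Sacred-style 'key=value' updates to a nested config dict."""
    for item in overrides:
        dotted_key, value = item.split("=", 1)
        node = config
        *parents, leaf = dotted_key.split(".")
        for key in parents:
            node = node.setdefault(key, {})  # descend into the nested scope
        node[leaf] = value
    return config

# Illustrative scope, mirroring the command-line overrides above.
config = {"experiment_name": None, "image_name": None,
          "elemental": {"denoising_prior": "DNCNN"}}
apply_overrides(config, ["experiment_name=experiment_a",
                         "image_name=barbara",
                         "elemental.denoising_prior=BM3D"])
```

Named configs such as elemental.optimal_DNCNN_experiment_a work the same way, except that they expand to a predefined set of such updates.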

Hyperparameter Grid Search

We conducted multiple exhaustive grid searches to establish the optimal hyperparameters for both experiments. The set of searchable grid_params has to be set in the respective experiment file. A search for the optimal demosaicking parameters over all images with the BM3D denoising prior is started by executing:

python src/ grid_search_all_images with elemental.denoising_prior=BM3D

The grid_search.param_dicts_file_path configuration parameter can be used to continue a previous search.
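An exhaustive grid search of this kind enumerates the Cartesian product of all parameter values and keeps the best-scoring combination. A minimal sketch, assuming a hypothetical scoring function and illustrative grid_params (the real searchable parameters live in the respective experiment file):

```python
import itertools

# Illustrative hyperparameter grid; names and values are examples only.
grid_params = {"sigma": [0.01, 0.02, 0.05], "num_iterations": [10, 20]}

def run_experiment(params):
    # Hypothetical stand-in for evaluating one configuration (e.g. mean PSNR).
    return -abs(params["sigma"] - 0.02) + 0.001 * params["num_iterations"]

# Enumerate every combination of parameter values.
keys = sorted(grid_params)
candidates = [dict(zip(keys, values))
              for values in itertools.product(*(grid_params[k] for k in keys))]
best = max(candidates, key=run_experiment)
```

Persisting the list of already-evaluated parameter dicts is what allows an interrupted search to be resumed, which is the role of the param_dicts_file_path parameter above.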

Training a DNCNN

The training of the denoising convolutional neural network, which we applied as a learned denoising prior, was implemented in TensorFlow. Command line parameters provide full control over the training procedure. The single-channel model provided with this repository was trained by executing:

python src/ --sigma_noise 0.02 --batch_size 128 --network DNCNN --channels 1 --pipeline bsds500 --device_name /gpu:0 --train_epochs 100
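DNCNN follows the residual learning scheme of the paper cited above: the network is trained to predict the noise component of its input, and the denoised image is recovered by subtracting that prediction. A NumPy sketch of the target construction and the recovery step; the patch shape is illustrative, and the network output is replaced by a perfect-prediction stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_noise = 0.02  # matches the --sigma_noise flag above

# A batch of single-channel image patches in [0, 1].
clean = rng.random((1, 40, 40, 1))
noise = sigma_noise * rng.standard_normal(clean.shape)
noisy = clean + noise

# Residual learning: the regression target is the noise itself, not the clean image.
residual_target = noisy - clean

# At inference, the clean image is recovered by subtracting the predicted residual;
# here a perfect prediction stands in for the network output.
predicted_residual = residual_target
denoised = noisy - predicted_residual
```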


Citation

If you use this software in your research, please cite our publication:

    @inproceedings{Meinhardt2017ICCV,
    author={T. {Meinhardt} and M. {Moeller} and C. {Hazirbas} and D. {Cremers}},
    booktitle={2017 IEEE International Conference on Computer Vision (ICCV)},
    title={Learning Proximal Operators: Using Denoising Networks for Regularizing Inverse Imaging Problems},
    year={2017},
    keywords={deconvolution;image denoising;image restoration;image segmentation;inverse problems;learning (artificial intelligence);minimisation;inverse imaging problems;linear inverse problems;deep neural networks;convolutional;convex energy minimization algorithms;fixed denoising neural network;image deconvolution;image demosaicking;deep learning;data fidelity;blur kernels;Noise reduction;Mathematical model;Neural networks;Hafnium;Gold;Training;Algorithm design and analysis},
    }
