
COAST: COntrollable Arbitrary-Sampling NeTwork for Compressive Sensing [PyTorch]

This repository is for COAST, introduced in the following paper:

Di You, Jian Zhang, Jingfen Xie, Bin Chen, and Siwei Ma. COAST: Controllable arbitrary-sampling network for compressive sensing. IEEE Transactions on Image Processing, 2021. [pdf]

The code is built on PyTorch and tested on Ubuntu 16.04/18.04 and Windows 10 (Python 3.x, PyTorch >= 0.4) with a GTX 1080Ti GPU.


Recent deep network-based compressive sensing (CS) methods have achieved great success. However, most of them regard different sampling matrices as different independent tasks and need to train a specific model for each target sampling matrix. Such practices give rise to inefficiency in computing and suffer from poor generalization ability. In this paper, we propose a novel COntrollable Arbitrary-Sampling neTwork, dubbed COAST, to solve CS problems of arbitrary-sampling matrices (including unseen sampling matrices) with one single model. Under the optimization-inspired deep unfolding framework, our COAST exhibits good interpretability. In COAST, a random projection augmentation (RPA) strategy is proposed to promote the training diversity in the sampling space to enable arbitrary sampling, and a controllable proximal mapping module (CPMM) and a plug-and-play deblocking (PnP-D) strategy are further developed to dynamically modulate the network features and effectively eliminate the blocking artifacts, respectively. Extensive experiments on widely used benchmark datasets demonstrate that our proposed COAST is not only able to handle arbitrary sampling matrices with one single model but also to achieve state-of-the-art performance with fast speed.
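As a rough illustration of the optimization-inspired unfolding framework mentioned above (not the paper's actual network), each stage can be viewed as a gradient step on the data-fidelity term followed by a proximal mapping. In COAST the proximal step is the learned CPMM; the NumPy sketch below stands in simple soft-thresholding, and the 33x33 block size, step size, and threshold are assumptions for illustration only.

```python
import numpy as np

def unfolding_stage(x, y, Phi, rho, lam):
    """One unfolding stage: gradient step on ||Phi x - y||^2,
    then a proximal mapping (soft-thresholding stands in for
    the learned CPMM here)."""
    r = x - rho * Phi.T @ (Phi @ x - y)                    # gradient step
    return np.sign(r) * np.maximum(np.abs(r) - lam, 0.0)   # prox step

rng = np.random.default_rng(0)
n, m = 1089, 109                      # 33x33 block, ~10% CS ratio (assumed)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:20] = rng.standard_normal(20) # sparse ground truth
y = Phi @ x_true                      # compressive measurements

x = np.zeros(n)
for _ in range(20):                   # 20 stages, matching --layer_num 20
    x = unfolding_stage(x, y, Phi, rho=0.03, lam=0.01)
print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The learned network replaces the hand-tuned `rho` and the fixed soft-threshold with trainable, controllable modules.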


Figure 1. Illustration of the proposed COAST framework.


Contents

  1. Test-CS
  2. Results
  3. Citation
  4. Acknowledgements


Quick start

  1. All pretrained models for our paper are in './model'.

  2. Please download the sampling matrices from BaiduPan [code: rgd9].

  3. Run the following scripts to test the COAST model.

    You can use the scripts in '' to reproduce the results reported in our paper.

    # test scripts
    python  --cs_ratio 10 --layer_num 20
    python  --cs_ratio 20 --layer_num 20
    python  --cs_ratio 30 --layer_num 20
    python  --cs_ratio 40 --layer_num 20
    python  --cs_ratio 50 --layer_num 20
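The `--cs_ratio` flag sets the percentage of measurements kept per image block. Assuming the 33x33 block size common in block-based CS work such as ISTA-Net (an assumption, not stated in this README), the ratios map to per-block measurement counts as follows:

```python
import numpy as np

# How --cs_ratio maps to the number of measurements per block.
# Block size 33x33 is assumed, as in ISTA-Net-style block CS.
block_size = 33
n = block_size * block_size                # 1089 pixels per block
for cs_ratio in (10, 20, 30, 40, 50):
    m = int(np.ceil(n * cs_ratio / 100))   # measurements kept
    print(cs_ratio, m)
```

So `--cs_ratio 10` keeps roughly one tenth of the pixel count as measurements, and the sampling matrix for that setting has shape (m, 1089).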

The whole test pipeline

  1. Prepare test data.

    The original test set (Set11) is in './data'.

  2. Run the test scripts.

    See Quick start

  3. Check the results in './result'.
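Results on Set11 are typically reported as PSNR. If you want to verify the outputs in './result' yourself, a minimal sketch of the metric (assuming 8-bit images with peak value 255):

```python
import numpy as np

def psnr(ref, rec, peak=255.0):
    """PSNR in dB between a reference image and a reconstruction,
    the usual metric reported for Set11 results."""
    mse = np.mean((ref.astype(np.float64) - rec.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((8, 8), 100.0)
rec = ref + 5.0                 # uniform error of 5 gray levels
print(round(psnr(ref, rec), 2))  # -> 34.15
```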


Prepare training data

  1. Training data (Training_Data.mat, containing 88912 image blocks) should be in './data'. If it is missing, please download it from GoogleDrive or BaiduPan [code: xy52].

  2. Place Training_Data.mat in the './data' directory.

Begin to train

  1. Run the following scripts to train COAST models.

    You can use the scripts in '' to train the models for our paper.

    # train scripts
    python  --cs_ratio 10 --layer_num 20
    python  --cs_ratio 20 --layer_num 20
    python  --cs_ratio 30 --layer_num 20
    python  --cs_ratio 40 --layer_num 20
    python  --cs_ratio 50 --layer_num 20
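The RPA strategy trains with varying random projections rather than one fixed sampling matrix. One simple way to draw a random row-orthonormal sampling matrix per training batch is via a QR decomposition; this is an illustrative sketch, not necessarily the paper's exact generation scheme:

```python
import numpy as np

def random_sampling_matrix(m, n, rng):
    """Draw a random row-orthonormal m x n sampling matrix,
    the kind of random projection an RPA-style strategy
    could vary during training (illustrative sketch)."""
    A = rng.standard_normal((n, n))
    Q, _ = np.linalg.qr(A)   # Q is an n x n orthogonal matrix
    return Q[:m, :]          # keep m rows -> rows are orthonormal

rng = np.random.default_rng(42)
Phi = random_sampling_matrix(109, 1089, rng)   # ~10% CS ratio (assumed)
# Rows are orthonormal: Phi @ Phi.T is (close to) the identity.
print(np.allclose(Phi @ Phi.T, np.eye(109)))
```

Drawing a fresh `Phi` per batch exposes the network to many points in the sampling space, which is what lets one trained model handle arbitrary (including unseen) sampling matrices at test time.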


Quantitative Results


Visual Results



If you find the code helpful in your research or work, please cite the following paper.

  @article{you2021coast,
    title={COAST: COntrollable Arbitrary-Sampling NeTwork for Compressive Sensing},
    author={You, Di and Zhang, Jian and Xie, Jingfen and Chen, Bin and Ma, Siwei},
    journal={IEEE Transactions on Image Processing},
    year={2021}
  }

