harveyp123/AutoReP

Official implementation of "AutoReP: Automatic ReLU Replacement for Fast Private Network Inference"

Please cite our paper if you use the code ✔

@inproceedings{peng2023autorep,
  title={AutoReP: Automatic ReLU Replacement for Fast Private Network Inference},
  author={Peng, Hongwu and Huang, Shaoyi and Zhou, Tong and others},
  booktitle={Proceedings of the 2023 International Conference on Computer Vision (ICCV)},
  year={2023}
}

Overview of the AutoReP Framework

We propose the AutoReP framework for automatic ReLU replacement to accelerate multi-party computation (MPC) based encrypted inference.

  • Parameterized discrete indicator function: we introduce a parameterized discrete indicator function that is co-trained with the model weights until convergence. This allows fine-grained selection between ReLU and polynomial functions at the pixel level, yielding a more optimized and efficient model.

  • Hysteresis loop: we present a hysteresis loop update function that stabilizes the binarized ReLU-replacement training process, enabling a recoverable and stable replacement and leading to better convergence and higher accuracy.

  • Distribution-aware polynomial approximation (DaPa): a novel solution to the problem of accurately approximating ReLUs with polynomial functions under a specific feature distribution, minimizing the structural difference between the original and replaced networks while maintaining high model expressivity (see the sketch after this list).
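As a rough illustration of the DaPa idea (not the exact formulation from the paper), the sketch below fits a degree-2 polynomial to ReLU with the fitting error weighted by an assumed Gaussian feature distribution; sampling x from that distribution makes ordinary least squares distribution-weighted. The mean/std values are placeholders, in practice the statistics would come from the layer's actual features.

import numpy as np

# Monte Carlo sketch of distribution-aware polynomial approximation:
# sample pre-activations from an assumed distribution, then least-squares
# fit c2*x^2 + c1*x + c0 to ReLU over those samples.
rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0                      # placeholder feature statistics
x = rng.normal(mu, sigma, size=100_000)
y = np.maximum(x, 0.0)                    # ReLU targets

c2, c1, c0 = np.polyfit(x, y, deg=2)      # coefficients, highest degree first
print(f"poly(x) ~= {c2:.4f}*x^2 + {c1:.4f}*x + {c0:.4f}")

approx = c2 * x**2 + c1 * x + c0
print("MSE under the assumed distribution:", np.mean((approx - y) ** 2))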

ReLU Reduction / ReLU Replacement

Steps to set up the environment and download the dataset:

# Create an environment
conda create --name AutoReP
# or
conda create --prefix=${HOME}/.conda/envs/AutoReP python=3.9
# Then activate the environment
conda activate AutoReP
# Install the PyTorch packages
conda install -y pytorch==1.12.0 torchvision==0.13.0 torchaudio==0.12.0 cudatoolkit=11.6 -c pytorch -c conda-forge
# Install tensorboardX to record accuracy/loss curves
conda install -c conda-forge tensorboardx
pip install tqdm pytorch_warmup
pip install scipy
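After installation, a quick sanity check (a minimal sketch; exact versions depend on your platform) confirms that the packages above are importable and that CUDA is visible:

# Quick sanity check for the environment installed above.
import torch, torchvision, scipy, tqdm, tensorboardX, pytorch_warmup

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("torchvision:", torchvision.__version__)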

Download the Tiny-ImageNet dataset:

bash dataset_download/download_tinyimagenet.sh
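Once downloaded, the data can be consumed with a standard torchvision ImageFolder pipeline. The directory layout below is an assumption about where the download script places the data, so adjust the path to match your setup:

# Minimal sketch of loading Tiny-ImageNet; the path below is an assumption
# about where download_tinyimagenet.sh unpacks the dataset.
import torchvision.transforms as T
from torchvision.datasets import ImageFolder
from torch.utils.data import DataLoader

transform = T.Compose([T.RandomHorizontalFlip(), T.ToTensor()])
train_set = ImageFolder("./dataset/tiny-imagenet-200/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=4)
print(len(train_set), "training images,", len(train_set.classes), "classes")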

1. Train a baseline model

To reproduce the pretrained baseline model experiment:

bash scripts/scripts_baseline.sh
  • Specify "--act_type nn.ReLU" to run the baseline model with the ReLU non-linear function (see the sketch after this list for how such a flag is typically resolved).
  • You can specify which GPU is used by changing "--gpu 0". In the scripts, "nohup python > out.log" runs the Python program in the background and redirects the command-line output to out.log.
  • You may need to change the dataset path.
  • The model and logging path can be found in: ./train_cifar/wide_resnet_22_8__tiny_imagenet
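The "--act_type" string is resolved to an activation module inside the training code; the snippet below is only a hypothetical illustration of how such a flag is commonly mapped to a torch.nn class, not the repository's actual parsing logic:

# Hypothetical sketch: mapping an --act_type string such as "nn.ReLU"
# to the corresponding torch.nn activation class.
import torch.nn as nn

def build_activation(act_type: str) -> nn.Module:
    cls_name = act_type.split(".")[-1]    # "nn.ReLU" -> "ReLU"
    return getattr(nn, cls_name)()

print(build_activation("nn.ReLU"))        # ReLU()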

Adding a hysteresis loop to the gated mask function:

The forward pass of the gated mask is a hysteresis loop function rather than the simple gate f(x) = x > 0. The backward pass uses the same straight-through estimator (STE) as the plain gated mask.

(Figure: shape of the hysteresis loop function.)

The hysteresis function can be described as:

def hysteresis(now_state, in_val, threshold):
    # The state is "sticky": it only flips when in_val crosses +/- threshold,
    # so small oscillations around zero do not toggle the mask.
    if now_state == 1:
        # Currently using ReLU: switch off only if in_val drops below -threshold.
        if in_val < -threshold:
            now_state = 0
    else:
        # Currently replaced: switch back on only if in_val exceeds +threshold.
        if in_val > threshold:
            now_state = 1
    return now_state

The threshold is a hyper-parameter that adjusts the width of the hysteresis loop.
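Putting the two pieces together, a gated mask with a hysteresis forward pass and a straight-through (STE) backward pass can be sketched in PyTorch as follows. This is an illustrative re-implementation consistent with the description above; the class and parameter names are assumptions, not the repository's actual module:

# Sketch of a gated mask: hysteresis forward, straight-through backward.
import torch

class HysteresisGate(torch.autograd.Function):
    @staticmethod
    def forward(ctx, score, prev_mask, threshold):
        # Keep the previous binary state unless the score crosses +/- threshold.
        return torch.where(prev_mask > 0.5,
                           (score > -threshold).float(),
                           (score > threshold).float())

    @staticmethod
    def backward(ctx, grad_output):
        # STE: pass the gradient straight through to the mask score.
        return grad_output, None, None

score = torch.randn(4, requires_grad=True)   # learnable mask parameters
prev_mask = torch.ones(4)                    # previous per-pixel state
mask = HysteresisGate.apply(score, prev_mask, 0.1)
mask.sum().backward()
print(mask, score.grad)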

2. Run AutoReP with the DaPa-based polynomial function (c2*x^2 + c1*x + c0) and the hysteresis loop:

Steps to run AutoReP with the proposed function for WideResNet-22-8 on the Tiny-ImageNet dataset with a 150K ReLU budget (a conceptual sketch of the resulting replacement unit follows at the end of this section):

bash scripts/wide_resnet_22_8__tiny_imagenet_dapa2_distil_split_lr_final.sh
  • You can increase the number of epochs to obtain higher accuracy at the cost of longer training time.
  • The model and logging path can be found in: ./train_cifar_dapa2_distil_relay/wide_resnet_22_8_wide_resnet_22_8_tiny_imagenet_relay_0.003/cosine_ReLUs150.0wm_lr0.001mep80_baseline
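Conceptually, each ReLU pixel is then replaced by a mix of ReLU and the DaPa polynomial controlled by the binary mask. A minimal sketch of this replacement unit is given below; the mixing form matches the description above, while the coefficient values are placeholders rather than values from the paper:

# Illustrative pixel-level replacement: keep ReLU where the mask is 1,
# use the degree-2 polynomial c2*x^2 + c1*x + c0 where the mask is 0.
import torch

def replaced_activation(x, mask, c2=0.1, c1=0.5, c0=0.2):
    poly = c2 * x**2 + c1 * x + c0
    return mask * torch.relu(x) + (1.0 - mask) * poly

x = torch.randn(2, 8, 4, 4)                    # example feature map
mask = (torch.rand_like(x) > 0.5).float()      # per-pixel keep/replace decision
print(replaced_activation(x, mask).shape)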