
RAPQ: Rescuing Accuracy for Power-of-Two Low-bit Post-training Quantization.

PyTorch implementation of RAPQ, accepted at IJCAI 2022.

Contact

For any questions, please contact via e-mail: 813767017@qq.com

Notice

  • RAPQ provides a Power-of-Two quantization scheme designed specifically for PTQ. Because of BRECQ's SOTA performance in the PTQ area, this repo implements RAPQ on top of BRECQ by Yuhang Li @yhhhli.

  • Please download the pretrained models before running this program!

Getting Started

1. Download pretrained models. (Thanks to @yhhhli for providing them!)

resnet18

resnet50

mobilenetv2

regnetx_600m

regnetx_3200m

After downloading, please put them into "~/.cache/torch/checkpoints" under your home directory.
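The step above can be sketched as follows (the checkpoint filename in the comment is an assumption; use the actual files you downloaded):

```python
# Create the cache directory the pretrained-model loader reads from
# (path taken from the step above), then copy the downloaded weights into it.
import os

ckpt_dir = os.path.expanduser("~/.cache/torch/checkpoints")
os.makedirs(ckpt_dir, exist_ok=True)
# e.g. shutil.copy("resnet18_imagenet.pth.tar", ckpt_dir)  # filename is an assumption
print(ckpt_dir)
```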

2. Prepare the Environment

This program is built on the PyTorch framework, so please set up the environment first!

3. Prepare the Dataset

The ImageNet dataset is also necessary!
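The data path passed via --data_path is assumed to follow the standard torchvision ImageNet folder layout (train/ and val/ splits with one subfolder per class; check main_imagenet.py for the exact loader). A small helper to verify this before launching a long calibration run:

```python
# Check that an ImageNet root has the train/ and val/ splits the
# torchvision-style loader expects (layout is an assumption based on
# common usage, not confirmed from the repo's source).
import os

def check_imagenet_layout(root):
    """Return True if root contains both train/ and val/ directories."""
    return all(
        os.path.isdir(os.path.join(root, split)) for split in ("train", "val")
    )
```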

4. All ready, GO!

  • Use naive Power-of-Two PTQ:

CUDA_VISIBLE_DEVICES=0 python main_imagenet.py --data_path /path/to/ImageNet/ --arch mobilenetv2 --n_bits_w 2 --channel_wise --n_bits_a 4 --act_quant --test_before_calibration

  • Use RAPQ Quick Mode:

CUDA_VISIBLE_DEVICES=0 python main_imagenet.py --RAPQ --data_path /path/to/ImageNet/ --arch resnet18 --n_bits_w 2 --channel_wise --n_bits_a 4 --act_quant --test_before_calibration

  • Use RAPQ:

CUDA_VISIBLE_DEVICES=0 python main_imagenet.py --RAPQ --data_path /path/to/ImageNet/ --arch mobilenetv2 --n_bits_w 2 --iters_w 80000 --channel_wise --n_bits_a 4 --act_quant --test_before_calibration
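To illustrate the idea behind the --n_bits_w 2 power-of-two setting, here is a minimal sketch of symmetric quantization with a power-of-two step size (an illustration only, not the repo's actual implementation; the function name and rounding details are assumptions). Constraining the scale to 2^k lets integer hardware replace multiplications with bit shifts:

```python
# Sketch: symmetric low-bit quantization with a power-of-two step size.
# max_abs must be positive; n_bits=2 gives signed levels {-2, -1, 0, 1}.
import math

def pow2_quantize(x, n_bits=2, max_abs=1.0):
    """Quantize x with a step size rounded to the nearest power of two."""
    qmax = 2 ** (n_bits - 1) - 1
    # Round the ideal step size max_abs/qmax to the nearest power of two,
    # so dequantization becomes a bit shift on integer hardware.
    step = 2.0 ** round(math.log2(max_abs / qmax))
    q = max(-qmax - 1, min(qmax, round(x / step)))  # clamp to the integer grid
    return q * step
```

For example, with 2-bit weights and max_abs=1.0 the step is 1.0, so 0.6 quantizes to 1.0 and 0.3 collapses to 0.0, which is why calibration (the BRECQ-style block reconstruction that RAPQ builds on) matters so much at these bit-widths.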

LICENSE

RAPQ is released under the MIT license.
