Bit-slice Sparsity

This repo contains the code for our preliminary study, "Exploring Bit-Slice Sparsity in Deep Neural Networks for Efficient ReRAM-Based Deployment" (NeurIPS'19 EMC2 workshop) [paper][poster][presentation], which aims at improving bit-slice sparsity for efficient ReRAM-based deployment of DNNs. The code is tested with PyTorch 1.2.0 and Python 3.7.

The code for MNIST and CIFAR-10 is in mnist/ and cifar/, respectively. The training routine consists of three parts: pre-training, pruning, and fine-tuning.
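
For context: a ReRAM crossbar stores each fixed-point weight across several cells of a few bits each, so "bit-slice sparsity" means zeros in those per-slice values rather than in the full weight. A minimal sketch of the decomposition, assuming 8-bit unsigned weight magnitudes split into 2-bit slices (widths here are illustrative choices, not necessarily the paper's settings):

```python
import torch

def bit_slices(w_q, total_bits=8, slice_bits=2):
    """Split quantized weight magnitudes into 2-bit slices, low bits first."""
    w = w_q.to(torch.int64)
    mask = (1 << slice_bits) - 1
    return [(w >> shift) & mask for shift in range(0, total_bits, slice_bits)]

w = torch.tensor([0, 37, 255])  # example 8-bit magnitudes
for i, s in enumerate(bit_slices(w)):
    print(f"slice {i}: {s.tolist()}")  # 37 = 0b00100101 -> slices 1, 1, 2, 0
```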

First, pre-train a fixed-point model:

python pretrain.py
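
The fixed-point machinery itself comes from nics_fix_pytorch. As a rough illustration of the general idea, here is a quantize-dequantize ("fake quantization") step with a straight-through gradient, assuming symmetric uniform quantization; this is a sketch of the technique, not the repo's actual scheme:

```python
import torch

def fake_quantize(w, bits=8):
    """Round weights to a uniform fixed-point grid while keeping float storage.

    Symmetric range and straight-through rounding are assumptions made for
    illustration; see nics_fix_pytorch for the actual fixed-point scheme.
    """
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    q = torch.round(w / scale).clamp(-qmax - 1, qmax)
    # Straight-through estimator: forward uses quantized values,
    # backward passes gradients through unchanged.
    return w + (q * scale - w).detach()
```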

Then, load and prune the pre-trained model, and fine-tune it with either plain L1 regularization or bit-slice L1 regularization:

python finetune_l1.py or python finetune_bitslice.py
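
For reference, a minimal fine-tuning step with a plain L1 penalty added to the task loss; the model, data, and regularization strength below are illustrative stand-ins, not the repo's configuration. finetune_bitslice.py replaces the plain L1 term with a penalty applied to the bit slices of the quantized weights instead.

```python
import torch
import torch.nn as nn

# Illustrative stand-ins for the actual model and data.
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
inputs, targets = torch.randn(8, 10), torch.randint(0, 2, (8,))

lam = 1e-5  # regularization strength (illustrative)
optimizer.zero_grad()
loss = criterion(model(inputs), targets)
l1 = sum(p.abs().sum() for p in model.parameters())  # plain L1 term
(loss + lam * l1).backward()
optimizer.step()
```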

The scripts define several arguments with default values, but you may want to check them yourself and adjust for your setup.

Acknowledgement

The code is adapted from nics_fix_pytorch.
