# Deficient Linear Transforms for Efficient Deep Learning

This repository provides compressed substitutes for the linear transforms in deep learning: substitute them for the convolutions in an existing WideResNet or DARTS network and train as normal (a sketch of the substitution step follows this paragraph). Details of the research are provided in the [research log](research-log.md).
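As an illustration of the substitution step, here is a minimal sketch in PyTorch. It is not the repository's API: the function names are hypothetical, and a low-rank convolution is assumed as the substitute purely for illustration.

```python
import torch.nn as nn

def low_rank_conv2d(conv, rank):
    """Replace a k x k convolution with a k x k convolution into `rank`
    channels followed by a 1x1 convolution back out; this uses fewer
    parameters whenever `rank` is small relative to the channel counts.
    (Ignores groups and dilation for brevity.)"""
    return nn.Sequential(
        nn.Conv2d(conv.in_channels, rank, conv.kernel_size,
                  stride=conv.stride, padding=conv.padding, bias=False),
        nn.Conv2d(rank, conv.out_channels, kernel_size=1,
                  bias=conv.bias is not None),
    )

def substitute_convs(module, rank):
    """Recursively swap every Conv2d in `module` for the substitute,
    leaving the rest of the network unchanged."""
    for name, child in module.named_children():
        if isinstance(child, nn.Conv2d):
            setattr(module, name, low_rank_conv2d(child, rank))
        else:
            substitute_convs(child, rank)
```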

## tl;dr

In a deep neural network, you can replace the matrix multiply by a weight matrix (a linear transform) with an alternative that uses fewer parameters, fewer mult-adds, or both, such as the low-rank factorisation sketched below.
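For example, a minimal sketch of a low-rank substitute for a dense linear transform (again illustrative, not the repository's implementation):

```python
import torch
import torch.nn as nn

class LowRankLinear(nn.Module):
    """Factorise the weight matrix W (n_out x n_in) as U @ V, with
    U (n_out x r) and V (r x n_in): parameters and mult-adds drop
    from n_out * n_in to r * (n_out + n_in)."""
    def __init__(self, n_in, n_out, rank):
        super().__init__()
        self.V = nn.Linear(n_in, rank, bias=False)
        self.U = nn.Linear(rank, n_out, bias=True)

    def forward(self, x):
        return self.U(self.V(x))

# e.g. a 1024 -> 1024 transform: dense needs 1024 * 1024 ~ 1.0M
# parameters, while rank 64 needs 64 * (1024 + 1024) ~ 0.13M,
# roughly an 8x compression.
layer = LowRankLinear(1024, 1024, rank=64)
y = layer(torch.randn(8, 1024))
```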

But this will only train properly if you scale the weight decay originally used to train the network by the compression ratio.
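A minimal sketch of that scaling rule follows. The convention used here for the compression ratio (substitute parameters divided by original parameters) and the optimiser settings in the usage comment are assumptions, not taken from the repository; see the research log for the exact convention the experiments use.

```python
def scaled_weight_decay(base_wd, substitute_model, original_model):
    """Scale the baseline weight decay by the compression ratio.

    Assumed convention: compression ratio = substitute parameter count
    divided by original parameter count."""
    n_sub = sum(p.numel() for p in substitute_model.parameters())
    n_orig = sum(p.numel() for p in original_model.parameters())
    return base_wd * n_sub / n_orig

# Usage: train the substituted network exactly as the original, but
# with the scaled weight decay, e.g. with SGD:
#   wd = scaled_weight_decay(5e-4, compressed_net, original_net)
#   opt = torch.optim.SGD(compressed_net.parameters(), lr=0.1,
#                         momentum=0.9, weight_decay=wd)
```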

Results are reported for:

- WRN-28-10 on CIFAR-10
- DARTS on CIFAR-10
- WRN-50-2 on ImageNet

## Citations

If you would like to cite this work, please use the following BibTeX entry:

@article{gray2019separable,
  author    = {Gavin Gray and
               Elliot J. Crowley and
               Amos Storkey},
  title     = {Separable Layers Enable Structured Efficient Linear Substitutions},
  journal   = {CoRR},
  volume    = {abs/1906.00859},
  year      = {2019},
  url       = {https://arxiv.org/abs/1906.00859},
  archivePrefix = {arXiv},
  eprint    = {1906.00859}
}

## Acknowledgements

Based on: https://github.com/BayesWatch/pytorch-moonshine
