This repository is the official implementation of LiNLNet: Gauging Required Nonlinearity in Deep Neural Networks.
The defining feature of LiNLNets is the use of linear-nonlinear units (LiNLUs) as activation functions. The LiNLUs in a layer share a single nonlinearity parameter that controls how nonlinear the layer's response is, so the learned parameters gauge how much nonlinearity each layer actually requires; layers whose parameter settles in the linear regime are counted as linear in the results table below.
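As a rough illustration of the idea (not the paper's exact definition), here is a minimal PyTorch sketch of a LiNLU in which an assumed parameter `p` in [0, 1] blends an identity (linear) response with a ReLU; the blend form, the parameter name, and the [0, 1] convention are all illustrative assumptions:

```python
import torch
import torch.nn as nn

class LiNLU(nn.Module):
    """Sketch of a linear-nonlinear unit: one learnable nonlinearity
    parameter shared by the whole layer blends an identity map with a
    nonlinear (here ReLU) response. The paper's exact functional form
    may differ; this is only an illustration."""

    def __init__(self):
        super().__init__()
        # Assumed convention: p = 0 -> fully linear, p = 1 -> fully nonlinear.
        self.p = nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        p = self.p.clamp(0.0, 1.0)  # keep the blend weight in [0, 1]
        return (1.0 - p) * x + p * torch.relu(x)
```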
To install requirements:

```bash
pip install -r requirements.txt
```
To train LiNLNet on CIFAR-10 or ImageNet, run this command:

```bash
cd LiNLU
python main.py --task <CIFAR-10 or ImageNet> --network <MLP or AlexNet or VGG16 or ResNet18> --mode train --activation LiNLU
```
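For example, to train a LiNL-ResNet18 on CIFAR-10, instantiate the template as:

```bash
cd LiNLU
python main.py --task CIFAR-10 --network ResNet18 --mode train --activation LiNLU
```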
To evaluate LiNLNet on CIFAR-10 or ImageNet, run this command:

```bash
cd LiNLU
python main.py --task <CIFAR-10 or ImageNet> --network <MLP or AlexNet or VGG16 or ResNet18> --mode eval --activation LiNLU
```
Our models achieve the following performance on CIFAR-10 and ImageNet:
| Network | Dataset | Accuracy (%) | # Linear layers |
|---|---|---|---|
| LiNL-MLP | CIFAR-10 | 68.10 ± 0.17 | 1 |
| LiNL-AlexNet | CIFAR-10 | 87.71 ± 0.08 | 2 |
| LiNL-VGG16 | CIFAR-10 | 93.70 ± 0.01 | 3 |
| LiNL-ResNet18 | CIFAR-10 | 92.43 ± 0.14 | 3 |
| LiNL-AlexNet | ImageNet | 57.36 ± 0.13 | 1 |
| LiNL-VGG16 | ImageNet | 72.58 ± 0.08 | 1 |
| LiNL-ResNet18 | ImageNet | 69.43 ± 0.09 | 3 |
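The "# Linear layers" column counts layers whose nonlinearity parameter converged to the linear regime. As a hypothetical helper (not the repository's API), one could count such layers by thresholding the learned parameter, assuming the `LiNLU` module and attribute `p` from the sketch above; the threshold value is likewise an illustrative assumption:

```python
def count_linear_layers(model, threshold=0.05):
    """Count LiNLU layers whose learned nonlinearity parameter has
    collapsed to (near) zero, i.e. layers that became effectively linear.
    The threshold and the attribute name `p` follow the sketch above
    and are illustrative assumptions, not the repository's API."""
    return sum(
        1
        for module in model.modules()
        if isinstance(module, LiNLU)
        and module.p.detach().clamp(0.0, 1.0).item() < threshold
    )
```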