Rethinking Binarized Neural Network Optimization

arXiv:1906.02107 License: Apache 2.0 Code style: black

Implementation for the paper "Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization"

Note: Bop has now been added to Larq, the open-source training library for BNNs. We recommend using the Larq implementation of Bop: it is compatible with more versions of TensorFlow and will be more actively maintained.
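
For orientation, the core idea of Bop can be sketched in a few lines of NumPy. This is only an illustrative paraphrase of the update rule described in the paper, not the maintained implementation (use this repository or Larq for that); the function and variable names below are ours.

import numpy as np

def bop_step(w, grad, m, gamma=1e-3, threshold=1e-6):
    """One Bop update on a binary weight tensor w with entries in {-1, +1}.

    m is the exponential moving average of the gradient; gamma and threshold
    play the same role as the --hparams values used by the CLI below.
    """
    # Update the gradient moving average.
    m = (1 - gamma) * m + gamma * grad
    # Flip a weight when the averaged gradient is large enough and points in
    # the same direction as the weight, i.e. it asks the weight to decrease.
    flip = (np.abs(m) > threshold) & (np.sign(m) == np.sign(w))
    return np.where(flip, -w, w), m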

Requirements

You can also check out one of our prebuilt Docker images.

Installation

This is a complete Python module. To install it in your local Python environment, cd into the folder containing setup.py and run:

pip install -e .

Train

To train a model locally, you can use the CLI:

bnno train binarynet --dataset cifar10

Reproduce Paper Experiments

Hyperparameter Analysis (section 5.1)

To reproduce the runs exploring various hyperparameters, run:

bnno train binarynet \
    --dataset cifar10 \
    --preprocess-fn resize_and_flip \
    --hparams-set bop \
    --hparams threshold=1e-6,gamma=1e-3

where you substitute the threshold and gamma values you want to explore.
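
If you want to script a whole sweep rather than launch each run by hand, something like the following works. This is only a sketch around the CLI command above; the grid values are placeholders, not the exact grid used in the paper.

import itertools
import subprocess

# Placeholder grids: substitute the threshold and gamma values you want to explore.
thresholds = ["1e-8", "1e-7", "1e-6"]
gammas = ["1e-4", "1e-3", "1e-2"]

for threshold, gamma in itertools.product(thresholds, gammas):
    # Launch one training run per (threshold, gamma) pair via the bnno CLI.
    subprocess.run(
        [
            "bnno", "train", "binarynet",
            "--dataset", "cifar10",
            "--preprocess-fn", "resize_and_flip",
            "--hparams-set", "bop",
            "--hparams", f"threshold={threshold},gamma={gamma}",
        ],
        check=True,
    )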

CIFAR-10 (section 5.2)

To reproduce the 91.3% accuracy reported in the paper, run:

bnno train binarynet \
    --dataset cifar10 \
    --preprocess-fn resize_and_flip \
    --hparams-set bop_sec52 \
    --epochs 500

ImageNet (section 5.3)

To reproduce the 54.2% accuracy reported in the paper, run:

bnno train birealnet --dataset imagenet2012 --hparams-set bop --epochs 100