This repository accompanies our paper "Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression".

It implements a method for distributing blockwise compression of models across many workers using TensorFlow and MPI.
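As a rough illustration of the parallelization idea, each MPI rank can be assigned a disjoint subset of the teacher network's blocks and distill its replacements independently. The sketch below shows only the work-assignment step; the function name and round-robin scheme are illustrative assumptions, not code from this repo.

```python
def assign_blocks(num_blocks, num_workers):
    """Hypothetical round-robin assignment of teacher-block indices
    to MPI worker ranks, so each rank compresses its blocks in parallel."""
    return {
        rank: [b for b in range(num_blocks) if b % num_workers == rank]
        for rank in range(num_workers)
    }

# Example: 5 teacher blocks distributed over 2 workers.
assignment = assign_blocks(5, 2)
print(assignment)  # {0: [0, 2, 4], 1: [1, 3]}
```

In an actual run, each rank would then distill only its assigned blocks against the teacher's intermediate activations and the results would be reassembled into the compressed student model.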

Overview Image

Many of the notebooks depend on pretrained VGG16 and ResNet50 models fine-tuned on upscaled CIFAR-10. For size reasons, the .h5 files are not tracked in this repo. If cloning, you should download the .h5 files from Google Drive at the following links

Bash scripts are included in this repo to build the Docker image, start the Docker container, and start the JupyterLab instance needed for this project.

The bash scripts must be run with sudo permissions.
