PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression

thaonguyen19/ModelDistillation-PyTorch


ModelDistillation-PyTorch

Attempt at transferring knowledge from ResNet18 to tiny ResNet architectures, following the training technique described in "Distilling the Knowledge in a Neural Network" by Hinton et al. (https://arxiv.org/pdf/1503.02531.pdf), applied to the ImageNet dataset.
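The distillation objective from the Hinton et al. paper combines a soft term (KL divergence between teacher and student outputs softened by a temperature T) with a hard cross-entropy term against the true labels. A minimal PyTorch sketch of that loss is shown below; the function name, the temperature `T=4.0`, and the weighting `alpha=0.9` are illustrative assumptions, not this repository's actual settings.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Knowledge-distillation loss (Hinton et al., 2015) -- illustrative sketch.

    Soft term: KL divergence between teacher and student distributions at
    temperature T, scaled by T^2 so soft-target gradients keep a magnitude
    comparable to the hard term (as recommended in the paper).
    Hard term: ordinary cross-entropy against the ground-truth labels.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example usage with random logits: batch of 8 samples, 10 classes
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
```

In training, `teacher_logits` would come from a frozen pretrained ResNet18 (run under `torch.no_grad()`), while gradients flow only through `student_logits` from the smaller network.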
