
TensorFlow implementation of "Distilling a Neural Network Into a Soft Decision Tree" (https://arxiv.org/pdf/1711.09784.pdf)


DoMy91/DistillingNN-SDT


DistillingNN-SDT

I achieved 95.47% accuracy on the MNIST test set with a 9-level tree after 23 epochs of training (without distillation), using the hyperparameters reported at the top of main.py. The best learned weights are in the best_ckpt folder.
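In the soft decision tree of the paper, each inner node applies a sigmoid gate that routes the input probabilistically to its children, each leaf holds a learned class distribution, and the prediction mixes the leaf distributions weighted by their path probabilities. A minimal NumPy sketch of that forward pass for a complete binary tree in heap layout (function and variable names are my own illustration, not taken from this repo's main.py):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_tree_predict(x, inner_w, inner_b, leaf_logits):
    """Forward pass of a soft decision tree for one input x of shape (d,).

    inner_w:     (n_inner, d) filter weights, one row per inner node
                 (heap order: root = 0, children of i are 2i+1 and 2i+2)
    inner_b:     (n_inner,) biases
    leaf_logits: (n_inner + 1, n_classes) learned leaf distributions (pre-softmax)
    Returns the predicted class distribution:
    sum over leaves of P(leaf | x) * softmax(leaf_logits[leaf]).
    """
    n_inner = inner_w.shape[0]
    n_leaves = leaf_logits.shape[0]

    # Probability of taking the right branch at each inner node.
    p_right = sigmoid(inner_w @ x + inner_b)             # (n_inner,)

    # Path probability of reaching each node, computed top-down.
    path = np.ones(n_inner + n_leaves)
    for i in range(n_inner):
        path[2 * i + 1] = path[i] * (1.0 - p_right[i])   # left child
        path[2 * i + 2] = path[i] * p_right[i]           # right child
    leaf_probs = path[n_inner:]                          # P(leaf | x), sums to 1

    # Softmax over classes at each leaf, then mix by path probability.
    leaf_dists = np.exp(leaf_logits - leaf_logits.max(axis=1, keepdims=True))
    leaf_dists /= leaf_dists.sum(axis=1, keepdims=True)
    return leaf_probs @ leaf_dists
```

The paper trains these parameters with a loss that weights each leaf's cross-entropy by its path probability (plus a regularizer encouraging balanced splits); the sketch above covers only inference.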
