I achieved 95.47% accuracy on the MNIST test set with a 9-level tree after 23 epochs of training (without distillation), using the hyperparameters reported at the top of main.py (the best learned weights are in the best_ckpt folder).
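For context, the soft decision tree from the paper routes each input probabilistically: every inner node sends the input to its right child with probability sigmoid(w·x + b), and the prediction is a mixture of the leaf class distributions weighted by path probability. Below is a minimal NumPy sketch of that forward pass (illustrative only; all names here are assumptions and it is not this repo's TensorFlow code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def sdt_predict(x, inner_w, inner_b, leaf_logits):
    """Forward pass of a complete binary soft decision tree (sketch).

    inner_w: (n_inner, d) node filters; inner_b: (n_inner,) biases;
    leaf_logits: (n_inner + 1, n_classes) leaf parameters.
    Nodes are in breadth-first order: node i's children are 2i+1 and 2i+2.
    """
    n_inner = inner_w.shape[0]
    path_prob = np.zeros(2 * n_inner + 1)
    path_prob[0] = 1.0  # the root is always reached
    for i in range(n_inner):
        p_right = sigmoid(inner_w[i] @ x + inner_b[i])
        path_prob[2 * i + 1] = path_prob[i] * (1.0 - p_right)  # left child
        path_prob[2 * i + 2] = path_prob[i] * p_right          # right child
    leaf_probs = path_prob[n_inner:]  # probability of reaching each leaf
    dists = np.array([softmax(l) for l in leaf_logits])
    return leaf_probs @ dists         # mixture of leaf class distributions

# Tiny depth-2 example: 3 inner nodes, 4 leaves, 2 classes, random params.
rng = np.random.default_rng(0)
x = rng.normal(size=5)
out = sdt_predict(x, rng.normal(size=(3, 5)), rng.normal(size=3),
                  rng.normal(size=(4, 2)))
print(out)  # a valid probability distribution over the 2 classes
```

Because the leaf path probabilities sum to 1 and each leaf holds a softmax distribution, the output is itself a valid class distribution; a 9-level tree as used here simply has 2^9 leaves instead of 4.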
TensorFlow implementation of "Distilling a Neural Network Into a Soft Decision Tree" (https://arxiv.org/pdf/1711.09784.pdf)
DoMy91/DistillingNN-SDT