KnowledgeSharing-Pytorch

This repository implements several state-of-the-art knowledge distillation and knowledge transfer methods.

ToDo List

Knowledge Distillation (KD)

Knowledge distillation was proposed to distill knowledge from a large teacher network into a smaller student network. KD helps the student model achieve better generalization, and its applications include model compression.
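As a rough illustration, a standard distillation objective combines a temperature-softened KL term against the teacher's logits with the ordinary cross-entropy on the hard labels. The sketch below is a minimal PyTorch version; the function name and the default temperature `T` and weight `alpha` are illustrative assumptions, not the exact interface used in this repository.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    # Soft-target term: KL between temperature-softened student and teacher
    # distributions, scaled by T^2 so its gradient magnitude stays comparable
    # to the hard-label term (Hinton et al., 2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term: cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```

In practice the teacher is kept in eval mode and its logits are detached, so gradients flow only through the student.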

Knowledge Transfer (KT)

Knowledge transfer methods share knowledge between networks trained together, for example between peers in Deep Mutual Learning or members of an on-the-fly ensemble.

Model List

  • Basic knowledge distillation
  • Born-again Neural Networks
  • Knowledge Transfer with Jacobian Matching
  • Deep Mutual Learning (see the sketch after this list)
  • Co-teaching
  • On-the-fly Native Ensemble
  • MentorNet
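For the Deep Mutual Learning entry above, the idea is that a cohort of peer networks is trained simultaneously, each minimizing its own cross-entropy plus the average KL divergence toward every other peer's predictions. The following is a minimal sketch under those assumptions; the function name and uniform peer weighting are illustrative, not this repository's exact interface.

```python
import torch.nn.functional as F

def dml_losses(peer_logits, targets):
    # Deep Mutual Learning (Zhang et al., 2018): every peer is trained with
    # cross-entropy on the labels plus the mean KL divergence from each of
    # the other peers' (detached) predicted distributions.
    losses = []
    for i, logits_i in enumerate(peer_logits):
        ce = F.cross_entropy(logits_i, targets)
        kl = sum(
            F.kl_div(
                F.log_softmax(logits_i, dim=1),
                F.softmax(logits_j, dim=1).detach(),  # peers act as soft targets
                reduction="batchmean",
            )
            for j, logits_j in enumerate(peer_logits) if j != i
        )
        losses.append(ce + kl / (len(peer_logits) - 1))
    return losses  # one loss per peer; each is back-propagated into its own network
```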
