PyTorch implementation of various Knowledge Distillation (KD) methods.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
PyContinual (An Easy and Extendible Framework for Continual Learning)
Code and dataset for ACL2018 paper "Exploiting Document Knowledge for Aspect-level Sentiment Classification"
Code and pretrained models for paper: Data-Free Adversarial Distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
An Extensible Continual Learning Framework Focused on Language Models (LMs)
PyTorch implementation of (Hinton) Knowledge Distillation and a base class for simple implementation of other distillation methods (see the minimal loss sketch after this list).
Code for ECML/PKDD 2020 Paper --- Continual Learning with Knowledge Transfer for Sentiment Classification
[Paper][AAAI 2023] DUET: Cross-modal Semantic Grounding for Contrastive Zero-shot Learning
[ECCV2022] Factorizing Knowledge in Neural Networks
Code for NeurIPS 2020 Paper --- Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks
Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020)
[NeurIPS'23] Source code of "Data-Centric Learning from Unlabeled Graphs with Diffusion Model": A data-centric transfer learning framework with diffusion model on graphs.
Uses pretrained deep learning models such as MobileNet to predict objects in an image.
Implementations of multiple methods for transferring knowledge between neural networks, with utilities to save, plot, and compare the results.
Implementation of NAACL 2024 main conference paper: Named Entity Recognition Under Domain Shift via Metric Learning for Life Sciences
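
Several of the repositories above implement the classic soft-target distillation loss from Hinton et al. (2015). For orientation, here is a minimal sketch of that loss in PyTorch; the function name `distillation_loss` and the default `temperature`/`alpha` values are illustrative assumptions, not taken from any listed repository.

```python
# Minimal sketch of the Hinton et al. (2015) soft-target distillation loss.
# Function name and default hyperparameters are illustrative assumptions,
# not taken from any repository listed above.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend the soft-target KL term with hard-label cross-entropy."""
    # Soften both distributions with the temperature; the KL term is scaled
    # by T^2 so its gradient magnitude stays comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example usage with random tensors (batch of 8, 10 classes):
student = torch.randn(8, 10, requires_grad=True)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
loss.backward()
```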