Use Hugging Face Transformers and practice knowledge distillation, quantization, ONNX, and ONNX Runtime (ORT)
Updated Feb 10, 2023 - Jupyter Notebook
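As a quick illustration of the topic this page collects (not code from any listed repo): a minimal knowledge-distillation training step in PyTorch. The layer sizes, temperature T, and mixing weight alpha are assumed placeholder values.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    teacher = nn.Linear(32, 10)  # placeholder for a large pretrained teacher
    student = nn.Linear(32, 10)  # placeholder for the smaller student being trained
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

    T, alpha = 4.0, 0.5           # assumed temperature and loss-mixing weight
    x = torch.randn(8, 32)        # dummy feature batch
    y = torch.randint(0, 10, (8,))

    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)

    # Soft-target loss: KL divergence between temperature-softened distributions,
    # scaled by T^2 (Hinton et al., 2015), mixed with the usual hard-label loss.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, y)
    loss = alpha * soft + (1 - alpha) * hard

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()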
Deep Learning Head Pose Estimation using PyTorch
Transfer Learning for Neural Topic Models using Knowledge Distillation
Multi-class image classification with "Distilling Knowledge by Mimicking Features".
VisionTransformer for TensorFlow 2
TensorFlow-based framework for 3D U-Net with knowledge distillation
Repository for the implementation of 'Knowledge Distillation for Multi-task Learning'
Implementation of several neural network compression techniques (knowledge distillation, pruning, quantization, factorization), in Haiku.
Official code for the paper "Domain Generalization for Crop Segmentation with Knowledge Distillation"
Knowledge distillation PyTorch Lightning template for image classification tasks
Improving Question Answering Performance Using Knowledge Distillation and Active Learning
This repository includes some detailed proofs of "Bias Variance Decomposition for KL Divergence".
Comparing performance of a small transformer model with and without Knowledge Distillation
This is all you need for NLP transformer training and knowledge distillation
Official implementation of "Intra-Class Similarity-Guided Feature Distillation" accepted in NeurIPS-ENLSP 2023
Knowledge distillation for masked FER using ResNet-18 in PyTorch.
Code for our paper "Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition", presented at the CVPR 2022 3rd CLVision Continual Learning Workshop
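The topic description above also names quantization, ONNX, and ONNX Runtime (ORT). A minimal export-quantize-run sketch, assuming PyTorch and the onnxruntime package are installed; the model architecture, file names, and shapes are placeholders:

    import numpy as np
    import torch
    import torch.nn as nn
    import onnxruntime as ort
    from onnxruntime.quantization import quantize_dynamic, QuantType

    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10)).eval()

    # Export to ONNX with a dynamic batch dimension.
    torch.onnx.export(model, torch.randn(1, 32), "student.onnx",
                      input_names=["input"], output_names=["logits"],
                      dynamic_axes={"input": {0: "batch"}})

    # Post-training dynamic quantization of the exported graph (int8 weights).
    quantize_dynamic("student.onnx", "student.int8.onnx", weight_type=QuantType.QInt8)

    # Run the quantized model with ONNX Runtime on CPU.
    session = ort.InferenceSession("student.int8.onnx",
                                   providers=["CPUExecutionProvider"])
    logits = session.run(["logits"],
                         {"input": np.random.randn(4, 32).astype(np.float32)})[0]
    print(logits.shape)  # (4, 10)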