Deep Learning Head Pose Estimation using PyTorch
Updated May 13, 2022 - Python
Transfer Learning for Neural Topic Models using Knowledge Distillation
Multi-class image classification using "Distilling Knowledge by Mimicking Features".
Vision Transformer for TensorFlow 2
Repository for the implementation of "Knowledge Distillation for Multi-task Learning"
Implementation of several neural network compression techniques (knowledge distillation, pruning, quantization, factorization), in Haiku.
Official code for the paper "Domain Generalization for Crop Segmentation with Knowledge Distillation"
Knowledge distillation PyTorch Lightning template for image classification tasks
Improving Question Answering Performance Using Knowledge Distillation and Active Learning
TensorFlow-based framework for 3D U-Net with knowledge distillation
Official implementation of "Intra-Class Similarity-Guided Feature Distillation" accepted in NeurIPS-ENLSP 2023
Comparing performance of a small transformer model with and without Knowledge Distillation
Code for the knowledge distillation work to enhance fine-grained disease recognition.
Deep learning based color transfer between images.
Official implementation of AAAI22 paper "ADD: Frequency Attention and Multi-View based Knowledge Distillation to Detect Low-Quality Compressed Deepfake Images"
cViL: Cross-Lingual Training of Vision-Language Models using Knowledge Distillation
Code for "Language Model Knowledge Distillation for Efficient Question Answering in Spanish" (ICLR 2024 Tiny Papers)
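The repositories above all build on the classic knowledge-distillation objective of Hinton et al.: a small student model is trained to match a large teacher's temperature-softened output distribution, blended with the usual cross-entropy on the ground-truth labels. A minimal sketch in PyTorch (the framework most of the listed projects use); the teacher/student architectures, temperature, and blending weight here are illustrative assumptions, not taken from any particular repository:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-target KD loss: KL divergence between the student's and the
    teacher's temperature-softened distributions, plus hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Illustrative teacher (large MLP) and student (small MLP) on 10 classes.
teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

x = torch.randn(8, 32)
labels = torch.randint(0, 10, (8,))
with torch.no_grad():            # teacher is frozen during distillation
    t_logits = teacher(x)
loss = distillation_loss(student(x), t_logits, labels)
loss.backward()                  # gradients flow only into the student
```

In practice the teacher is a pretrained network and only the student's parameters are updated; several of the repositories above extend this recipe with feature-level or attention-based distillation instead of logits alone.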