Pytorch implementation of various Knowledge Distillation (KD) methods.
Updated Nov 25, 2021 · Python
This project distills knowledge from DINOv2 (a Vision Transformer teacher) into convolutional student networks, yielding efficient visual representations at a lower computational cost.
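A minimal sketch of what ViT-to-CNN feature distillation can look like, assuming an embedding-level objective: a small CNN student is projected into the teacher's embedding dimension (384-d, matching DINOv2 ViT-S/14) and trained to match the frozen teacher's output with a cosine loss. All names (`StudentCNN`, `distillation_loss`) and the choice of loss are illustrative, not this repository's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# 384 matches the DINOv2 ViT-S/14 embedding size; adjust for other teachers.
TEACHER_DIM = 384

class StudentCNN(nn.Module):
    """Tiny illustrative CNN student; a real setup would use e.g. a ResNet."""
    def __init__(self, out_dim: int = TEACHER_DIM):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Linear projection into the teacher's embedding space.
        self.proj = nn.Linear(64, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.proj(h)

def distillation_loss(student_emb: torch.Tensor,
                      teacher_emb: torch.Tensor) -> torch.Tensor:
    # Cosine feature matching: 1 - cos(student, teacher), averaged over the
    # batch. A common embedding-distillation choice, assumed here.
    s = F.normalize(student_emb, dim=-1)
    t = F.normalize(teacher_emb, dim=-1)
    return (1.0 - (s * t).sum(dim=-1)).mean()

if __name__ == "__main__":
    x = torch.randn(4, 3, 224, 224)
    # Stand-in for frozen DINOv2 CLS embeddings; in practice these come from
    # the teacher under torch.no_grad().
    teacher_emb = torch.randn(4, TEACHER_DIM)
    student = StudentCNN()
    loss = distillation_loss(student(x), teacher_emb.detach())
    loss.backward()  # gradients flow only through the student
```

In a full training loop the teacher runs in `eval()` mode under `torch.no_grad()`, and only the student's parameters are passed to the optimizer.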