A template for use in creating Autodistill Target Model packages.
This is a fork of the distilling-step-by-step repository with the aim of creating a task-specific LLM distillation framework for healthcare.
Summer internship project @ JetBrains Research.
Efficient inference techniques for computer vision, implemented in PyTorch.
Distillation examples: making speaker recognition faster through different model compression techniques.
Distillation of GANs with fairness constraints
A PyTorch-based knowledge distillation toolkit for natural language processing
An implementation of the paper "Automated training of location-specific edge models for traffic counting".
Learn how to make a smaller network perform as well as a large ensemble model, accelerating inference.
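The entry above describes classic knowledge distillation (Hinton et al., 2015): the small student is trained on the ensemble's temperature-softened outputs alongside the hard labels. A minimal PyTorch sketch; the function name and the hyperparameters T and alpha are illustrative defaults, not taken from any repository listed here:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: match the teacher's temperature-softened distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-target gradients keep their magnitude
    # Hard targets: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

At inference time only the student runs, which is where the speedup comes from.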
Deep Mutual Learning in PaddlePaddle
[NCA] Learning Metric Space with Distillation for Large-Scale Multi-Label Text Classification
DINOv1 implementation in PyTorch.
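DINO trains without labels via self-distillation: a student network matches the output of a momentum (EMA) teacher across augmented views. A minimal sketch of the loss and teacher update, assuming both networks emit projection logits of shape (batch, dim); the centering buffer's own EMA update and the multi-crop augmentation are omitted, and all names are illustrative rather than this repository's API:

```python
import torch
import torch.nn.functional as F

def dino_loss(student_logits, teacher_logits, center, tau_s=0.1, tau_t=0.04):
    # The teacher output is centered and sharpened (low temperature), then
    # treated as a fixed target distribution for the student.
    t = F.softmax((teacher_logits - center) / tau_t, dim=-1).detach()
    s = F.log_softmax(student_logits / tau_s, dim=-1)
    return -(t * s).sum(dim=-1).mean()

@torch.no_grad()
def ema_update(student, teacher, momentum=0.996):
    # The teacher's weights are an exponential moving average of the student's.
    for ps, pt in zip(student.parameters(), teacher.parameters()):
        pt.mul_(momentum).add_(ps, alpha=1 - momentum)
```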
Data-Augmentation-Guided Knowledge Distillation for Environmental Sound Classification
Sub-Band-Guided Knowledge Distillation for Sound Classification
Model Distillation for Unlabeled and Imbalanced Amino Acid String Data
Implementation of several variations of the iCaRL incremental learning algorithm in PyTorch.
This repository implements three adversarial example attacks (FGSM, noise, and semantic attacks) and a defensive distillation approach to defend against the FGSM attack.
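FGSM, the attack mentioned above, perturbs each input pixel one epsilon-sized step in the direction that increases the loss. A minimal PyTorch sketch, assuming image inputs normalized to [0, 1]; the function name, model interface, and epsilon are placeholders, not this repository's actual code:

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon=0.03):
    # Track gradients with respect to the input, not the model weights.
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Step in the sign of the input gradient to maximally increase the loss.
    x_adv = x + epsilon * x.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()
```

Defensive distillation counters such gradient-based attacks by retraining the deployed model on the softened outputs of an identically structured model, which smooths the loss surface the attack exploits.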
Vision Transformer for TensorFlow 2.