Awesome Knowledge Distillation
Updated Mar 11, 2025
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In the NeurIPS 2020 workshop.
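For context, the generic soft-label knowledge distillation loss behind results like this can be sketched in a few lines of PyTorch. This is a minimal sketch, not MEAL V2's exact ensemble-plus-discriminator recipe; the temperature `T` and weight `alpha` are illustrative placeholders.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soften both distributions with a temperature and match them with KL divergence.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_probs = F.log_softmax(student_logits / T, dim=1)
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean") * (T * T)
    # Optionally keep a small cross-entropy term on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```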
irresponsible innovation. Try now at https://chat.dev/
[ICML 2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers
[ICML 2024] Exploration and Anti-exploration with Distributional Random Network Distillation
An R package providing functions for interpreting and distilling machine learning models
Knowledge distillation for training multi-exit models
Easily generate synthetic data for classification tasks using LLMs
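A minimal sketch of that idea, assuming the `openai` Python client and a hypothetical two-label sentiment task (model name and prompt are placeholders), might look like:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

LABELS = ["positive", "negative"]  # hypothetical label set for illustration

def generate_examples(label: str, n: int = 5) -> list[str]:
    """Ask the model for n short texts carrying the requested sentiment label."""
    prompt = (
        f"Write {n} short movie reviews with a {label} sentiment, "
        "one per line, without numbering."
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "user", "content": prompt}],
    )
    lines = resp.choices[0].message.content.splitlines()
    return [line.strip() for line in lines if line.strip()]

# Build a labeled synthetic dataset for classifier training.
dataset = [(text, label) for label in LABELS for text in generate_examples(label)]
```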
The goal is to remove these light constituents by distillation (flash or stripping). A preliminary study of the operating conditions of the process can be carried out on a pseudo-binary basis: the C7 cut is approximated as n-heptane and the light ends as ethane. We wish to construct the [T-x-y], [x-y], and [h-x-y] diagrams of the ethane-n-heptane binary u…
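As a rough illustration of the [x-y] construction only, one can plot the equilibrium curve under an assumed constant relative volatility. The alpha value below is a placeholder, since the real ethane/n-heptane system at process pressure is strongly non-ideal.

```python
import numpy as np
import matplotlib.pyplot as plt

alpha = 20.0  # placeholder relative volatility of ethane over n-heptane; illustrative only

x = np.linspace(0.0, 1.0, 101)             # liquid mole fraction of ethane
y = alpha * x / (1.0 + (alpha - 1.0) * x)  # vapour mole fraction from constant-alpha equilibrium

plt.plot(x, y, label="equilibrium curve")
plt.plot(x, x, "k--", label="y = x")
plt.xlabel("x (ethane, liquid)")
plt.ylabel("y (ethane, vapour)")
plt.legend()
plt.show()
```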
A transformer-based masked language model for learning amino acid sequence representations. The model uses self-attention mechanisms with custom gating and incorporates protein features for enhanced sequence understanding. Trained using BERT-style masking on peptide sequences to learn contextual amino acid embeddings.
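A minimal sketch of the BERT-style masking step described here, assuming the standard 20-residue alphabet, a 15% masking rate, and the usual 80/10/10 replacement split (all placeholders, not the repository's actual settings), could be:

```python
import random

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")  # 20 standard residues
MASK_TOKEN = "<mask>"

def mask_sequence(seq: str, mask_prob: float = 0.15):
    """Return (masked tokens, labels); labels are None where no prediction is required."""
    tokens, labels = [], []
    for aa in seq:
        if random.random() < mask_prob:
            labels.append(aa)                              # the model must recover this residue
            r = random.random()
            if r < 0.8:
                tokens.append(MASK_TOKEN)                  # 80%: replace with the mask token
            elif r < 0.9:
                tokens.append(random.choice(AMINO_ACIDS))  # 10%: replace with a random residue
            else:
                tokens.append(aa)                          # 10%: keep the original residue
        else:
            tokens.append(aa)
            labels.append(None)
    return tokens, labels

print(mask_sequence("MKTAYIAKQR"))
```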
Code reproduction of the paper Distillation Decision Tree
This repository combines the bubble-point algorithm and the Naphtali-Sandholm algorithm to robustly compute distillation separations with partial condensers
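The inner bubble-point calculation such a solver iterates on can be sketched as follows, assuming ideal (Raoult's law) K-values and textbook Antoine constants for an illustrative ethanol-water pair rather than the solver's actual thermodynamic models:

```python
from scipy.optimize import brentq

# Textbook Antoine constants (log10 Psat[mmHg] = A - B / (C + T[degC])); illustrative only.
ANTOINE = {"ethanol": (8.20417, 1642.89, 230.300),
           "water":   (8.07131, 1730.63, 233.426)}

def psat(component, T):
    A, B, C = ANTOINE[component]
    return 10 ** (A - B / (C + T))  # saturation pressure in mmHg

def bubble_point(x, P=760.0):
    """Solve sum_i K_i * x_i = 1 for T, with K_i = Psat_i(T) / P (Raoult's law)."""
    components = list(ANTOINE)
    f = lambda T: sum(xi * psat(c, T) / P for xi, c in zip(x, components)) - 1.0
    return brentq(f, 20.0, 120.0)  # bracket the root between 20 and 120 degC

print(bubble_point([0.4, 0.6]))  # bubble temperature of a 40/60 ethanol-water liquid
```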
Improving Typhoon Center Location Models with Augmented Typhoon Images and Distillation Methods
CISPA Summer Internship
Step-by-step tutorial on how to embed content for GPT-4 using Azure Web Services
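A hedged sketch of the embedding step such a tutorial covers, assuming the `openai` Python SDK pointed at an Azure OpenAI resource (the endpoint, key, API version, and deployment names below are placeholders), might be:

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
    api_key="<your-key>",                                        # placeholder
    api_version="2024-02-01",                                    # placeholder version string
)

# Embed the content chunks you want a GPT-4 deployment to ground its answers on.
docs = ["First chunk of content...", "Second chunk of content..."]
resp = client.embeddings.create(model="<embedding-deployment-name>", input=docs)
vectors = [d.embedding for d in resp.data]
# At query time: embed the question, retrieve the nearest chunks (e.g. cosine similarity),
# and pass them as context to a GPT-4 deployment via client.chat.completions.create(...).
```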