A template for use in creating Autodistill Target Model packages.
PyTorch implementation of various distillation approaches for continual learning of Diffusion Models.
Using Knowledge Graph to Query Resume
Distillation and some other iterative methods for fastText.
This is a fork of the distilling-step-by-step repository with the aim of creating a task-specific LLM distillation framework for healthcare.
Code for our paper DistilALHuBERT: A Distilled Parameter Sharing Audio Representation Model
A list of papers, docs, and code about diffusion distillation. This repo collects various distillation methods for diffusion models. PRs adding works (papers, repositories) the repo has missed are welcome.
MATLAB program that models a binary flash distillation column by calculating vapor-liquid equilibrium with the Antoine equation. Determines liquid and vapor product flow rates, compositions, and temperatures from given feed conditions such as pressure, temperature, and composition. Plots a T-x-y diagram.
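The core of such a column model is Antoine vapor pressures combined with Raoult's law for an ideal binary mixture. A minimal Python sketch (the Antoine constants below are textbook values for benzene/toluene in mmHg and deg C, assumed for illustration and not taken from the repository):

```python
def antoine_psat(T, A, B, C):
    """Saturation pressure (mmHg) from the Antoine equation:
    log10(Psat) = A - B / (C + T), with T in deg C."""
    return 10 ** (A - B / (C + T))

# Assumed textbook Antoine constants (mmHg, deg C) for illustration.
BENZENE = (6.90565, 1211.033, 220.790)
TOLUENE = (6.95464, 1344.800, 219.482)

def raoult_xy(T, P_total, light, heavy):
    """Equilibrium liquid (x) and vapor (y) mole fractions of the light
    component at temperature T and total pressure P_total, assuming
    Raoult's law for an ideal binary mixture (one point on a T-x-y curve)."""
    p1 = antoine_psat(T, *light)   # vapor pressure of the light component
    p2 = antoine_psat(T, *heavy)   # vapor pressure of the heavy component
    x = (P_total - p2) / (p1 - p2)  # liquid mole fraction from P = x*p1 + (1-x)*p2
    y = x * p1 / P_total            # vapor mole fraction from Raoult's law
    return x, y
```

Sweeping T between the two pure-component boiling points and collecting (x, y) pairs traces out the T-x-y diagram the repository plots.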
Static and animated distillation phase diagrams for chemistry education
Vision Transformer for TensorFlow 2
A tutorial on how to prune the embedding layer of a language model and craft a suitable tokenizer
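The pruning idea reduces to keeping only the embedding rows for the tokens you retain and rebuilding the token-to-id mapping to match. A minimal NumPy sketch (function and variable names are illustrative, not from the tutorial):

```python
import numpy as np

def prune_embeddings(emb, vocab, kept_tokens):
    """Keep only the embedding rows for kept_tokens and rebuild the
    token->id mapping so ids index into the pruned matrix.

    emb         : (V, d) embedding matrix
    vocab       : dict mapping token -> old row index
    kept_tokens : tokens to retain, in the order the new vocab will use
    """
    new_vocab = {tok: i for i, tok in enumerate(kept_tokens)}
    rows = [emb[vocab[tok]] for tok in kept_tokens]  # gather surviving rows
    return np.stack(rows), new_vocab
```

In practice the same remapping must also be applied to the tokenizer's vocabulary file (and the tied output head, if the model shares weights), which is why the tutorial pairs pruning with crafting a matching tokenizer.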
Chemical Engineering application: Distillation calculator for McCabe-Thiele and Ponchon-Savarit methods. https://apguilherme.github.io/Distillation/
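The McCabe-Thiele construction alternates between the operating line and the equilibrium curve. A minimal sketch of stepping off ideal stages at total reflux (where the operating line is y = x), using a constant relative volatility assumed for illustration — the calculator itself handles the full graphical constructions:

```python
def eq_y(x, alpha):
    """Equilibrium vapor composition for constant relative volatility alpha."""
    return alpha * x / (1 + (alpha - 1) * x)

def eq_x(y, alpha):
    """Liquid composition in equilibrium with vapor y (inverse of eq_y)."""
    return y / (alpha - (alpha - 1) * y)

def stages_total_reflux(xD, xB, alpha):
    """Count ideal stages from distillate xD down to bottoms xB at total
    reflux, alternating equilibrium steps with the y = x operating line."""
    n, x = 0, xD
    while x > xB:
        x = eq_x(x, alpha)  # horizontal step to the equilibrium curve
        n += 1              # the y = x operating line makes the vertical step
    return n
```

With a finite reflux ratio the vertical step would instead use the rectifying line y = R/(R+1)·x + xD/(R+1), switching to the stripping line below the feed stage.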
Computer Science 791-025: Real-Time AI & High-Performance Machine Learning
Efficient Inference techniques implemented in PyTorch for computer vision.
[ICCV 2019] A Comprehensive Overhaul of Feature Distillation
Implementation code for GKD: Semi-supervised Graph Knowledge Distillation for Graph-Independent Inference, accepted at Medical Image Computing and Computer Assisted Intervention (MICCAI 2021)
🤖 [MICCAI 2023] The official repository for the paper "L3DMC: Lifelong Learning using Distillation via Mixed-Curvature Space"
Prompt engineering for developers
Lightweight knowledge distillation pipeline
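The standard building block of such a pipeline is the Hinton-style soft-target loss: the student matches the teacher's temperature-softened output distribution. A minimal NumPy sketch (not this repository's implementation; the T² scaling keeps gradient magnitudes comparable across temperatures):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax, numerically stabilized."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target distillation loss: T^2 * KL(teacher_soft || student_soft)."""
    p = softmax(teacher_logits, T)  # teacher's softened distribution
    q = softmax(student_logits, T)  # student's softened distribution
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In training, this term is typically mixed with the ordinary cross-entropy on hard labels, weighted by a hyperparameter.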
The official implementation of MeDQN algorithm.