DINOv1 implementation in PyTorch (updated Jun 30, 2024; Python)
The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty
PaddleSlim is an open-source library for deep model compression and architecture search.
Official implementation of ⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation
[CVPRW 2021] Rethinking Ensemble-Distillation for Semantic Segmentation Based Unsupervised Domain Adaptation
Distibot (DISTIller roBOT) is a Python program for the Raspberry Pi (Raspbian) that controls the whole distillation process
[CVPR 2024] Asymmetric Masked Distillation for Pre-Training Small Foundation Models
The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection"
irresponsible innovation. Try now at https://chat.dev/
InsightFace Keras implementation
PyTorch implementation of the ACCV'18 paper "Revisiting Distillation and Incremental Classifier Learning."
[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
[AAAI 2024] MESED: A Multi-modal Entity Set Expansion Dataset with Fine-grained Semantic Classes and Hard Negative Entities
(Interspeech 2023 & ICASSP 2024) Official repository for ARMHuBERT and STaRHuBERT
Python code implementing LLM4Teach, a policy distillation approach for teaching reinforcement learning agents with a Large Language Model
Code for "On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models"
PyTorch implementation of various distillation approaches for continual learning of Diffusion Models.
🤖[MICCAI 2023] The official repository for paper "L3DMC: Lifelong Learning using Distillation via Mixed-Curvature Space"
Official Code Base for ICLR 2024 paper Enhancing Tail Performance in Extreme Classifiers by Label Variance Reduction