knowledge-distillation
Here are 360 public repositories matching this topic...
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
Updated Jun 5, 2024 - Python
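For context on what the low-bit schemes above involve, here is a minimal sketch of symmetric per-tensor INT8 weight quantization, the simplest of the listed formats. This is a generic illustration, not the repository's actual API.

```python
# Minimal sketch: symmetric per-tensor INT8 quantization (generic, not the repo's API).
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float weights onto the signed 8-bit range [-127, 127]."""
    scale = max(np.max(np.abs(weights)) / 127.0, 1e-12)  # one scale per tensor
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
print("max abs error:", np.max(np.abs(w - dequantize_int8(q, s))))
```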
Object-Completion Tools for X-Ray Distillation framework
Updated Jun 3, 2024 - Python
OpenMMLab Model Compression Toolbox and Benchmark.
Updated Jun 3, 2024 - Python
Decoupled Kullback-Leibler Divergence Loss (DKL)
Updated Jun 2, 2024 - Python
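The entry above concerns a decoupled variant of the KL loss; for context, here is a minimal sketch of the standard temperature-scaled KL distillation objective that such work builds on. The decomposition proposed in the paper itself is not reproduced here.

```python
# Standard temperature-scaled KD loss (Hinton et al.); DKL's decoupling is not shown.
import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits: torch.Tensor,
               teacher_logits: torch.Tensor,
               T: float = 4.0) -> torch.Tensor:
    """KL(teacher || student) on temperature-softened distributions.

    The T**2 factor keeps gradient magnitudes comparable across temperatures.
    """
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
```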
A curated list for Efficient Large Language Models
Updated May 31, 2024 - Python
The code and dataset of paper *Multi-View Fusion and Distillation for Subgrade Distresses Detection based on 3D-GPR*
Updated May 30, 2024 - Python
A treasure chest for visual classification and recognition powered by PaddlePaddle
Updated Jun 4, 2024 - Python
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Updated May 28, 2024 - Python
Implementation code for our paper "Learning Generalizable Models for Vehicle Routing Problems via Knowledge Distillation", accepted at NeurIPS 2022.
Updated May 27, 2024 - Python
[CVPR 2024] Official PyTorch Code for "PromptKD: Unsupervised Prompt Distillation for Vision-Language Models"
Updated May 25, 2024 - Python
An Extendible (General) Continual Learning Framework based on PyTorch - official codebase of Dark Experience for General Continual Learning
Updated May 24, 2024 - Python
[arXiv'24] The official implementation code of LLM-ESR.
Updated May 23, 2024 - Python
Full Wiki enables seamless access to Wikipedia content in multiple languages. It translates English Wikipedia, the most comprehensive knowledge base, into other languages, so users do not need to know the translated search term. This project is a proof of concept of how LLMs can tear down language barriers.
Updated May 22, 2024 - Python
[AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation"
Updated May 21, 2024 - Python
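The paper above learns the distillation temperature adversarially over the course of training; the fixed cosine schedule below is only a minimal sketch of the general curriculum idea, not the paper's method.

```python
# Illustrative curriculum over the KD temperature: anneal from a soft to a
# sharper teacher distribution as training progresses. The paper itself learns
# the temperature adversarially; this fixed schedule is only a sketch.
import math

def curriculum_temperature(epoch: int, total_epochs: int,
                           t_start: float = 8.0, t_end: float = 2.0) -> float:
    """Cosine-anneal the KD temperature from t_start down to t_end."""
    progress = epoch / max(total_epochs - 1, 1)
    return t_end + 0.5 * (t_start - t_end) * (1 + math.cos(math.pi * progress))

# Example: plug the scheduled temperature into the KD loss each epoch.
for epoch in range(5):
    print(f"epoch {epoch}: T = {curriculum_temperature(epoch, 5):.2f}")
```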
Code for CVPR'24 Paper: Segment Any Event Streams via Weighted Adaptation of Pivotal Tokens
Updated May 20, 2024 - Python
[CVPR 2024] Source code for "Diffusion-Based Adaptation for Classification of Unknown Degraded Images".
Updated May 15, 2024 - Python
An implementation of the KAN architecture using learnable activation functions for knowledge distillation on the MNIST handwritten digits dataset. The project demonstrates distilling a three-layer teacher KAN model into a more compact two-layer student model, comparing the performance of the distilled student against a non-distilled baseline.
Updated May 11, 2024 - Python
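A minimal sketch of one distillation training step of the kind described above: a compact student trained on a blend of hard-label cross-entropy and soft targets from a frozen teacher. Here `teacher` and `student` are hypothetical stand-ins for the repository's three- and two-layer KAN models.

```python
# Generic teacher->student distillation step; `teacher`/`student` are
# hypothetical stand-ins for the repo's KAN models, not its actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distill_step(student: nn.Module, teacher: nn.Module,
                 x: torch.Tensor, y: torch.Tensor,
                 optimizer: torch.optim.Optimizer,
                 T: float = 4.0, alpha: float = 0.7) -> float:
    """One optimizer step mixing soft (KL) and hard (CE) losses."""
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(x)                      # frozen teacher targets
    s_logits = student(x)
    soft = F.kl_div(F.log_softmax(s_logits / T, dim=-1),
                    F.softmax(t_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(s_logits, y)
    loss = alpha * soft + (1 - alpha) * hard
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```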
Attention-guided Feature Distillation for Semantic Segmentation
Updated May 11, 2024 - Python
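Feature distillation of the kind named above can be illustrated with the generic attention-transfer recipe: spatial attention maps derived from teacher and student feature maps are normalized and matched with an L2 loss. This is a sketch of that general recipe, not necessarily the paper's exact formulation.

```python
# Generic attention-map distillation (attention-transfer style), not
# necessarily the paper's exact loss.
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    """Collapse a (B, C, H, W) feature map to a normalized (B, H*W) attention map."""
    a = feat.pow(2).sum(dim=1).flatten(1)   # sum of squared activations over channels
    return F.normalize(a, p=2, dim=1)       # unit L2 norm per sample

def attention_distill_loss(student_feat: torch.Tensor,
                           teacher_feat: torch.Tensor) -> torch.Tensor:
    """L2 distance between normalized student and teacher attention maps."""
    if student_feat.shape[-2:] != teacher_feat.shape[-2:]:
        # hypothetical handling: resize student features to the teacher's grid
        student_feat = F.interpolate(student_feat, size=teacher_feat.shape[-2:],
                                     mode="bilinear", align_corners=False)
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```

Because the attention map sums over channels, teacher and student may have different channel widths; only the spatial grids need to be reconciled.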