A personal knowledge 🧠 base used to distill knowledge into atomic documents 📄 using Logseq
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
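For context, the INT8 mode in such toolkits boils down to mapping float tensors onto a small integer grid. A minimal PyTorch sketch of symmetric per-tensor INT8 quantization (an illustration of the idea, not the library's actual API):

```python
import torch

def quantize_int8(w: torch.Tensor):
    # Symmetric per-tensor quantization: map the range [-max|w|, max|w|]
    # onto the signed 8-bit grid [-127, 127].
    scale = w.abs().max() / 127.0
    q = torch.clamp(torch.round(w / scale), -127, 127).to(torch.int8)
    return q, scale

def dequantize_int8(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    # Recover a float approximation of the original tensor.
    return q.float() * scale

w = torch.randn(4, 4)
q, scale = quantize_int8(w)
print((w - dequantize_int8(q, scale)).abs().max())  # quantization error
```

Real toolkits add per-channel scales, zero points for asymmetric ranges, and calibration over activation statistics; the sketch shows only the core rounding step.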
Gather research papers, corresponding code (where available), reading notes, and other related materials about hot 🔥🔥🔥 fields in deep-learning-based computer vision.
A treasure chest for visual classification and recognition powered by PaddlePaddle
[CVPR 2024] Source code for "Diffusion-Based Adaptation for Classification of Unknown Degraded Images".
A beginner-friendly introductory tutorial on model compression
Knowledge Distillation from VGG16 (teacher model) to MobileNet (student model)
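A teacher-student setup like this typically trains the student on a weighted mix of soft teacher targets and hard labels (Hinton et al., 2015). A minimal PyTorch sketch of that loss; the temperature T and weight alpha are illustrative defaults, not values taken from the repo:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-softened
    # teacher and student distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients to match the hard-label term
    # Hard-target term: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```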
A curated list for Efficient Large Language Models
An implementation of the KAN architecture using learnable activation functions for knowledge distillation on the MNIST handwritten digits dataset. The project distills a three-layer teacher KAN into a more compact two-layer student and compares the distilled student's performance against a non-distilled baseline.
Attention-guided Feature Distillation for Semantic Segmentation
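Feature-level distillation of this kind often matches spatial attention maps rather than raw activations. A sketch in the spirit of attention transfer (Zagoruyko & Komodakis, 2017), which may differ from this paper's exact formulation:

```python
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    # Collapse a (B, C, H, W) feature map into a (B, H*W) spatial
    # attention map: sum squared activations over channels, then
    # L2-normalize per sample.
    a = feat.pow(2).sum(dim=1).flatten(1)
    return F.normalize(a, dim=1)

def attention_distillation_loss(student_feat, teacher_feat):
    # Penalize the distance between student and teacher attention maps.
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```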
Code for CVPR'24 Paper: Segment Any Event Streams via Weighted Adaptation of Pivotal Tokens
[CVPR 2024] Official PyTorch Code for "PromptKD: Unsupervised Prompt Distillation for Vision-Language Models"
[ICCV 2023] - Zero-shot Composed Image Retrieval with Textual Inversion
AI book for everyone
[CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation
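The core idea is to z-score each sample's logits before the temperature softmax, so the student matches the teacher's logit ranking rather than its absolute scale. A condensed sketch of that idea; see the repo for the official implementation:

```python
import torch
import torch.nn.functional as F

def zscore(logits: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    # Standardize each sample's logits to zero mean and unit variance.
    mean = logits.mean(dim=-1, keepdim=True)
    std = logits.std(dim=-1, keepdim=True)
    return (logits - mean) / (std + eps)

def standardized_kd_loss(student_logits, teacher_logits, T=2.0):
    s = F.log_softmax(zscore(student_logits) / T, dim=-1)
    t = F.softmax(zscore(teacher_logits) / T, dim=-1)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)
```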
NLP, knowledge distillation, pruning
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and serve as benchmarks.
The application of knowledge distillation-based Mask R-CNN for object detection and instance segmentation