[ICML2024 (Oral)] Official PyTorch implementation of DoRA: Weight-Decomposed Low-Rank Adaptation
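The decomposition DoRA is named for splits a weight into a per-column magnitude and a unit-norm direction, with the low-rank (LoRA-style) update applied to the direction. A minimal NumPy sketch of that merge step, with toy shapes and names of our own choosing (the official repo above is the authoritative implementation):

```python
import numpy as np

def dora_merge(W0, B, A, m):
    """Sketch of a DoRA-style merged weight:
    W' = m * (W0 + B @ A) / ||W0 + B @ A||_col
    where m is a learnable per-column magnitude vector and
    B @ A is the low-rank directional update."""
    V = W0 + B @ A                                   # direction + low-rank update
    col_norms = np.linalg.norm(V, axis=0, keepdims=True)
    return m * (V / col_norms)                       # rescale each column by m

# toy example: d_out x d_in weight, rank-r adapter
rng = np.random.default_rng(0)
d_out, d_in, r = 6, 4, 2
W0 = rng.standard_normal((d_out, d_in))
B = np.zeros((d_out, r))                             # LoRA-style init: B = 0, so BA = 0
A = rng.standard_normal((r, d_in))
m = np.linalg.norm(W0, axis=0, keepdims=True)        # magnitude init: column norms of W0

W_merged = dora_merge(W0, B, A, m)                   # equals W0 at initialization
```

With `B` initialized to zero, the merged weight reproduces the pretrained `W0` exactly, so training starts from the original model — the same property LoRA relies on.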
[ICCV 2023] Binary Adapters, [AAAI 2023] FacT, [Tech report] Convpass
Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning
Official implementation for CVPR'23 paper "BlackVIP: Black-Box Visual Prompting for Robust Transfer Learning"
Official implementation of AAAI 2023 paper "Parameter-efficient Model Adaptation for Vision Transformers"
On Transferability of Prompt Tuning for Natural Language Processing
[ICLR 2024] This is the repository for the paper titled "DePT: Decomposed Prompt Tuning for Parameter-Efficient Fine-tuning"
[NeurIPS2023] Parameter-efficient Tuning of Large-scale Multimodal Foundation Model
Code for paper "UniPELT: A Unified Framework for Parameter-Efficient Language Model Tuning", ACL 2022
[CVPR2024] The code of "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"
[ICML 2024] Official code for the paper "Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark".
ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse
[Preprint] PyTorch implementation of AdaVAE: Exploring Adaptive GPT-2s in VAEs for Language Modeling
Multi-domain Recommendation with Adapter Tuning
Code for the Findings of NAACL 2022 (Long Paper): AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks
[MICCAI ISIC Workshop 2023 (best paper)] AViT: Adapting Vision Transformers for Small Skin Lesion Segmentation Datasets (an official implementation)
[NeurIPS-2022] Annual Conference on Neural Information Processing Systems
AlpaGasus2-QLoRA: LLaMA2 fine-tuned with the AlpaGasus mechanism using QLoRA
PANDA: Prompt Transfer Meets Knowledge Distillation for Efficient Model Adaptation
Official implementation of CVPR 2024 paper "Prompt Learning via Meta-Regularization".