This project is an implementation of the paper: "Parameter-Efficient Transfer Learning for NLP", Houlsby et al. (Google), ICML 2019.
Master's thesis: "Comparing Modular Approaches for Parameter-Efficient Fine-Tuning"
Code for SAFT: Self-Attention Factor-Tuning, a 16x more efficient solution for fine-tuning neural networks
The code for generating natural distribution shifts on image and text datasets.
Low Tensor Rank adaptation of large language models
The code for the paper "Instance-aware Dynamic Prompt Tuning for Pre-trained Point Cloud Models" (ICCV'23).
Code for fine-tuning Llama2 LLM with custom text dataset to produce film character styled responses
Code for EACL'23 paper "Udapter: Efficient Domain Adaptation Using Adapters"
Applied Deep Learning 深度學習之應用 by Vivian Chen 陳縕儂 at NTU CSIE
Easy wrapper for inserting LoRA layers in CLIP.
PANDA: Prompt Transfer Meets Knowledge Distillation for Efficient Model Adaptation
Official implementation of CVPR 2024 paper "Prompt Learning via Meta-Regularization".
AlpaGasus2-QLoRA: the AlpaGasus mechanism applied to LLaMA2, fine-tuned with QLoRA
[NeurIPS-2022] Annual Conference on Neural Information Processing Systems
[MICCAI ISIC Workshop 2023 (best paper)] AViT: Adapting Vision Transformers for Small Skin Lesion Segmentation Datasets (an official implementation)
Code for the Findings of NAACL 2022 (Long Paper): "AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks"
Multi-domain Recommendation with Adapter Tuning
[Preprint] PyTorch implementation of "AdaVAE: Exploring Adaptive GPT-2s in VAEs for Language Modeling"
ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse
[ICML 2024] Official code for the paper "Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark".
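Several of the repositories above apply LoRA-style low-rank adaptation, where a frozen weight matrix W receives a trainable update (alpha / r) * B @ A with rank r much smaller than the layer dimensions. A minimal, dependency-free sketch of that update (all names here are illustrative, not any listed library's API):

```python
# Minimal sketch of the LoRA weight update: W_eff = W + (alpha / r) * B @ A.
# A (r x d_in) and B (d_out x r) are the only trainable matrices; W stays frozen.
# Matrices are plain lists of rows to keep the example dependency-free.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def lora_weight(W, A, B, alpha=1.0):
    """Return the effective weight W + (alpha / r) * (B @ A)."""
    r = len(A)                       # LoRA rank = number of rows in A
    delta = matmul(B, A)             # low-rank update, shape d_out x d_in
    scale = alpha / r
    return [[w + scale * d for w, d in zip(w_row, d_row)]
            for w_row, d_row in zip(W, delta)]

# Example: 2x2 frozen identity weight, rank-1 adapter
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 2.0]]                     # r x d_in = 1 x 2
B = [[1.0], [0.0]]                   # d_out x r = 2 x 1
W_eff = lora_weight(W, A, B, alpha=1.0)
# W_eff == [[2.0, 2.0], [0.0, 1.0]]
```

In practice B is initialized to zeros so that W_eff starts equal to W, and only A and B are updated during fine-tuning, which is what makes the method parameter-efficient.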