🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. (Python; updated Aug 7, 2024)
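Parameter-efficient methods like the LoRA technique implemented in PEFT freeze the pretrained weight matrix W and train only a low-rank update BA on top of it. A toy numpy sketch of the idea (shapes and names are illustrative, not PEFT's actual API):

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, r = 64, 64, 4                  # layer dims; low rank r << d
W = rng.normal(size=(d, k))          # frozen pretrained weight
A = rng.normal(size=(r, k)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                 # trainable up-projection, zero-init

def forward(x):
    # Adapted layer: original path plus the low-rank correction B @ A.
    return x @ (W + B @ A).T

x = rng.normal(size=(1, k))
# With B zero-initialized, the adapted layer matches the base layer exactly,
# so training starts from the pretrained behavior.
assert np.allclose(forward(x), x @ W.T)

# Only A and B are trained: r*(d+k) parameters instead of d*k.
print(A.size + B.size, "trainable vs", W.size, "frozen")
```

The zero initialization of B is the standard LoRA trick: the adapter contributes nothing at step zero, then learns a task-specific correction with far fewer trainable parameters than full fine-tuning.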
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
[ICLR 2023 Spotlight] Vision Transformer Adapter for Dense Predictions
Adapting Segment Anything Model for Medical Image Segmentation
Adapting Meta AI's Segment Anything to Downstream Tasks with Adapters and Prompts
[NeurIPS 2022] Implementation of "AdaptFormer: Adapting Vision Transformers for Scalable Visual Recognition"
Design patterns implemented in Python; this is the source code for the book Everybody Know Design Patterns.
Official repository for the ICLR 2024 paper "Towards Seamless Adaptation of Pre-trained Models for Visual Place Recognition".
codelab_adapter extensions
ACM MM'23 (oral). SUR-adapter lets pre-trained diffusion models acquire the semantic understanding and reasoning capabilities of large language models, building a high-quality textual semantic representation for text-to-image generation.
Official repository for the CVPR 2024 paper "CricaVPR: Cross-image Correlation-aware Representation Learning for Visual Place Recognition".
Diffusion LoRA tutorial in Chinese (a Chinese-language tutorial on training a virtual idol).
A versatile sequenced read processor for nanopore direct RNA sequencing
[CVPR 2024] Memory-based Adapters for Online 3D Scene Perception
[ICCV 2023 oral] This is the official repository for our paper "Sensitivity-Aware Visual Parameter-Efficient Fine-Tuning".
Use QLoRA to tune LLM in PyTorch-Lightning w/ Huggingface + MLflow
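QLoRA keeps the frozen base weights in 4-bit precision and trains only LoRA adapters on top, which is what makes single-GPU LLM tuning feasible. A toy numpy sketch of the quantization half of the idea, using simple absmax quantization (a deliberate simplification; the real method uses blockwise NF4 quantization via bitsandbytes):

```python
import numpy as np

def quantize_4bit(w):
    # Absmax quantization to 16 signed levels (integer values in [-8, 7]).
    scale = np.abs(w).max() / 7.0
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(1)
W = rng.normal(size=(128, 128)).astype(np.float32)  # stand-in base weight

q, s = quantize_4bit(W)
W_hat = dequantize(q, s)

# The frozen base stays quantized (4 bits per weight instead of 32);
# only the small LoRA matrices are kept and trained in full precision.
max_err = np.abs(W - W_hat).max()
```

The maximum reconstruction error of absmax rounding is half a quantization step (scale / 2), which is the trade-off the trainable LoRA correction then compensates for during fine-tuning.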
A generalized framework for subspace tuning methods in parameter-efficient fine-tuning.