🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
[ICLR 2023 Spotlight] Vision Transformer Adapter for Dense Predictions
Adapting Segment Anything Model for Medical Image Segmentation
Adapting Meta AI's Segment Anything to Downstream Tasks with Adapters and Prompts
[NeurIPS 2022] Implementation of "AdaptFormer: Adapting Vision Transformers for Scalable Visual Recognition"
Design patterns implemented in Python; this is the source code for the book Everybody Knows Design Patterns.
codelab_adapter extensions
Official repository for the ICLR 2024 paper "Towards Seamless Adaptation of Pre-trained Models for Visual Place Recognition".
ACM MM'23 (oral): SUR-adapter lets pre-trained diffusion models acquire the semantic understanding and reasoning capabilities of large language models, building high-quality textual semantic representations for text-to-image generation.
Official repository for the CVPR 2024 paper "CricaVPR: Cross-image Correlation-aware Representation Learning for Visual Place Recognition".
Diffusion LoRA tutorial in Chinese; a Chinese-language guide to training a virtual idol.
A versatile sequenced read processor for nanopore direct RNA sequencing
[ICCV 2023 oral] This is the official repository for our paper "Sensitivity-Aware Visual Parameter-Efficient Fine-Tuning".
[CVPR 2024] Memory-based Adapters for Online 3D Scene Perception
Use QLoRA to fine-tune an LLM in PyTorch Lightning with Hugging Face + MLflow
Code for the ACL 2022 paper "Continual Sequence Generation with Adaptive Compositional Modules"