This project is an implementation of the paper: Parameter-Efficient Transfer Learning for NLP, Houlsby et al. (Google), ICML 2019.
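The Houlsby paper above inserts small bottleneck "adapter" modules with residual connections into each transformer layer and trains only those. A minimal NumPy sketch of one adapter, assuming illustrative dimensions and a ReLU nonlinearity (the paper's code and exact activation may differ; the near-identity initialization follows the paper's description):

```python
import numpy as np

def houlsby_adapter(x, W_down, W_up):
    """Bottleneck adapter: project down, apply nonlinearity, project up, add residual."""
    h = x @ W_down              # (d_model,) -> (bottleneck,)
    h = np.maximum(h, 0.0)      # ReLU nonlinearity (illustrative choice)
    return x + h @ W_up         # residual connection back to d_model

rng = np.random.default_rng(0)
d_model, bottleneck = 768, 64              # adapter adds ~2*d_model*bottleneck params
W_down = rng.normal(0, 0.01, (d_model, bottleneck))
W_up = np.zeros((bottleneck, d_model))     # near-identity init: adapter starts as a no-op
x = rng.normal(size=(d_model,))
y = houlsby_adapter(x, W_down, W_up)
assert np.allclose(y, x)                   # with W_up = 0, the adapter passes x through
```

Because only `W_down` and `W_up` are trained, the per-task parameter count is a small fraction of the frozen model's.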
CausalLM for Python docstring documentation
A simple trainer for efficiently fine-tuning large models on different tasks
Comparison of different adaptation methods on PEFT for fine-tuning downstream tasks or benchmarks.
Enhancing Large Language Models' Utility for Medical Question-Answering: A Patient Health Question Summarization Approach
High-quality image generation model, part of the NGC Models collection by @prithivmlmods
Low Tensor Rank adaptation of large language models
This is a task scheduling simulation platform for general heterogeneous computing platforms.
Parameter Efficient Fine-tuning of Self-supervised ViTs without Catastrophic Forgetting
Code for fine-tuning Llama2 LLM with custom text dataset to produce film character styled responses
A Python app with a CLI for local inference and testing of open-source text-generation LLMs. Test any community transformer model, such as GPT-J, Pythia, BLOOM, LLaMA, Vicuna, or Alpaca, or any other model supported by Hugging Face's transformers, and run it locally on your computer without third-party paid APIs or keys.
Clipora is a powerful toolkit for fine-tuning OpenCLIP models using Low Rank Adapters (LoRA).
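Low Rank Adapters (LoRA), as used by toolkits like Clipora, freeze a pretrained weight matrix W and learn only a low-rank update BA, so the adapted layer computes Wx + (alpha/r)·BAx. A minimal NumPy sketch, with illustrative dimensions and scaling (not Clipora's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 512, 512, 8            # rank r much smaller than the layer dimensions

W = rng.normal(size=(d_out, d_in))       # frozen pretrained weight (not trained)
A = rng.normal(0, 0.01, (r, d_in))       # trainable down-projection
B = np.zeros((d_out, r))                 # zero init: the update starts at exactly 0
alpha = 16.0                             # scaling hyperparameter; effective scale alpha/r

def lora_forward(x):
    """Adapted linear layer: frozen path plus scaled low-rank update."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
assert np.allclose(lora_forward(x), W @ x)   # with B = 0, output matches the frozen layer
```

Only A and B are trained: r·(d_in + d_out) parameters instead of d_in·d_out for full fine-tuning, and the update can be merged into W after training.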
Official code of EFFT (arXiv:2311.06749)
Discrete Bayesian optimization with LLMs, PEFT finetuning methods, and the Laplace approximation.
Official code implementation of the paper AntGPT: Can Large Language Models Help Long-term Action Anticipation from Videos?
IISAN: Efficiently Adapting Multimodal Representation for Sequential Recommendation with Decoupled PEFT