peft
Here are 135 public repositories matching this topic...
Unify Efficient Fine-Tuning of 100+ LLMs
Updated Jun 8, 2024 - Python
Speech, Language, Audio, and Music Processing with Large Language Models
Updated Jun 7, 2024 - Python
An efficient, flexible, and full-featured toolkit for fine-tuning LLMs (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
Updated Jun 7, 2024 - Python
Firefly: a training tool for large models, supporting Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large models
Updated Jun 7, 2024 - Python
Fine-tuning the coding LLM OpenCodeInterpreter-DS-6.7B for text-to-SQL code generation on a single A100 GPU in PyTorch.
Updated Jun 6, 2024 - Jupyter Notebook
This repository is dedicated to small projects and some theoretical material that I used to get into NLP and LLMs in a practical and efficient way.
Updated Jun 6, 2024 - Jupyter Notebook
MindSpore online courses: Step into LLM
Updated Jun 6, 2024 - Python
The official repository for the ICML 2024 paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors".
Updated Jun 5, 2024 - Python
[ICML'24 Oral] APT: Adaptive Pruning and Tuning Pretrained Language Models for Efficient Training and Inference
Updated Jun 4, 2024 - Python
High-quality image generation model, available under NGC Models @prithivmlmods
Updated Jun 4, 2024 - Python
a bro who codes with you
Updated Jun 1, 2024 - TypeScript
Implementations of various ML tasks on the Kaggle platform with GPUs.
Updated Jun 1, 2024 - Jupyter Notebook
PEFT is a wonderful tool that enables training very large models in low-resource environments. Quantization and PEFT will enable widespread adoption of LLMs.
Updated May 31, 2024 - Jupyter Notebook
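To make the parameter savings behind that claim concrete, here is a minimal NumPy sketch of the LoRA idea underlying the PEFT library (this is an illustrative toy, not the `peft` package's actual API): the pretrained weight matrix stays frozen, and only two small low-rank factors are trained.

```python
import numpy as np

# Toy LoRA sketch: a frozen d_out x d_in weight matrix W plus a trainable
# low-rank update B @ A, with B initialized to zero so the model starts
# out identical to the pretrained one. Dimensions here are illustrative.
rng = np.random.default_rng(0)
d_in, d_out, rank = 1024, 1024, 8

W_frozen = rng.standard_normal((d_out, d_in))  # pretrained weights, never updated
A = rng.standard_normal((rank, d_in)) * 0.01   # LoRA down-projection (trainable)
B = np.zeros((d_out, rank))                    # LoRA up-projection, zero at init

def forward(x):
    # Base output plus the low-rank correction; only A and B would
    # receive gradients during fine-tuning.
    return W_frozen @ x + B @ (A @ x)

full_params = W_frozen.size        # params updated by full fine-tuning
lora_params = A.size + B.size      # params updated by LoRA
print(f"full fine-tuning params: {full_params}")   # 1,048,576
print(f"LoRA params (rank {rank}): {lora_params}") # 16,384 (~1.6%)
```

Training roughly 1.6% of the parameters is what makes single-GPU fine-tuning of large models feasible, and quantizing the frozen base weights (as in QLoRA-style setups) shrinks the memory footprint further.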
Fine-tuning the Pegasus and FLAN-T5 pre-trained language models on the DialogSum dataset for conversation summarization, to optimize the context window in RAG-LLMs
Updated May 29, 2024 - Jupyter Notebook
🚂 Fine-tuning large language models
Updated May 28, 2024 - Jupyter Notebook
[SIGIR'24] The official implementation code of MOELoRA.
Updated May 28, 2024 - Python
IISAN: Efficiently Adapting Multimodal Representation for Sequential Recommendation with Decoupled PEFT
Updated May 26, 2024 - Python