Unify Efficient Fine-Tuning of 100+ LLMs
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
The official GitHub page for the survey paper "A Survey of Large Language Models".
Aligning pretrained language models with instruction data generated by themselves.
🦦 Otter, a multi-modal model based on OpenFlamingo (an open-source version of DeepMind's Flamingo), trained on MIMIC-IT and showcasing improved instruction-following and in-context learning abilities.
Code and models for NExT-GPT: Any-to-Any Multimodal Large Language Model
Video-LLaVA: Learning United Visual Representation by Alignment Before Projection
mPLUG-Owl & mPLUG-Owl2: Modularized Multimodal Large Language Model
InternLM-XComposer2 is a groundbreaking vision-language large model (VLLM) excelling in free-form text-image composition and comprehension.
A one-stop data processing system to make data higher-quality, juicier, and more digestible for LLMs! 🍎 🍋 🌽 ➡️ ➡️ 🍸 🍹 🍷
An Open-source, Knowledgeable Large Language Model Framework.
Video Foundation Models & Data for Multimodal Understanding
DataDreamer: Prompt. Generate Synthetic Data. Train & Align Models. 🤖💤
DISC-FinLLM, a Chinese financial large language model (LLM) designed to provide users with professional, intelligent, and comprehensive financial consulting services in financial scenarios.
DialogStudio: Towards Richest and Most Diverse Unified Dataset Collection and Instruction-Aware Models for Conversational AI
[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
Papers and Datasets on Instruction Tuning and Following. ✨✨✨
Deita: Data-Efficient Instruction Tuning for Alignment [ICLR2024]
CIKM 2023 Best Demo Paper Award. HugNLP is a unified and comprehensive NLP library based on HuggingFace Transformers. Start hugging for NLP now! 😊
MindSpore online courses: Step into LLM
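The repositories above approach instruction tuning in different ways. For orientation only, here is a minimal, hedged sketch of what supervised instruction tuning with a LoRA adapter can look like using Hugging Face transformers, peft, and datasets. The model name, dataset, prompt format, and hyperparameters are illustrative assumptions and do not reproduce the recipe of any particular project listed here.

```python
# Minimal illustrative sketch of LoRA-based instruction tuning.
# Model, dataset, prompt template, and hyperparameters are assumptions
# chosen for brevity, not the method of any repository on this page.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "facebook/opt-350m"  # assumption: any small causal LM works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Wrap the base model with a small LoRA adapter so only adapter weights train.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# Assume an instruction dataset with "instruction" and "output" text fields.
dataset = load_dataset("tatsu-lab/alpaca", split="train[:1000]")

def to_features(example):
    # Concatenate the prompt and response into one training sequence.
    text = (f"### Instruction:\n{example['instruction']}\n\n"
            f"### Response:\n{example['output']}")
    return tokenizer(text, truncation=True, max_length=512)

tokenized = dataset.map(to_features, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out",
                           per_device_train_batch_size=4,
                           num_train_epochs=1,
                           logging_steps=50),
    train_dataset=tokenized,
    # mlm=False gives standard causal-LM labels (inputs shifted by one).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The LoRA wrapper keeps the base weights frozen and trains only the low-rank adapter, which is one reason parameter-efficient fine-tuning is the common denominator across many of the frameworks listed above.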