LlamaIndex is a data framework for your LLM applications
A WebUI for Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Low-code framework for building custom LLMs, neural networks, and other AI models
🔥🔥High-Performance Face Recognition Library on PaddlePaddle & PyTorch🔥🔥
Distributed ML Training and Fine-Tuning on Kubernetes
Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
Scalable and flexible workflow orchestration platform that seamlessly unifies data, ML and analytics stacks.
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
LLM fine-tuning with PEFT
Using Low-rank adaptation to quickly fine-tune diffusion models.
Fine-tuning ChatGLM-6B with PEFT | Efficient ChatGLM fine-tuning based on PEFT
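Several of the repos above (PEFT, LoRA for diffusion models, ChatGLM fine-tuning) are built around Low-Rank Adaptation. A minimal sketch of the core idea, using only NumPy and illustrative names (`lora_forward`, the shapes, and the scale factor are assumptions for this example, not any library's API):

```python
import numpy as np

# Minimal sketch of Low-Rank Adaptation (LoRA): instead of updating the full
# weight matrix W (d_out x d_in), freeze W and learn two small matrices
# A (r x d_in) and B (d_out x r) with rank r << min(d_in, d_out).
# The adapted forward pass computes y = (W + scale * B @ A) @ x.

rng = np.random.default_rng(0)
d_in, d_out, r = 64, 64, 4

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))                   # trainable up-projection, init to 0
scale = 1.0

def lora_forward(x):
    # Because B starts at zero, the adapted model initially matches the
    # base model exactly; training only updates A and B.
    return (W + scale * B @ A) @ x

x = rng.standard_normal(d_in)
base = W @ x
adapted = lora_forward(x)

# Parameter savings: a full update trains d_out * d_in values,
# LoRA trains only r * (d_in + d_out).
full_params = d_out * d_in          # 4096
lora_params = r * (d_in + d_out)    # 512
```

This is why LoRA-based tools can fine-tune large models cheaply: only the small `A`/`B` pair is trained and stored, and the update can be merged back into `W` for inference.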
🦖 Learn about LLMs, LLMOps, and vector DBs for free by designing, training, and deploying a real-time financial advisor LLM system ~ source code + video & reading materials
Your Automatic Prompt Engineering Assistant for GenAI Applications
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://h2oai.github.io/h2o-llmstudio/
RAG (Retrieval Augmented Generation) Framework for building modular, open source applications for production by TrueFoundry
A comprehensive guide to building RAG-based LLM applications for production.
This repo contains a PyTorch implementation of a pretrained BERT model for multi-label text classification.
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHXuSJEk6
Multilingual large voice generation model providing full-stack inference, training, and deployment capabilities.