Harness LLMs with Multi-Agent Programming
Updated Jun 12, 2024 · Python
No-code multi-agent framework to build LLM Agents, workflows and applications with your data
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling
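The core idea behind parallel function calling, executing independent tool calls concurrently instead of one at a time, can be sketched generically with a thread pool. This is a minimal illustration under invented tool names and a hand-written call batch; it is not LLMCompiler's actual API or planner.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical tools an LLM planner might schedule; names are illustrative.
def search(query: str) -> str:
    return f"results for: {query}"

def add(expr: str) -> str:
    return str(sum(int(x) for x in expr.split("+")))

# A planner-produced batch of independent calls. Because no call depends on
# another's output, they can all run concurrently.
calls = [(search, "weather in Paris"), (add, "2+3"), (search, "EUR to USD rate")]

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(fn, arg) for fn, arg in calls]
    results = [f.result() for f in futures]
```

A real compiler-style system additionally builds a dependency graph over the calls so only truly independent ones run in parallel.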
The all-in-one LLM developer platform: prompt management, evaluation, human feedback, and deployment all in one place.
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), allowing users to chat with models, execute structured function calls, and get structured output. It also works with models not fine-tuned for JSON output and function calling.
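One common way to recover structured output from a model that was not fine-tuned for JSON is to scan its free-form reply for the first balanced JSON object and parse that. The sketch below is a generic illustration of this technique, not llama-cpp-agent's actual implementation.

```python
import json

def extract_json(text: str) -> dict:
    """Pull the first balanced {...} block out of free-form model output."""
    start = text.find("{")
    if start == -1:
        raise ValueError("no JSON object found")
    depth = 0
    for i, ch in enumerate(text[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return json.loads(text[start : i + 1])
    raise ValueError("unbalanced braces in model output")

# A chatty reply from a model that was not trained to emit bare JSON.
reply = 'Sure! Here is the call: {"name": "get_weather", "args": {"city": "Paris"}} Hope that helps.'
call = extract_json(reply)
```

Note the brace counting handles nested objects, but braces inside JSON string values would still confuse it; production frameworks use a tolerant parser or constrained decoding instead.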
Official Repo for ICML 2024 paper "Executable Code Actions Elicit Better LLM Agents" by Xingyao Wang, Yangyi Chen, Lifan Yuan, Yizhe Zhang, Yunzhu Li, Hao Peng, Heng Ji.
Tune LLMs in a few lines of code
On-Call/DevOps Assistant - Get a head start on fixing alerts with AI investigation
AI-to-AI Testing | Simulation framework for LLM-based applications
Design, conduct and analyze results of AI-powered surveys and experiments. Simulate social science and market research with large numbers of AI agents and LLMs.
Geniusrise: Framework for building geniuses
GoalChain for goal-oriented LLM conversation flows
SecGPT: An execution isolation architecture for LLM-based systems
Experiments on question answering over tabular data using LLMs and SQL
Prompt augmentation for LLMs with RAG: integrate external custom data from a variety of sources and chat with those documents
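RAG-style prompt augmentation boils down to two steps: retrieve the documents most relevant to the question, then prepend them to the prompt as context. A minimal sketch, using keyword overlap as a stand-in for the embedding-based vector search a real RAG pipeline would use (the documents and question here are invented):

```python
import re

def tokens(s: str) -> set[str]:
    return set(re.findall(r"\w+", s.lower()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by keyword overlap with the query; keep the top k."""
    q = tokens(query)
    return sorted(docs, key=lambda d: len(q & tokens(d)), reverse=True)[:k]

docs = [
    "To reset your password, open Settings and choose Security.",
    "Our refund policy allows returns within 30 days.",
    "The password must contain at least twelve characters.",
]
question = "How do I reset my password?"
context = retrieve(question, docs)

# Augment the prompt with the retrieved snippets before calling the LLM.
prompt = (
    "Answer using only this context:\n"
    + "\n".join(context)
    + f"\n\nQuestion: {question}"
)
```

Swapping the overlap score for cosine similarity over embeddings gives the usual vector-database retrieval without changing the surrounding flow.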
Unstract's interface to LLMs, Embeddings and VectorDBs.
A framework for writing Unstract Tools/Apps
Bolts that read data and perform chains of actions with prompts
A RAG application for enterprise clients.
A collection of Spouts that query databases