llm-ops
Here are 14 public repositories matching this topic...
AIConfig is a config-based framework to build generative AI applications. (Python · Updated Jun 4, 2024)
An end-to-end LLM reference implementation providing a Q&A interface for Airflow and Astronomer. (Python · Updated Jul 16, 2024)
Friendli: the fastest serving engine for generative AI. (Python · Updated Aug 30, 2024)
The prompt engineering, prompt management, and prompt evaluation tool for TypeScript, JavaScript, and Node.js. (TypeScript · Updated Sep 14, 2024)
The collaborative spreadsheet for AI. Chain cells into powerful pipelines, experiment with prompts and models, and evaluate LLM responses in real time. Work together seamlessly to build and iterate on AI applications. (Python · Updated Sep 23, 2024)
Miscellaneous code and writings for MLOps. (Jupyter Notebook · Updated Sep 24, 2024)
RAG (Retrieval-Augmented Generation) framework for building modular, open-source applications for production, by TrueFoundry. (Python · Updated Sep 25, 2024)
Python SDK for running evaluations on LLM-generated responses. (Python · Updated Sep 26, 2024)
Run any open-source LLM, such as Llama 3.1 or Gemma, as an OpenAI-compatible API endpoint in the cloud. (Python · Updated Sep 27, 2024)
Cluster and scheduler health monitoring for GPU jobs on Kubernetes. (Python · Updated Sep 27, 2024)
Open-source alternative to the Assistants API, with a managed backend for memory, RAG, tools, and tasks; roughly a Supabase for building AI agents. (Python · Updated Sep 28, 2024)