Connect home devices into a powerful cluster to accelerate LLM inference. More devices means faster inference.
Updated Mar 4, 2025 · C++
✨ Kubectl plugin to create manifests with LLMs
🏗️ Fine-tune, build, and deploy open-source LLMs easily!
Code for HerO, a fact-checking pipeline based on open LLMs (runner-up in the AVeriTeC challenge)
A collection of tools designed exclusively for use with Open WebUI.