llm-inference
Here are 7 public repositories matching this topic...
A terminal-style user interface for chatting with AI characters using llama LLMs, with all AI processing done locally.
Updated Mar 21, 2024 (Rust)
ChatFlameBackend is a backend for chat applications built on the Candle AI framework, with a focus on the Mistral model.
Updated Jan 21, 2024 (Rust)
Lightweight and extensible LLM inference serving benchmark tool written in Rust.
Updated Apr 4, 2024 (Rust)
A minimalistic LLM-powered Telegram assistant written in Rust that uses a self-contained SQLite database and is easy to install.
Updated May 5, 2024 (Rust)
AICI: Prompts as (Wasm) Programs
Updated Jun 8, 2024 (Rust)