(Work in Progress) A cross-platform desktop client for offline LlaMA-CPU
Updated Nov 1, 2023 - C#
Llama.cpp in Unity, straightforward and clean
Simpler than simple: run LLMs on your computer via Ollama, no GPU required.
OpenAI ChatGPT or a local LLM (llama.cpp GGUF format) with TTS, STT, Word, and Excel integration
.NET wrapper for LLaMA.cpp for LLaMA language model inference on CPU. 🦙
ASP.NET Core Web, WebApi & WPF implementations for LLama.cpp & LLamaSharp