Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme flexibility, compute efficiency and portability as its primary goals.
Updated Jul 11, 2024 - Rust
The GPU-powered AI application database. Get your app to market faster using the simplicity of SQL and the latest NLP, ML + LLM models.
A blazing fast inference solution for text embeddings models
AICI: Prompts as (Wasm) Programs
LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
Scalable, Low-latency and Hybrid-enabled Vector Search in Postgres. Revolutionize Vector Search, not Database.
*Building LLM Applications: Application Development and Architecture Design* — an open-source e-book on real-world LLM applications. It covers the fundamentals of large language models and their uses, and how to build your own applications, including writing, developing, and managing prompts; exploring what the best LLMs can offer; and patterns and architecture design for LLM application development.
All-in-one infrastructure for building search, recommendations, and RAG. Trieve combines search language models with tools for tuning ranking and relevance.
A CLI tool to convert your codebase into a single LLM prompt with source tree, prompt templating, and token counting.
Believe in AI democratization. llama for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; works locally on your laptop CPU. Supports llama/alpaca/gpt4all/vicuna/rwkv models.
Pure Rust implementation of a minimal Generative Pretrained Transformer
A real-time indexing and structured extraction engine for unstructured data, for building generative AI applications
LSP server leveraging LLMs for code completion (and more?)
Chat with an AI that knows everything about you. Record your screens & mics 24/7. You own your data. Written in Rust. A library for devs to build AI apps on top of all your life data.
Generate music based on natural language prompts using LLMs running locally