mistral
Here are 434 public repositories matching this topic...
Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Updated Oct 30, 2024 - Python
🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: text, audio, video and image generation, voice cloning, and distributed inference.
Updated Oct 31, 2024 - C++
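Because LocalAI exposes an OpenAI-compatible HTTP API, existing OpenAI client code can usually be pointed at a local instance by overriding the base URL. Below is a minimal sketch with the official openai Python client, assuming the server is listening on port 8080 (its documented default) and that a model named "mistral" has already been set up; both names are assumptions for illustration, not part of the listing above.

```python
# Minimal sketch: reuse the OpenAI Python client against a local OpenAI-compatible server.
# Assumptions: the server is running on http://localhost:8080 and a model named
# "mistral" has already been installed/configured there.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # point the client at the local server
    api_key="not-needed-locally",         # a local server typically ignores the key
)

response = client.chat.completions.create(
    model="mistral",
    messages=[{"role": "user", "content": "Explain what a drop-in replacement means."}],
)
print(response.choices[0].message.content)
```

The same pattern applies to the other OpenAI-compatible servers in this list (e.g. OpenLLM, Xinference): typically only the base URL and model name change.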
Low-code framework for building custom LLMs, neural networks, and other AI models
Updated Oct 28, 2024 - Python
Run any open-source LLM, such as Llama 3.1 or Gemma, as an OpenAI-compatible API endpoint in the cloud.
Updated Oct 29, 2024 - Python
Firefly: a training toolkit for large language models, supporting training of Qwen2.5, Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large models.
Updated Oct 24, 2024 - Python
Generative AI suite powered by state-of-the-art models and providing advanced AI/AGI functions. It features AI personas, AGI functions, multi-model chats, text-to-image, voice, response streaming, code highlighting and execution, PDF import, presets for developers, and much more. Deploy on-prem or in the cloud.
Updated Oct 29, 2024 - TypeScript
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you can run inference with any open-source language, speech-recognition, or multimodal model, whether in the cloud, on-premises, or even on your laptop.
Updated Oct 31, 2024 - Python
Enchanted is an iOS and macOS app for chatting with private, self-hosted language models such as Llama2, Mistral or Vicuna using Ollama.
Updated Aug 16, 2024 - Swift
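Enchanted talks to models served by Ollama, and that same local endpoint can be exercised directly over Ollama's REST API. Below is a minimal sketch using requests, assuming Ollama is running on its default port 11434 and the "mistral" model has been pulled beforehand (e.g. with `ollama pull mistral`); the model choice is an assumption for illustration.

```python
# Minimal sketch: call the Ollama REST API that apps like Enchanted build on.
# Assumptions: Ollama is running locally on its default port 11434 and the
# "mistral" model has already been pulled (e.g. `ollama pull mistral`).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",
        "prompt": "In one sentence, what is a self-hosted language model?",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```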
Efficient Triton Kernels for LLM Training
Updated Oct 31, 2024 - Python
Build, customize and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
Updated Sep 23, 2024 - Python
LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
Updated Sep 25, 2024 - Rust
🧑🚀 A summary of the world's best LLM resources.
Updated Oct 31, 2024
Python SDK for AI agent monitoring, LLM cost tracking, benchmarking, and more. Integrates with most LLMs and agent frameworks such as CrewAI, Langchain, and Autogen.
Updated Oct 31, 2024 - Python
[ACL 2024] An Easy-to-use Knowledge Editing Framework for LLMs.
Updated Oct 31, 2024 - Jupyter Notebook
Lightweight inference library for ONNX files, written in C++. It can run Stable Diffusion XL 1.0 on an RPi Zero 2 (or in 298 MB of RAM), as well as Mistral 7B on desktops and servers. ARM, x86, WASM and RISC-V are supported. Accelerated by XNNPACK.
Updated Oct 14, 2024 - C++
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
Updated Oct 10, 2024 - Python
Create chatbots with ease
Updated Oct 15, 2024 - TypeScript