中文Mixtral-8x7B(Chinese-Mixtral-8x7B)
Updated Apr 2, 2024 - Python
🤖️ An AI chat Telegram bot with web search, powered by GPT-3.5/4/4 Turbo/4o, DALL·E 3, Groq, Gemini 1.5 Pro/Flash and the official Claude 2.1/3/3.5 APIs, written in Python and deployable on Zeabur, fly.io and Replit.
🐳 Aurora is a Chinese-language MoE model. It is further work built on Mixtral-8x7B that activates the model's Chinese open-domain chat capability.
Fast Inference of MoE Models with CPU-GPU Orchestration
Build LLM-powered robots in your garage with MachinaScript For Robots!
Train LLMs (BLOOM, LLaMA, Baichuan2-7B, ChatGLM3-6B) with DeepSpeed pipeline parallelism; faster than ZeRO/ZeRO++/FSDP.
An innovative Python project that integrates AI-driven agents for Agile software development, leveraging advanced language models and collaborative task automation.
Reference implementation of Mistral AI 7B v0.1 model.
Tool for testing different large language models without writing code.
LLM prompt augmentation with RAG: integrates external custom data from a variety of sources, enabling chat with those documents.
Turn any YouTube video into a polished blog post, using Groq and Deepgram.
Chat with your PDF files for free, using Langchain, Groq, ChromaDB, and Jina AI embeddings.
A versatile CLI and Python wrapper for Groq AI's breakthrough LPU Inference Engine. Streamline the creation of chatbots and generate dynamic text with speeds of up to 800 tokens/sec.
XMPP Bot designed for E2EE AI language model interactions
Hello GitHub! I'm Surf, a friendly trained dolphin assistant here to help you with your coding needs. As an AI, I can understand and execute simple code snippets in multiple languages, including Python, JavaScript, and HTML. If you ever need assistance with coding tasks or debugging issues, just send me a message. Happy coding!
A Python module for running the Mixtral-8x7B language model with customisable precision and attention mechanisms.
Mistral and Mixtral (MoE) from scratch
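For readers exploring the "from scratch" repos above, the core mechanism Mixtral adds over Mistral is sparse expert routing: each token's hidden state is scored by a router, the top-2 of 8 experts are selected, and their outputs are combined with softmax weights computed over the selected logits only. Below is a minimal NumPy sketch of that routing step; the function name `moe_forward`, the toy experts, and the tensor shapes are illustrative assumptions, not code from any listed repository.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Sparse MoE forward pass: route each token to its top-k experts.

    Mixtral-style layers use k=2 out of 8 experts per token.
    x:       (n_tokens, d_model) input activations
    gate_w:  (d_model, n_experts) router weight matrix (assumed, no bias)
    experts: list of callables, each mapping (d_model,) -> (d_model,)
    """
    logits = x @ gate_w                      # router scores, (n_tokens, n_experts)
    out = np.zeros_like(x)
    for i, token in enumerate(x):
        top = np.argsort(logits[i])[-k:]     # indices of the k highest-scoring experts
        weights = np.exp(logits[i][top])
        weights /= weights.sum()             # softmax over the selected logits only
        for w, e in zip(weights, top):
            out[i] += w * experts[e](token)  # weighted sum of expert outputs
    return out

# Toy demo: 8 "experts" that each scale their input by a fixed factor
rng = np.random.default_rng(0)
d_model, n_experts = 4, 8
experts = [(lambda s: (lambda v: s * v))(s) for s in range(1, n_experts + 1)]
x = rng.normal(size=(3, d_model))
gate_w = rng.normal(size=(d_model, n_experts))
y = moe_forward(x, gate_w, experts)
print(y.shape)  # (3, 4)
```

Because only k of the n_experts feed-forward blocks run per token, a Mixtral-style layer holds ~47B parameters while spending roughly the compute of a 13B dense model per token.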
Python Scripts that prompt GPT-3 via API for free with GPT4Free
Open-source project for running LLM agents in a user-friendly desktop app
Train an LLM locally with your own data.