Welcome to the Mixtral 8x7B offloading demo repository! This project aims to demonstrate the seamless execution of Mixtral-8x7B models on Colab or consumer desktops.
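The core offloading idea — keep as many model layers on the GPU as its memory allows and spill the rest to CPU RAM — can be sketched as a simple placement plan. This is an illustrative sketch only: the function name, layer count, and per-layer sizes are assumptions, not taken from the demo's actual code.

```python
def plan_offload(num_layers: int, layer_size_gb: float, gpu_budget_gb: float) -> dict:
    """Hypothetical placement plan: map each layer index to "cuda" or "cpu".

    Layers are assigned to the GPU greedily until the memory budget is
    exhausted; everything after that is offloaded to CPU.
    """
    device_map = {}
    used = 0.0
    for i in range(num_layers):
        if used + layer_size_gb <= gpu_budget_gb:
            device_map[i] = "cuda"
            used += layer_size_gb
        else:
            device_map[i] = "cpu"
    return device_map

# Mixtral-8x7B has 32 decoder layers; with an assumed 12 GB budget and
# ~1.4 GB per quantized layer, only the first few layers stay on the GPU.
plan = plan_offload(num_layers=32, layer_size_gb=1.4, gpu_budget_gb=12.0)
```

In practice the demo operates at a finer granularity (individual experts rather than whole layers), but the budget-then-spill logic is the same shape.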
Other repositories tagged with the `mixtral-8x7b-instruct` topic:

- AI voice-powered TODO app.
- A Python module for running the Mixtral-8x7B language model with customisable precision and attention mechanisms.
- A crew of AI agents that investigates a company to help you prepare for your next interview.
- Working on LLM research.
- 🐳 Aurora, a Chinese-language MoE model: a further work based on Mixtral-8x7B that activates the model's chat capability in the Chinese open domain.
- A RAG model for chatting with your PDFs.
- An XMPP bot designed for end-to-end-encrypted (E2EE) AI language model interactions.
- Official codebase of the ACL 2024 (SRW) paper "MoExtend: Tuning New Experts for Modality and Task Extension".