mixtral-8x7b-instruct
Here are 10 public repositories matching this topic...
This project aims to build a RAG model to chat with your PDFs. (Updated May 23, 2024, Python)
Working on LLM research. (Updated Apr 28, 2024, Python)
A crew of AI agents that investigates a company to help you prepare for your next interview. (Updated Mar 29, 2024, Python)
A Python module for running the Mixtral-8x7B language model with customisable precision and attention mechanisms. (Updated Feb 6, 2024, Python)
AI voice-powered TODO app. (Updated Jan 3, 2024, Python)
Welcome to the Mixtral 8x7B offloading demo repository! This project demonstrates running Mixtral-8x7B models on Colab or consumer desktops. (Updated Dec 30, 2023, Python)
An XMPP bot designed for privacy-focused AI language model interactions. (Updated May 12, 2024, Python)
A versatile CLI and Python wrapper for Perplexity's suite of large language models, including their flagship Chat and Online 'Sonar Llama-3' models along with 'Llama-3' and 'Mixtral'. Streamline the creation of chatbots and search the web with AI in real time with ease. (Updated May 28, 2024, Python)
🐳 Aurora is a Chinese-language MoE model. Aurora is further work based on Mixtral-8x7B that activates the model's Chinese open-domain chat capability. (Updated May 9, 2024, Python)