A versatile CLI and Python wrapper for Perplexity's suite of large language models, including its flagship Chat and Online 'Sonar Llama-3' models alongside 'Llama-3' and 'Mixtral'. Streamline the creation of chatbots and search the web with AI in real time, with ease.
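As an illustrative sketch only (not this project's actual API), a single-turn request to Perplexity's OpenAI-compatible chat endpoint could be assembled as below. The endpoint URL, model name, `PPLX_API_KEY` environment variable, and `build_request` helper are all assumptions for illustration:

```python
import json
import os

# Hypothetical sketch: build a chat-completions request for Perplexity's
# OpenAI-compatible HTTP API. The endpoint and model name are assumptions,
# not taken from this project's code.
API_URL = "https://api.perplexity.ai/chat/completions"

def build_request(prompt: str, model: str = "llama-3-sonar-large-32k-online") -> dict:
    """Assemble headers and JSON payload for a single-turn chat request."""
    api_key = os.environ.get("PPLX_API_KEY", "<your-key>")
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return {"url": API_URL, "headers": headers, "json": payload}

req = build_request("What happened in tech news today?")
print(json.dumps(req["json"], indent=2))
```

The payload could then be sent with any HTTP client (e.g. `requests.post(req["url"], headers=req["headers"], json=req["json"])`); the 'Online' Sonar models ground their answers in live web search results.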
🐳 Aurora is a Chinese-language MoE model. Aurora is further work built on Mixtral-8x7B that activates the model's chat capability in the open Chinese domain.
Welcome to the Mixtral 8x7B offloading demo repository! This project demonstrates running Mixtral-8x7B models on Colab or consumer desktops by offloading model weights.
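The core idea behind this kind of offloading is that a sparse MoE model like Mixtral-8x7B activates only a few experts per token, so most expert weights can live in CPU RAM and be paged onto the GPU on demand. A minimal sketch of such an LRU expert cache follows; the `ExpertCache` class and its names are assumptions for illustration, and real implementations add quantization and prefetching:

```python
from collections import OrderedDict

# Illustrative sketch of MoE expert offloading: keep at most `capacity`
# experts "on GPU" (here just an OrderedDict) and evict the least recently
# used expert back to "CPU" when a new one is needed. Names are assumptions.
class ExpertCache:
    def __init__(self, capacity: int, load_expert):
        self.capacity = capacity
        self.load_expert = load_expert   # callable: expert_id -> weights
        self.on_gpu = OrderedDict()      # expert_id -> weights, in LRU order

    def get(self, expert_id):
        if expert_id in self.on_gpu:
            self.on_gpu.move_to_end(expert_id)   # mark as recently used
            return self.on_gpu[expert_id]
        if len(self.on_gpu) >= self.capacity:
            self.on_gpu.popitem(last=False)      # evict the LRU expert
        weights = self.load_expert(expert_id)    # "transfer" from CPU RAM
        self.on_gpu[expert_id] = weights
        return weights

# Usage: Mixtral-8x7B routes each token to 2 of 8 experts per layer;
# here we cache only 2 experts at a time.
cache = ExpertCache(capacity=2, load_expert=lambda i: f"weights-{i}")
for expert_id in [0, 1, 0, 2, 3]:
    cache.get(expert_id)
print(list(cache.on_gpu))   # → [2, 3]
```

Because the router's top-k expert choices are often stable across consecutive tokens, a small cache like this avoids most CPU-to-GPU transfers, which is what makes consumer-hardware inference practical.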