Unify Efficient Fine-Tuning of 100+ LLMs
Mixture-of-Experts for Large Vision-Language Models
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
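For orientation, here is a minimal sketch of the core idea behind a sparsely-gated MoE layer with top-k routing, in the spirit of Shazeer et al. (2017). It is illustrative only and not taken from the repository above; class and argument names are assumptions, and the noisy gating and load-balancing loss of the original paper are omitted for brevity.

```python
# Sparsely-gated MoE sketch: route each token to its top-k experts and
# combine their outputs with renormalized gate weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, dim, hidden_dim, num_experts=8, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts)  # router producing per-expert logits
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x):                            # x: (tokens, dim)
        logits = self.gate(x)                        # (tokens, num_experts)
        weights, idx = logits.topk(self.k, dim=-1)   # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)         # renormalize over the selected experts
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):    # dense loop for clarity; real kernels dispatch/gather
            tok, slot = (idx == e).nonzero(as_tuple=True)
            if tok.numel() == 0:
                continue
            out[tok] += weights[tok, slot].unsqueeze(-1) * expert(x[tok])
        return out

# Usage: 16 token embeddings of width 64, 8 experts, 2 active per token.
y = SparseMoE(dim=64, hidden_dim=256)(torch.randn(16, 64))
```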
Tutel MoE: An Optimized Mixture-of-Experts Implementation
MindSpore online courses: Step into LLM
Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training
Large-scale 4D-parallelism pre-training for 🤗 transformers with Mixture of Experts *(still a work in progress)*
Batch download high quality videos from https://twist.moe
ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters.
Open-source PyTorch library for the paper "AdaTT: Adaptive Task-to-Task Fusion Network for Multitask Learning in Recommendations"
Implementation of Switch Transformers from the paper: "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity"
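For contrast with top-k gating, below is a rough sketch of Switch-style top-1 routing together with the load-balancing auxiliary loss described in the Switch Transformers paper. The function and argument names are illustrative assumptions, not the API of the repository above, and expert capacity handling is omitted.

```python
# Switch-style top-1 routing: each token goes to a single expert; the auxiliary
# loss encourages an even split of tokens across experts.
import torch
import torch.nn.functional as F

def switch_route(x, gate, num_experts):
    """x: (tokens, dim); gate: a Linear(dim, num_experts) router.
    Returns per-token expert ids, gate values, and the load-balancing loss."""
    probs = F.softmax(gate(x), dim=-1)            # (tokens, num_experts)
    gate_val, expert_id = probs.max(dim=-1)       # top-1: one expert per token
    # aux loss = N * sum_e (fraction of tokens routed to e) * (mean router prob of e)
    routed_frac = F.one_hot(expert_id, num_experts).float().mean(dim=0)
    mean_prob = probs.mean(dim=0)
    aux_loss = num_experts * torch.sum(routed_frac * mean_prob)
    return expert_id, gate_val, aux_loss

# Usage: route 16 tokens of width 64 across 8 experts.
gate = torch.nn.Linear(64, 8)
ids, vals, aux = switch_route(torch.randn(16, 64), gate, num_experts=8)
```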
[ICLR 2023] "Sparse MoE as the New Dropout: Scaling Dense and Self-Slimmable Transformers" by Tianlong Chen*, Zhenyu Zhang*, Ajay Jaiswal, Shiwei Liu, Zhangyang Wang
[Preprint] Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models
Implementation of "the first large-scale multimodal mixture of experts models" from the paper: "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts"
[arXiv'24] Multilinear Mixture of Experts: Scalable Expert Specialization through Factorization
Implementation of MoE-Mamba from the paper: "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts" in PyTorch and Zeta
PyTorch implementation of moe, which stands for mixture of experts