DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
A library for easily merging multiple LLM experts and efficiently training the merged LLM.
Repository for our paper "See More Details: Efficient Image Super-Resolution by Experts Mining"
[SIGIR'24] The official implementation of MOELoRA.
[Paper][Preprint 2024] Mixture of Modality Knowledge Experts for Robust Multi-modal Knowledge Graph Completion
An LLM toolkit
Mistral and Mixtral (MoE) from scratch
MoE Decoder Transformer implementation with MLX
Simplified Implementation of SOTA Deep Learning Papers in PyTorch
RealCompo: Balancing Realism and Compositionality Improves Text-to-Image Diffusion Models
Early release of the official implementation for "GraphMETRO: Mitigating Complex Graph Distribution Shifts via Mixture of Aligned Experts"
Fast Inference of MoE Models with CPU-GPU Orchestration
Implementation of "the first large-scale multimodal mixture of experts models" from the paper "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts"
Implementation of Switch Transformers from the paper: "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity"
Mixture-of-Experts for Large Vision-Language Models
[ICML 2024] "MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts"
Tutel MoE: An Optimized Mixture-of-Experts Implementation
This is the repo for the MixKABRN Neural Network (Mixture of Kolmogorov-Arnold Bit Retentive Networks): an attempt at first adapting it for training on text, and later adjusting it for other modalities.
Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world.
The implementation of "Leeroo Orchestrator: Elevating LLMs Performance Through Model Integration"
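Several of the repositories above (the Switch Transformers and Mixtral implementations in particular) center on sparse expert routing: a router scores every token against a pool of expert feed-forward networks, dispatches each token to its top-k experts, and combines the expert outputs weighted by the renormalized router probabilities. The sketch below is a minimal, self-contained illustration of that pattern in PyTorch; the class and parameter names (`SparseMoE`, `d_model`, `d_ff`, `num_experts`, `top_k`) are illustrative assumptions, not APIs from any listed repository.

```python
# Minimal sketch of a sparse mixture-of-experts (MoE) feed-forward layer,
# in the spirit of Switch Transformers (top-1 routing) and Mixtral (top-2
# routing). Illustrative only; names and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    def __init__(self, d_model: int, d_ff: int, num_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)  # token -> expert logits
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
                )
                for _ in range(num_experts)
            ]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten to one token per row
        tokens = x.reshape(-1, x.size(-1))
        probs = F.softmax(self.router(tokens), dim=-1)         # (tokens, experts)
        weights, experts = probs.topk(self.top_k, dim=-1)      # (tokens, top_k)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize gates
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            # rows (tokens) and slots (which of the k choices) that picked expert e
            rows, slots = (experts == e).nonzero(as_tuple=True)
            if rows.numel() == 0:
                continue  # expert received no tokens this step
            out[rows] += weights[rows, slots].unsqueeze(-1) * expert(tokens[rows])
        return out.reshape_as(x)


# Usage: route a batch of token embeddings through 8 experts, 2 per token.
moe = SparseMoE(d_model=64, d_ff=256, num_experts=8, top_k=2)
y = moe(torch.randn(4, 16, 64))
print(y.shape)  # torch.Size([4, 16, 64])
```

Setting `top_k=1` gives Switch-style routing, while `top_k=2` matches Mixtral. Optimized libraries such as DeepSpeed-MoE and Tutel replace the per-expert Python loop with batched dispatch/combine kernels and typically add a load-balancing auxiliary loss, both of which this sketch omits for clarity.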