
Kotoba Recipes

Mixture of Experts library forked from kotoba-recipes.

Table of Contents

  1. Installation
  2. Instruction Tuning
  3. LLM Continual Pre-Training
  4. Supported Models

Installation

To install the package, run the following command:

pip install -r requirements.txt
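If you are starting from scratch, a typical setup looks like the following; the clone URL assumes the kotoba-tech/moe-recipes repository, and a fresh virtual environment is a reasonable precaution:

# Clone the repository and install the Python dependencies.
git clone https://github.com/kotoba-tech/moe-recipes.git
cd moe-recipes
pip install -r requirements.txt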

If you want to run training across multiple nodes, you also need to install mpi4py (an MPI module must be loaded first):

module load openmpi/4.x.x

pip install mpi4py
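As a quick sanity check that MPI and mpi4py are wired up correctly, you can have each rank print its id; this is only a sketch and assumes OpenMPI is on your PATH after the module load:

# Each of the two ranks should print a different rank id (0 and 1).
mpirun -np 2 python -c "from mpi4py import MPI; print(MPI.COMM_WORLD.Get_rank())"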

FlashAttention

To install FlashAttention, run the following commands (a GPU is required):

pip install ninja packaging wheel
pip install flash-attn --no-build-isolation
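To confirm the extension built correctly, a quick import check is usually enough (this assumes a CUDA-capable GPU and a matching PyTorch build):

# Print the installed FlashAttention version; an ImportError means the build failed.
python -c "import flash_attn; print(flash_attn.__version__)"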

ABCI

If you run the experiments on ABCI, an install script is available at kotoba-recipes/install.sh.
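One way to run it, sketched below, is from an interactive GPU node; the resource type and group name are placeholders, so adjust them to your ABCI allocation:

# Request an interactive GPU node (replace <group> with your ABCI group),
# then run the install script from the repository root.
qrsh -g <group> -l rt_G.small=1 -l h_rt=1:00:00
bash install.sh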

Instruction Tuning

scripts/abci/instruction contains the scripts to run instruction tuning on ABCI.

If you want to use custom instructions, you need to modify src/llama_recipes/datasets/alpaca_dataset.py.
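For reference, alpaca_dataset.py is conventionally written against the Alpaca JSON format sketched below; check the dataset class for the exact keys and file path it expects, since this example and the file name are only assumptions:

# Hypothetical custom instruction file in the Alpaca instruction/input/output format.
cat > my_custom_instructions.json << 'EOF'
[
  {
    "instruction": "Summarize the following text.",
    "input": "Mixture of Experts models route each token to a small set of expert networks.",
    "output": "MoE models process each token with only a few specialized experts."
  }
]
EOF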

LLM Continual Pre-Training

scripts/abci/ contains the scripts to run LLM continual pre-training on ABCI. The 7B, 13B, and 70B directories contain the scripts for the corresponding Llama 2 model sizes.
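A job is typically submitted to ABCI's scheduler with qsub; the script name below is a placeholder, so substitute an actual launcher from scripts/abci/7B/ (or 13B/, 70B/):

# Hypothetical submission of a 7B continual pre-training job (replace <group>
# and the script path with your ABCI group and an existing script).
qsub -g <group> scripts/abci/7B/llama-2-7b.sh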

Supported Models
