LiBai

LiBai(李白): A Toolbox for Large-Scale Distributed Parallel Training

Introduction

English | 简体中文

LiBai is an open-source toolbox for large-scale model training, based on OneFlow. The main branch works with OneFlow 0.7.0.

Highlights
  • Support for a collection of parallel training components

    LiBai provides multiple parallelism strategies, such as Data Parallelism, Tensor Parallelism, and Pipeline Parallelism, and is extensible to new ones; see the configuration sketch after this list.

  • Varied training techniques

    LiBai provides many out-of-the-box training techniques such as Distributed Training, Mixed Precision Training, Activation Checkpointing, Recomputation, Gradient Accumulation, and the Zero Redundancy Optimizer (ZeRO).

  • Support for both CV and NLP tasks

    LiBai provides predefined data pipelines for both CV and NLP datasets, such as CIFAR, ImageNet, and the BERT dataset.

  • Easy to use

    LiBai's components are designed to be modular for ease of use:

    • LazyConfig system for more flexible syntax and no predefined structures
    • Friendly trainer and engine
    • Usable as a library for building research projects on top of it; see projects/ for examples built on LiBai
  • High Efficiency
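
As an illustration of how these pieces combine, the sketch below shows a LazyConfig-style training configuration. The key names (train.dist.*, train.amp.enabled, and so on) follow LiBai's distributed-configuration tutorial, but they are assumptions here and may vary across versions.

# configs/my_pretrain.py - illustrative sketch, not a verbatim LiBai config;
# key names assumed from LiBai's distributed-configuration docs.
from .common.train import train  # LiBai configs are plain Python modules

# Parallelism: world size = data_parallel x tensor_parallel x pipeline_parallel
train.dist.data_parallel_size = 2
train.dist.tensor_parallel_size = 2
train.dist.pipeline_parallel_size = 2
train.dist.pipeline_num_layers = 24    # model-dependent; needed for pipeline splits

# Training techniques listed in the highlights above
train.amp.enabled = True                    # mixed precision training
train.activation_checkpoint.enabled = True  # activation checkpointing / recomputation
train.num_accumulation_steps = 4            # gradient accumulation
train.zero_optimization.enabled = True      # Zero Redundancy Optimizer (ZeRO)
train.zero_optimization.stage = 1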

Installation

See Installation instructions.

Getting Started

See Quick Run for the basic usage of LiBai.
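
To give a flavor of LiBai used as a library, here is a minimal, hypothetical training script. It assumes the detectron2-style LazyConfig/DefaultTrainer API that LiBai's tools are built around; the Quick Run tutorial is the authoritative entry point, and distributed runs are normally launched through the scripts under tools/.

# toy_train.py - minimal sketch assuming LiBai's LazyConfig/DefaultTrainer API.
from libai.config import LazyConfig
from libai.engine import DefaultTrainer

# Load a Python-based LazyConfig file (path is illustrative)
cfg = LazyConfig.load("configs/bert_large_pretrain.py")

trainer = DefaultTrainer(cfg)
trainer.train()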

Documentation

See LiBai's documentation for full API documentation and tutorials.

ChangeLog

Beta 0.2.0 was released on 07/07/2022. The main changes in version 0.2.0 are as follows:

Features:

  • Support toggling evaluation and setting eval_iter (see the config sketch after this list)
  • Support customized samplers in config.py
  • Support RDMA for pipeline model parallelism
  • Support multiple fused kernels:
    • fused_scale_mask_softmax_dropout
    • fused_scale_tril_softmax_mask_scale
    • fused_self_attention (in the libai_bench branch)
  • User experience optimizations
  • Optimized training throughput; see the benchmark for more details
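
As an illustration of the evaluation-related settings above, a config could toggle evaluation and bound its length roughly as follows. The exact key names are assumed from LiBai's common training config and may differ in your version:

# Illustrative snippet; evaluation key names are assumptions.
from .common.train import train

train.evaluation.enabled = True       # turn evaluation on or off
train.evaluation.eval_period = 5000   # evaluate every N training iterations
train.evaluation.eval_iter = 250      # cap the number of evaluation iterations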

Supported Models:

  • Support for the 3D-parallel RoBERTa model
  • Support for the 2D-parallel (data parallel + tensor model parallel) SimCSE model
  • Support for the data-parallel MAE model
  • Support for the data-parallel MoCo v3 model

See changelog for details and release history.

Contributing

We appreciate all contributions to improve LiBai. See CONTRIBUTING for the contributing guidelines.

License

This project is released under the Apache 2.0 license.

Citation

If you find this project useful for your research, please consider citing:

@misc{of2021libai,
  author =       {Xingyu Liao and Peng Cheng and Tianhe Ren and Depeng Liang and
                  Kai Dang and Yi Wang and Xiaoyu Xu},
  title =        {LiBai},
  howpublished = {\url{https://github.com/Oneflow-Inc/libai}},
  year =         {2021}
}

Join the WeChat group

[LiBai WeChat group QR code]
