BigBird01/LASER
LASER (a toolkit for Large scAle diStributEd tRaining)

A toolkit for large scale distributed training

With LASER, we trained the DeBERTa 1.5B model without model parallelism. DeBERTa 1.5B is the state-of-the-art (SOTA) model on the GLUE and SuperGLUE leaderboards, and the first model to surpass both the T5 11B model and human performance on the SuperGLUE leaderboard.
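LASER's API is not yet documented (see TODOs below), but the core idea behind data-parallel training toolkits like it can be illustrated with a toy sketch: each worker computes the gradient of the loss on its own data shard, and averaging the shard gradients (an all-reduce) reproduces the full-batch gradient, so every worker can update an identical model replica in lockstep. The code below is a hypothetical NumPy illustration of that principle, not LASER's actual interface.

```python
import numpy as np

def grad(w, X, y):
    # Gradient of the mean squared error 0.5 * ||Xw - y||^2 / n with respect to w.
    return X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
y = rng.normal(size=8)
w = np.zeros(3)

# Split the batch across two equal-sized "workers" and average their gradients,
# mimicking the all-reduce step of data-parallel training.
shards = [(X[:4], y[:4]), (X[4:], y[4:])]
g_avg = sum(grad(w, Xs, ys) for Xs, ys in shards) / len(shards)

# The averaged shard gradients match the single-process full-batch gradient.
assert np.allclose(g_avg, grad(w, X, y))
```

Note that the equivalence holds exactly here because the shards are equal-sized; real toolkits handle uneven shards, communication overlap, and memory partitioning on top of this basic scheme.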

TODOs

  • Add documentation and usage examples
