Pretrained Language Model

This repository provides the latest pretrained language models and their related optimization techniques developed by Huawei Noah's Ark Lab.

Directory structure

  • NEZHA-TensorFlow is a pretrained Chinese language model, implemented in TensorFlow, that achieves state-of-the-art performance on several Chinese NLP tasks.
  • NEZHA-PyTorch is the PyTorch version of NEZHA.
  • TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster at inference; see the distillation sketch after this list.
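
TinyBERT is trained with transformer distillation, i.e. the small student model is taught to match the teacher BERT's predictions and intermediate representations; the authoritative training scripts live in the TinyBERT directory. The snippet below is only a minimal PyTorch sketch of two of those terms (prediction-layer and hidden-state distillation) to illustrate the idea — the function name, the projection layer, and the toy dimensions are assumptions for the example, not the repo's API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def tinybert_style_loss(student_logits, teacher_logits,
                        student_hidden, teacher_hidden,
                        hidden_proj, temperature=1.0):
    """Illustrative combination of prediction- and hidden-state distillation."""
    # Prediction-layer distillation: match the teacher's softened output
    # distribution with a temperature-scaled KL divergence.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    pred_loss = F.kl_div(log_probs, soft_targets,
                         reduction="batchmean") * temperature ** 2

    # Hidden-state distillation: project the narrower student hidden states
    # up to the teacher's width, then penalize the mean-squared error.
    hidden_loss = F.mse_loss(hidden_proj(student_hidden), teacher_hidden)

    return pred_loss + hidden_loss


if __name__ == "__main__":
    # Toy shapes only: batch=2, seq_len=4, vocab=100; student width 312 and
    # teacher width 768 are typical TinyBERT / BERT-base sizes.
    proj = nn.Linear(312, 768)
    loss = tinybert_style_loss(
        student_logits=torch.randn(2, 4, 100),
        teacher_logits=torch.randn(2, 4, 100),
        student_hidden=torch.randn(2, 4, 312),
        teacher_hidden=torch.randn(2, 4, 768),
        hidden_proj=proj,
    )
    loss.backward()
```

In the actual TinyBERT setup, attention maps are distilled as well and the loss is summed over matched layer pairs; this sketch omits those pieces for brevity.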
