Commit

Update README
duzx16 committed Mar 14, 2023
1 parent 4b65bdb · commit 4f61ed7
Showing 1 changed file with 4 additions and 4 deletions.
README.md: 8 changes (4 additions & 4 deletions)
@@ -5,13 +5,13 @@ various natural language understanding and generation tasks.
 
 Please refer to our paper for a detailed description of GLM:
 
-[GLM: General Language Model Pretraining with Autoregressive Blank Infilling](https://arxiv.org/abs/2103.10360) (ACL
-2022)
+[GLM: General Language Model Pretraining with Autoregressive Blank Infilling](https://arxiv.org/abs/2103.10360) (ACL 2022)
 
 Zhengxiao Du*, Yujie Qian*, Xiao Liu, Ming Ding, Jiezhong Qiu, Zhilin Yang, Jie Tang (*: equal contribution)
 
-**We release [GLM-130B](https://github.com/THUDM/GLM-130B), an open bilingual (English & Chinese) pre-trained language
-model with 130 billion parameters based on the GLM framework.**
+**News: We release [ChatGLM-6B](https://github.com/THUDM/ChatGLM-6B), an open pre-trained language model with 6 billion parameters optimized for Chinese QA and dialogue based on the GLM framework.**
+
+[//]: # (**We release [GLM-130B](https://github.com/THUDM/GLM-130B), an open bilingual (English & Chinese) pre-trained language model with 130 billion parameters based on the GLM framework.**)
 
 ## Pretrained Models
 
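For context on the ChatGLM-6B release announced in this diff: the model is typically loaded through Hugging Face `transformers` with remote code enabled. The snippet below is a minimal sketch following the usage documented in the ChatGLM-6B repository; the checkpoint id `THUDM/chatglm-6b` and the custom `chat` method are provided by that repository's remote code and may change between revisions.

```python
# Minimal sketch of loading ChatGLM-6B, adapted from the usage shown in the
# ChatGLM-6B repository (https://github.com/THUDM/ChatGLM-6B); the checkpoint
# id and the `chat` method come from the model's trusted remote code.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
# Half precision on GPU; the repository also documents CPU and quantized variants.
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

# `chat` returns the reply plus the updated dialogue history for multi-turn use.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```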
