Add RWKV models
guangyusong committed Jun 5, 2023
1 parent c794017 commit 2a08824
Showing 2 changed files with 25 additions and 0 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -71,6 +71,7 @@ The following table shows the supported models with sizes and the tasks that the
| GPT-NeoX | 20B | Pretrained |
| GPT-Neo | 1.3B | Pretrained |
| GPT-J | 6B | Pretrained |
| RWKV | 169M, 430M, 1.5B, 3B, 7B, 14B | Pretrained |
| Incoder | 6B | Pretrained |
| CodeParrot | Small-python (110M), Small-multi (110M), 1.5B | Pretrained |
| CodeBERT | CodeBERT-base, UnixCoder-base, CodeBERTa-small | Pretrained |
24 changes: 24 additions & 0 deletions codetf/configs/inference/causal_lm.yaml
@@ -68,4 +68,28 @@ causallm-codegen2-7B-pretrained:
causallm-codegen2-16B-pretrained:
huggingface_url: "Salesforce/codegen2-16B"
tokenizer_url: "Salesforce/codegen2-16B"
max_prediction_length: 512
causallm-rwkv-169M-pretrained:
huggingface_url: "RWKV/rwkv-4-169m-pile"
tokenizer_url: "RWKV/rwkv-4-169m-pile"
max_prediction_length: 512
causallm-rwkv-430M-pretrained:
huggingface_url: "RWKV/rwkv-4-430m-pile"
tokenizer_url: "RWKV/rwkv-4-430m-pile"
max_prediction_length: 512
causallm-rwkv-1.5B-pretrained:
huggingface_url: "RWKV/rwkv-raven-1b5"
tokenizer_url: "RWKV/rwkv-raven-1b5"
max_prediction_length: 512
causallm-rwkv-3B-pretrained:
huggingface_url: "RWKV/rwkv-raven-3b"
tokenizer_url: "RWKV/rwkv-raven-3b"
max_prediction_length: 512
causallm-rwkv-7B-pretrained:
huggingface_url: "RWKV/rwkv-raven-7b"
tokenizer_url: "RWKV/rwkv-raven-7b"
max_prediction_length: 512
causallm-rwkv-14B-pretrained:
huggingface_url: "RWKV/rwkv-raven-14b"
tokenizer_url: "RWKV/rwkv-raven-14b"
max_prediction_length: 512
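Each new entry in `causal_lm.yaml` maps a CodeTF model key to a Hugging Face model id, a tokenizer id, and a generation length cap. The lookup this enables can be sketched as below; this is a minimal illustration using only the standard library, and the `parse_flat_config` / `resolve_model` helper names are hypothetical (only the YAML keys and model ids come from the diff):

```python
# Sketch: resolve a CodeTF-style model key from the flat two-level YAML
# config added in this commit. Helper names are hypothetical; a real
# implementation would use a YAML library such as PyYAML instead of
# this hand-rolled parser.

CONFIG_SNIPPET = """\
causallm-rwkv-169M-pretrained:
  huggingface_url: "RWKV/rwkv-4-169m-pile"
  tokenizer_url: "RWKV/rwkv-4-169m-pile"
  max_prediction_length: 512
causallm-rwkv-7B-pretrained:
  huggingface_url: "RWKV/rwkv-raven-7b"
  tokenizer_url: "RWKV/rwkv-raven-7b"
  max_prediction_length: 512
"""

def parse_flat_config(text: str) -> dict:
    """Parse a flat mapping of model keys to string/int fields."""
    config, current = {}, None
    for line in text.splitlines():
        if not line.strip():
            continue
        if not line.startswith(" "):        # top-level model key
            current = line.strip().rstrip(":")
            config[current] = {}
        else:                                # indented field under the key
            key, _, value = line.strip().partition(":")
            value = value.strip().strip('"')
            config[current][key] = int(value) if value.isdigit() else value
    return config

def resolve_model(text: str, key: str) -> dict:
    """Look up one model entry, failing loudly on unknown keys."""
    config = parse_flat_config(text)
    if key not in config:
        raise KeyError(f"unknown model config: {key}")
    return config[key]

entry = resolve_model(CONFIG_SNIPPET, "causallm-rwkv-7B-pretrained")
print(entry["huggingface_url"])        # RWKV/rwkv-raven-7b
print(entry["max_prediction_length"])  # 512
```

Note the split in the entries themselves: the 169M and 430M configs point at the Pile-pretrained RWKV-4 checkpoints, while the 1.5B through 14B configs point at the instruction-tuned Raven variants under the same `RWKV/` namespace.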
