Description
System Info
- transformers version: 4.28.1
- Platform: Linux-5.15.0-1023-azure-x86_64-with-glibc2.17
- Python version: 3.8.16
- Huggingface_hub version: 0.14.1
- Safetensors version: not installed
- PyTorch version (GPU?): 1.13.1 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?:
- Using distributed or parallel set-up in script?:
Who can help?
@ArthurZucker and @younesbelkada
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
from transformers import AutoTokenizer, AutoConfig

models = [
    "google/flan-t5-small",
    "google/flan-t5-base",
    "google/flan-t5-large",
    "google/flan-t5-xl",
    "google/flan-t5-xxl",
]
for model in models:
    config = AutoConfig.from_pretrained(model)
    tokenizer = AutoTokenizer.from_pretrained(model)
    print(f"{model}\n\tlen(tokenizer)={len(tokenizer)}, tokenizer.vocab_size={tokenizer.vocab_size}, config.vocab_size={config.vocab_size}")
Expected behavior
The tokenizer's vocabulary size (len(tokenizer) / tokenizer.vocab_size) and config.vocab_size should match for each model.
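For illustration, a minimal sketch of that expectation written as an explicit check (model_id is just one checkpoint from the reproduction list above; the assertion encodes the assumed invariant being reported, not confirmed library behaviour):

from transformers import AutoTokenizer, AutoConfig

# Sketch: assert the assumed invariant that the tokenizer's full vocabulary
# (including any added special tokens) lines up with config.vocab_size.
model_id = "google/flan-t5-small"  # any of the checkpoints listed above
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
assert len(tokenizer) == config.vocab_size, (
    f"mismatch: len(tokenizer)={len(tokenizer)} vs config.vocab_size={config.vocab_size}"
)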