We currently support loading the following checkpoints via T5.from_pretrained(identifier):
- t5-small
- t5-base
- t5-large
- t5-3b
- t5-11b
model_center.model.T5Config
model_center.model.T5
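Using the classes above, loading a supported checkpoint might look like the sketch below. The load_t5 helper and its up-front identifier check are illustrative additions, not part of the library; only the identifiers and the T5.from_pretrained call come from the list above.

```python
# Checkpoint identifiers from the supported list above.
SUPPORTED = ("t5-small", "t5-base", "t5-large", "t5-3b", "t5-11b")

def load_t5(identifier: str):
    """Validate the identifier, then load the model (hypothetical helper)."""
    if identifier not in SUPPORTED:
        raise ValueError(f"unsupported checkpoint: {identifier}")
    # Assumes model_center is installed; the import is deferred so the
    # identifier check itself carries no heavy dependency.
    from model_center.model import T5
    return T5.from_pretrained(identifier)
```

For example, load_t5("t5-base") downloads and returns the base model, while an unknown identifier fails fast with a ValueError instead of a download error.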
The current tokenizer implementation is mainly an alias for the T5Tokenizer of Hugging Face Transformers. We will switch to our SAM implementation in the future, which will be a more efficient tokenizer.