Fix `block_size` picking in megatron_lm_gpt_pretraining.py (#2342)
Only cap `block_size` to 1024 if `tokenizer.model_max_length` is actually greater than 1024.