# miniGPT

Language translation using an encoder-decoder transformer model, trained with distributed data parallel (DDP) on Compute Canada clusters. Sample training was done on 16 Tesla V100-SXM2-32GB GPUs spread across 4 nodes. To run, submit either of the `.sh` scripts in the `slurm` folder using the `sbatch` command.
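Each SLURM task becomes one DDP process. The sketch below shows one common way such a process can bootstrap `torch.distributed` from the environment variables SLURM sets; the variable handling and the NCCL backend choice are assumptions for illustration, not code taken from this repository.

```python
# Hypothetical bootstrap for one DDP process under SLURM (not this repo's code).
import os
import torch
import torch.distributed as dist

def init_distributed():
    # SLURM assigns every task a global rank, a per-node rank, and a task count.
    rank = int(os.environ["SLURM_PROCID"])
    local_rank = int(os.environ["SLURM_LOCALID"])
    world_size = int(os.environ["SLURM_NTASKS"])

    # One process per GPU: pin this process to its local device.
    torch.cuda.set_device(local_rank)

    # MASTER_ADDR / MASTER_PORT are assumed to be exported by the sbatch script.
    dist.init_process_group(
        backend="nccl",  # the usual backend for multi-GPU NVIDIA training
        rank=rank,
        world_size=world_size,
    )
    return rank, local_rank, world_size
```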
## About

PyTorch distributed data parallel (DDP) implementation of a text-based transformer model from scratch.
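For orientation, here is a minimal sketch of the training pattern DDP implies: wrap the model in `DistributedDataParallel` and shard batches across processes with `DistributedSampler`. It builds on the hypothetical `init_distributed` helper sketched above; the model's forward signature, the dataset, and the hyperparameters are placeholders, not this repository's actual code.

```python
# Minimal DDP training-loop sketch (placeholder model/dataset, not this repo's code).
import torch
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

def train(model, dataset, local_rank, epochs=10):
    model = model.to(local_rank)
    # DDP replicates the model in every process and all-reduces gradients.
    ddp_model = DDP(model, device_ids=[local_rank])

    # DistributedSampler gives each process a disjoint shard of the data.
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.Adam(ddp_model.parameters(), lr=3e-4)
    loss_fn = torch.nn.CrossEntropyLoss()

    for epoch in range(epochs):
        sampler.set_epoch(epoch)  # reshuffle shards differently each epoch
        for src, tgt in loader:
            src, tgt = src.to(local_rank), tgt.to(local_rank)
            logits = ddp_model(src, tgt)  # assumed encoder-decoder forward signature
            loss = loss_fn(logits.view(-1, logits.size(-1)), tgt.view(-1))
            optimizer.zero_grad()
            loss.backward()  # gradients are synchronized across all GPUs here
            optimizer.step()
```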