- 💻 Computational scientist @ Argonne National Laboratory (ALCF)
- 🧪 Interested in:
  - {AI, HPC} for science
  - 🚀 scaling large models across thousands of GPUs
Pinned
- deepspeedai/DeepSpeed — DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- huggingface/nanotron — Minimalistic large language model 3D-parallelism training
- argonne-lcf/Megatron-DeepSpeed (forked from deepspeedai/Megatron-DeepSpeed) — Ongoing research training transformer language models at scale, including BERT & GPT-2