Popular repositories
- lumi-llm-scaling (Public, forked from spyysalo/lumi-llm-scaling)
  Scripts and documentation on scaling large language model training on the LUMI supercomputer.
  Shell
- lm-evaluation-harness (Public, forked from EleutherAI/lm-evaluation-harness)
  A framework for few-shot evaluation of autoregressive language models (see the usage sketch after this list).
  Python
- Megatron-DeepSpeed (Public, forked from microsoft/Megatron-DeepSpeed)
  Ongoing research on training transformer language models at scale, including BERT and GPT-2.
  Python · 1 star
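
For orientation, here is a minimal sketch of how the lm-evaluation-harness fork is typically driven from Python. This is an assumption based on the upstream EleutherAI API, not on this fork: the `lm_eval.evaluator.simple_evaluate` entry point, the `"hf"` model type string, the example checkpoint, and the task name may differ between releases.

```python
# Minimal sketch: scoring a causal LM on one benchmark task with lm-evaluation-harness.
# Assumes the upstream EleutherAI API; entry point, model type string, and task name
# are assumptions and may differ between versions of this fork.
from lm_eval import evaluator

results = evaluator.simple_evaluate(
    model="hf",                    # Hugging Face transformers backend (assumed name)
    model_args="pretrained=gpt2",  # any causal LM checkpoint identifier or local path
    tasks=["hellaswag"],           # benchmark task(s) to evaluate
    num_fewshot=0,                 # number of in-context (few-shot) examples
    batch_size=8,
)
print(results["results"])          # per-task metrics, e.g. accuracy
```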