This repository contains examples for running Deep Learning training jobs using AWS Trainium instances and Amazon SageMaker.
- Text classification using Transformers - This example shows how to train and deploy a BERT-based classification model on the Amazon Polarity dataset.
- Pretrain BERT using Wiki Data - This example shows how to pretrain a BERT model on AWS Trainium using wiki data.
- Pretrain Llama 2 70b using Wikicorpus - This example shows how to pretrain Llama 2 models using the NeuronX Distributed library.
- Continual Pretraining of Llama 2 70b using Wikicorpus - This example shows how to continually pretrain Llama 2 models using the NeuronX Distributed library.
- Pretrain/Fine-tune Llama using Wiki Data - This example shows how to pretrain or fine-tune the Llama 2 7b model using the NeuronX Nemo Megatron library.
See CONTRIBUTING for more information.
This library is licensed under the MIT-0 License. See the LICENSE file.