Fine-tuned version of Facebook's BART model (406M parameters).
To access the model files, check out the Hugging Face repository: https://huggingface.co/karthiksagarn/bart-samsum-finetuned
This model is a fine-tuned version of facebook/bart-large-cnn on the samsum dataset. It achieves the following result on the evaluation set:
Loss: 0.1326
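A minimal usage sketch, assuming the `transformers` library is installed and the checkpoint loads from the repository above; the dialogue text is illustrative, not taken from the SAMSum evaluation set:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub
summarizer = pipeline("summarization", model="karthiksagarn/bart-samsum-finetuned")

# Illustrative SAMSum-style dialogue (not from the dataset)
dialogue = (
    "Amanda: I baked cookies. Do you want some?\n"
    "Jerry: Sure!\n"
    "Amanda: I'll bring you some tomorrow :-)"
)

summary = summarizer(dialogue, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

The pipeline returns a list with one dict per input; the generated summary is under the `summary_text` key.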

TRAINING HYPERPARAMETERS:
The following hyperparameters were used during training:
ROUGE SCORES:

BERT SCORE:

FRAMEWORK VERSIONS:
Transformers: 4.38.2
PyTorch: 2.2.1+cu121
Datasets: 2.18.0
Tokenizers: 0.15.2