---
tags:
- summarization
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# mT5_multilingual_XLSum-sinhala-abstaractive-summarization_CNN-dailymail-V2

This model is a fine-tuned version of [csebuetnlp/mT5_multilingual_XLSum](https://huggingface.co/csebuetnlp/mT5_multilingual_XLSum) on a Sinhala version of the CNN/DailyMail dataset. Per-epoch results on the evaluation set (validation loss and ROUGE scores) are reported in the Training results section below.
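This card does not yet include a usage example, so the following is a minimal inference sketch with the Transformers library. The repo id is a placeholder taken from the title above (adjust it to the actual Hub path), and the generation settings are illustrative rather than the values used to produce the scores reported below.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repo id taken from the card title; replace with the actual Hub path.
model_id = "mT5_multilingual_XLSum-sinhala-abstaractive-summarization_CNN-dailymail-V2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "..."  # Sinhala news article text to summarize

inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(
    **inputs,
    max_length=84,            # illustrative summary length; tune as needed
    num_beams=4,
    no_repeat_ngram_size=2,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```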

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The exact hyperparameter values used during training were not recorded in this card; see the illustrative sketch below.
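As a purely illustrative sketch, a `Seq2SeqTrainer` run of this kind is usually configured along the following lines. Every value shown is a placeholder and not the setting used for this checkpoint; only the epoch count matches the results table below.

```python
from transformers import Seq2SeqTrainingArguments

# All values are placeholders for illustration; the real hyperparameters
# used for this checkpoint are not documented in this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-xlsum-sinhala-summarization",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=10,            # the results table reports 10 epochs
    evaluation_strategy="epoch",
    save_strategy="epoch",
    predict_with_generate=True,     # generate summaries during evaluation so ROUGE can be computed
)
```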

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|
| 1.8746        | 1.0   | 750  | 1.8262          | 18.9753 | 7.9271 | 18.1349 | 18.7152   |
| 1.4727        | 2.0   | 1500 | 1.8094          | 19.2219 | 7.9749 | 18.4314 | 18.9405   |
| 1.2331        | 3.0   | 2250 | 1.8432          | 20.436  | 7.8378 | 19.584  | 20.1613   |
| 1.0381        | 4.0   | 3000 | 1.8987          | 20.2251 | 7.9593 | 19.1556 | 19.9829   |
| 0.8737        | 5.0   | 3750 | 1.9471          | 20.3262 | 7.8935 | 19.407  | 20.0628   |
| 0.7363        | 6.0   | 4500 | 2.0611          | 20.1551 | 7.5046 | 19.2213 | 19.963    |
| 0.6214        | 7.0   | 5250 | 2.1838          | 19.9045 | 7.6232 | 18.743  | 19.5983   |
| 0.5277        | 8.0   | 6000 | 2.3190          | 20.8581 | 8.1054 | 19.8079 | 20.5414   |
| 0.4576        | 9.0   | 6750 | 2.4091          | 20.028  | 7.7635 | 19.0721 | 19.7053   |
| 0.4099        | 10.0  | 7500 | 2.4863          | 19.9769 | 8.04   | 19.0307 | 19.7651   |
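The ROUGE columns above are F-measures reported as percentages. Scores like these can be computed in a generic way with the `evaluate` library; the sketch below is not the exact evaluation script used for this card, and Sinhala text may need a language-appropriate tokenizer since the default ROUGE tokenization targets English.

```python
import evaluate

rouge = evaluate.load("rouge")

predictions = ["generated summary ..."]   # model outputs for the validation set
references = ["reference summary ..."]    # gold summaries from the dataset

# Returns rouge1, rouge2, rougeL and rougeLsum as F-measures in [0, 1];
# multiply by 100 to compare with the table above.
scores = rouge.compute(predictions=predictions, references=references)
print(scores)
```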

### Framework versions