
# 📋 BUOD: distilBART Transformer Model

**Model:** distilBART
**Authors:** James Esguerra, Julia Avila, Hazielle Bugayong

This model is a fine-tuned version of sshleifer/distilbart-cnn-12-6 on the KAMI-3000 dataset for the task of Filipino text summarization.

It achieves the following results on the evaluation set (final epoch of the training results table below):

- Validation Loss: 1.8049
- Rouge1: 50.5143
- Rouge2: 23.2481
- RougeL: 34.135
- RougeLsum: 46.4261
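For quick experimentation, the fine-tuned checkpoint can be loaded with the `transformers` summarization pipeline. The sketch below is illustrative only: the model id is a placeholder (not the actual repository id), and the generation lengths shown are the defaults inherited from distilbart-cnn-12-6.

```python
# Minimal inference sketch; "path/to/buod-distilbart" is a placeholder id.
from transformers import pipeline

summarizer = pipeline("summarization", model="path/to/buod-distilbart")

article = "..."  # a Filipino news article to summarize
summary = summarizer(article, max_length=142, min_length=56, truncation=True)
print(summary[0]["summary_text"])
```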

## 🔧 Fine-tuning/Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
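As a point of reference, here is a hedged sketch of how such a configuration is expressed with `Seq2SeqTrainingArguments`. Every numeric value below is an assumption chosen to be typical for fine-tuning distilbart-cnn-12-6, not the authors' actual configuration, except the two epochs and per-epoch evaluation, which follow from the results table below.

```python
# Hedged sketch of a Seq2SeqTrainer configuration; all values marked
# "assumed" are illustrative, not the values used for this model.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="buod-distilbart",    # placeholder output path
    learning_rate=2e-5,              # assumed
    per_device_train_batch_size=8,   # assumed
    per_device_eval_batch_size=8,    # assumed
    num_train_epochs=2,              # matches the two epochs in the results table
    evaluation_strategy="epoch",     # evaluation was run once per epoch
    predict_with_generate=True,      # ROUGE is computed on generated summaries
    fp16=True,                       # assumed
)
# training_args is then passed to a Seq2SeqTrainer together with the
# model, tokenizer, datasets, and a ROUGE compute_metrics function.
```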

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| 2.1377        | 1.0   | 586  | 1.8792          | 49.8737 | 22.7881 | 33.6698 | 45.8037   |
| 1.5731        | 2.0   | 1172 | 1.8049          | 50.5143 | 23.2481 | 34.135  | 46.4261   |
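The ROUGE columns above are of the kind produced by the Hugging Face `evaluate` library. A minimal sketch of computing them follows; the example texts are placeholders, not items from the KAMI-3000 dataset.

```python
# Minimal ROUGE computation sketch with the `evaluate` library;
# the prediction/reference pair below is a placeholder example.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["Nagdeklara ng state of calamity ang lungsod."]
references = ["Nagdeklara ng state of calamity ang lungsod matapos ang bagyo."]

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # dict with rouge1, rouge2, rougeL, rougeLsum F-measures
```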

### Framework versions