<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->
# t5-small-finetuned-epoch15
This model is a fine-tuned version of t5-small on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.4424
- Rouge1: 30.6227
- Rouge2: 18.36
- Rougel: 27.2654
- Rougelsum: 29.1342
- Gen Len: 18.9921
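Given the ROUGE metrics above, the model was most likely fine-tuned for summarization. A minimal inference sketch (the base `t5-small` identifier is used as a placeholder below; substitute this checkpoint's actual Hub id or local path):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder checkpoint id -- replace with the fine-tuned model's path or Hub id.
checkpoint = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# T5 expects a task prefix; "summarize: " matches a summarization fine-tune.
text = (
    "summarize: The tower is 324 metres tall, about the same height as "
    "an 81-storey building, and was the tallest man-made structure in "
    "the world for 41 years until the Chrysler Building was finished."
)
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
# Gen Len in the table hovers around 19 tokens, so max_length=20 is a
# reasonable generation cap for this checkpoint.
outputs = model.generate(**inputs, max_length=20)
summary = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(summary)
```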
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
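The optimizer and linear schedule above can be sketched as follows. This is an illustrative reconstruction, not the Trainer's exact code: a dummy `torch.nn.Linear` module stands in for the T5 parameters, and the total step count (765 steps/epoch × 15 epochs = 11475) is taken from the results table below.

```python
import torch
from transformers import get_linear_schedule_with_warmup

# Dummy stand-in module; the actual run optimized t5-small's parameters.
model = torch.nn.Linear(4, 4)

# Hyperparameters as listed in this card.
optimizer = torch.optim.Adam(
    model.parameters(), lr=2e-05, betas=(0.9, 0.999), eps=1e-08
)

# 765 steps per epoch x 15 epochs = 11475 total steps (matches the table).
total_steps = 765 * 15
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=total_steps
)

print(scheduler.get_last_lr()[0])  # starts at the configured 2e-05
for _ in range(total_steps):
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr()[0])  # linear decay reaches 0 at the final step
```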
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 1.9685 | 1.0 | 765 | 1.5842 | 29.1169 | 16.8705 | 25.8125 | 27.5815 | 18.9882 |
| 1.749 | 2.0 | 1530 | 1.5374 | 29.5968 | 17.2047 | 26.2123 | 28.0529 | 18.9948 |
| 1.7152 | 3.0 | 2295 | 1.5102 | 29.9897 | 17.6648 | 26.6741 | 28.4953 | 18.9895 |
| 1.6855 | 4.0 | 3060 | 1.4932 | 30.1755 | 17.8401 | 26.8169 | 28.6383 | 18.9928 |
| 1.6613 | 5.0 | 3825 | 1.4809 | 30.2173 | 17.8557 | 26.7923 | 28.6687 | 18.9902 |
| 1.644 | 6.0 | 4590 | 1.4716 | 30.3471 | 18.0114 | 26.9183 | 28.7556 | 18.9908 |
| 1.6306 | 7.0 | 5355 | 1.4647 | 30.3652 | 18.0496 | 26.9557 | 28.826 | 18.9915 |
| 1.6153 | 8.0 | 6120 | 1.4588 | 30.5138 | 18.2045 | 27.1705 | 29.0017 | 18.9908 |
| 1.6128 | 9.0 | 6885 | 1.4542 | 30.5644 | 18.2632 | 27.203 | 29.0705 | 18.9921 |
| 1.5992 | 10.0 | 7650 | 1.4492 | 30.5234 | 18.1797 | 27.1541 | 29.0238 | 18.9908 |
| 1.6088 | 11.0 | 8415 | 1.4461 | 30.6294 | 18.2865 | 27.2232 | 29.1328 | 18.9921 |
| 1.5933 | 12.0 | 9180 | 1.4448 | 30.5945 | 18.2752 | 27.2145 | 29.1082 | 18.9921 |
| 1.5851 | 13.0 | 9945 | 1.4431 | 30.6254 | 18.3252 | 27.2556 | 29.1329 | 18.9921 |
| 1.5869 | 14.0 | 10710 | 1.4429 | 30.5997 | 18.3426 | 27.2532 | 29.1219 | 18.9921 |
| 1.5794 | 15.0 | 11475 | 1.4424 | 30.6227 | 18.36 | 27.2654 | 29.1342 | 18.9921 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Tokenizers 0.13.2