<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->
# t5-base-finetuned-qg-medium-hard-qns
This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.5919
- Rouge1: 38.6117
- Rouge2: 21.3082
- Rougel: 35.7294
- Rougelsum: 35.4192
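The Rouge1 score above is a unigram-overlap F1 score, reported here scaled by 100. As a point of reference, a minimal sketch of the computation (the actual `rouge_score` package used by the Trainer also lowercases, strips punctuation, and stems before matching):

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """Unigram-overlap F1 between a prediction and a reference.

    Simplified sketch: whitespace tokenization only; real ROUGE
    implementations apply additional normalization and stemming.
    """
    pred_tokens = prediction.split()
    ref_tokens = reference.split()
    # Clipped unigram overlap: each reference token counts at most once.
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```

Rouge2 is the same F1 over bigrams, and RougeL/RougeLsum use longest-common-subsequence overlap instead of n-gram counts.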
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
- mixed_precision_training: Native AMP
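With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from 2e-05 to 0 over the full run (16 epochs × 73 optimizer steps per epoch = 1168 steps, matching the step counts in the results table). A minimal sketch of that schedule, assuming zero warmup steps:

```python
BASE_LR = 2e-05
TOTAL_STEPS = 1168  # 16 epochs x 73 optimizer steps per epoch

def linear_lr(step: int, total_steps: int = TOTAL_STEPS, base_lr: float = BASE_LR) -> float:
    """Learning rate at a given optimizer step under linear decay with zero warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

For example, the rate is 2e-05 at step 0, halves to 1e-05 at the midpoint (step 584), and reaches 0 at step 1168.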
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| No log        | 1.0   | 73   | 1.8640          | 31.2085 | 11.6418 | 26.1137 | 26.2911   |
| No log        | 2.0   | 146  | 1.6488          | 29.6798 | 10.9223 | 26.7442 | 26.9736   |
| No log        | 3.0   | 219  | 1.6045          | 33.6703 | 11.7038 | 30.167  | 29.9192   |
| No log        | 4.0   | 292  | 1.5812          | 36.6758 | 17.748  | 33.739  | 33.4974   |
| No log        | 5.0   | 365  | 1.5879          | 33.3704 | 16.4099 | 31.7658 | 31.3874   |
| No log        | 6.0   | 438  | 1.5786          | 34.1216 | 14.9588 | 30.9584 | 30.9277   |
| 1.7533        | 7.0   | 511  | 1.5804          | 34.8267 | 15.7046 | 32.0877 | 31.9317   |
| 1.7533        | 8.0   | 584  | 1.5861          | 33.2539 | 12.728  | 30.551  | 30.2299   |
| 1.7533        | 9.0   | 657  | 1.5911          | 38.4406 | 20.5922 | 36.4267 | 36.0426   |
| 1.7533        | 10.0  | 730  | 1.5827          | 33.3421 | 16.0455 | 29.974  | 29.5357   |
| 1.7533        | 11.0  | 803  | 1.5834          | 42.3363 | 24.6712 | 40.4291 | 40.0842   |
| 1.7533        | 12.0  | 876  | 1.5889          | 33.268  | 15.5319 | 30.6942 | 30.4347   |
| 1.7533        | 13.0  | 949  | 1.5911          | 42.1265 | 23.1983 | 39.5768 | 39.2304   |
| 1.2341        | 14.0  | 1022 | 1.5926          | 35.0279 | 15.825  | 32.0736 | 32.0093   |
| 1.2341        | 15.0  | 1095 | 1.5912          | 38.362  | 17.6108 | 35.3148 | 35.0558   |
| 1.2341        | 16.0  | 1168 | 1.5919          | 38.6117 | 21.3082 | 35.7294 | 35.4192   |
### Framework versions
- Transformers 4.24.0
- Pytorch 1.12.1+cu113
- Datasets 2.7.1
- Tokenizers 0.13.2