# bart-text-simplification_1e4_adafactor

This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.8377
- Rouge1: 60.5348
- Rouge2: 41.6762
- RougeL: 55.5994
- RougeLsum: 55.5841
- Gen Len: 18.7487
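For convenience, a minimal inference sketch using the `text2text-generation` pipeline is shown below. The model id is assumed from the card title and may differ from the actual Hub path, and `max_length=20` is an assumption chosen to roughly match the reported Gen Len of ~18.7; adjust both to your setup.

```python
# Minimal inference sketch; the model id below is assumed from the card title
# and may differ from the actual Hub path.
from transformers import pipeline

simplifier = pipeline(
    "text2text-generation",
    model="bart-text-simplification_1e4_adafactor",  # assumed repo id
)

text = "The committee's deliberations culminated in a unanimous resolution."
# max_length=20 is an assumption roughly matching the reported Gen Len (~18.7).
print(simplifier(text, max_length=20)[0]["generated_text"])
```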
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
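As a reproducibility aid, the sketch below maps these hyperparameters onto `Seq2SeqTrainingArguments` (Transformers 4.30-era API). The `output_dir`, evaluation strategy, and generation flag are assumptions not recorded above; dataset, tokenizer, model, and `Seq2SeqTrainer` wiring are omitted.

```python
# Sketch only: maps the listed hyperparameters onto Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-text-simplification_1e4_adafactor",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=128,  # assumes single-device training
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",      # assumed: metrics are reported per epoch
    predict_with_generate=True,       # assumed: needed for ROUGE during eval
)
```

The optimizer line above matches the Trainer's default Adam settings, so no explicit optimizer argument is set in this sketch.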
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 0.1741        | 1.0   | 1163  | 0.6416          | 62.4    | 44.1316 | 57.9029 | 57.8644   | 18.8482 |
| 0.1553        | 2.0   | 2326  | 0.6504          | 62.2879 | 43.9281 | 57.4714 | 57.461    | 18.8063 |
| 0.1369        | 3.0   | 3489  | 0.6656          | 61.2481 | 42.605  | 56.5118 | 56.4636   | 18.733  |
| 0.1286        | 4.0   | 4652  | 0.6906          | 61.3015 | 42.1608 | 56.2688 | 56.1707   | 18.7487 |
| 0.1141        | 5.0   | 5815  | 0.7082          | 62.1771 | 43.1481 | 57.0231 | 57.0673   | 18.911  |
| 0.1016        | 6.0   | 6978  | 0.7188          | 61.408  | 42.2759 | 56.1699 | 56.1779   | 18.8377 |
| 0.0961        | 7.0   | 8141  | 0.7334          | 60.802  | 41.9149 | 56.0171 | 56.0279   | 18.8168 |
| 0.0869        | 8.0   | 9304  | 0.7509          | 60.6564 | 41.3587 | 55.4436 | 55.468    | 18.7382 |
| 0.0783        | 9.0   | 10467 | 0.7713          | 60.3551 | 41.8074 | 55.6856 | 55.679    | 18.7173 |
| 0.0751        | 10.0  | 11630 | 0.7785          | 60.378  | 41.6134 | 55.5217 | 55.505    | 18.8325 |
| 0.0679        | 11.0  | 12793 | 0.7835          | 60.5835 | 41.6735 | 55.5469 | 55.5791   | 18.7435 |
| 0.0619        | 12.0  | 13956 | 0.8012          | 60.8152 | 41.2014 | 55.7186 | 55.7233   | 18.9424 |
| 0.0611        | 13.0  | 15119 | 0.8091          | 60.8188 | 41.8074 | 55.6684 | 55.8026   | 18.7958 |
| 0.0568        | 14.0  | 16282 | 0.8175          | 60.9209 | 41.5689 | 55.8838 | 55.8642   | 18.7277 |
| 0.0527        | 15.0  | 17445 | 0.8250          | 61.0215 | 41.9079 | 55.9018 | 55.8709   | 18.9162 |
| 0.0524        | 16.0  | 18608 | 0.8317          | 60.8214 | 41.6554 | 55.8053 | 55.7947   | 18.7277 |
| 0.0504        | 17.0  | 19771 | 0.8310          | 60.6533 | 41.6507 | 55.9289 | 55.9426   | 18.7958 |
| 0.0486        | 18.0  | 20934 | 0.8345          | 60.4722 | 41.5319 | 55.3384 | 55.3655   | 18.6859 |
| 0.0491        | 19.0  | 22097 | 0.8379          | 60.4012 | 41.2452 | 55.5059 | 55.5553   | 18.8115 |
| 0.0489        | 20.0  | 23260 | 0.8377          | 60.5348 | 41.6762 | 55.5994 | 55.5841   | 18.7487 |
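The ROUGE columns above follow the Hugging Face `evaluate` naming convention. As an illustration, the metric can be computed as sketched below; the predictions and references are placeholders, not outputs from the actual evaluation set.

```python
# Illustrative ROUGE computation with the `evaluate` library; inputs are
# placeholders, not the model's real predictions.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["the cat sat on the mat"]
references = ["the cat is on the mat"]
scores = rouge.compute(predictions=predictions, references=references)
# The Trainer reports these scores multiplied by 100 (e.g. rouge1 -> Rouge1).
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```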
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3