# t5-base-finetuned-es-to-pua
This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.4986
- Bleu: 1.7461
- Gen Len: 15.8171
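
The checkpoint can be loaded like any other `transformers` seq2seq model. The snippet below is a minimal inference sketch: the Hub repo id and the input formatting are assumptions (the exact prompt format used during fine-tuning is not documented here), so adjust both to match the actual checkpoint.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id; replace with the actual path of this checkpoint on the Hub.
model_id = "your-username/t5-base-finetuned-es-to-pua"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Assumed input format; apply whatever source-text formatting was used during training.
text = "Hola, ¿cómo estás?"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```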
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of a matching `Seq2SeqTrainingArguments` setup follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
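
For reference, the list above maps onto `Seq2SeqTrainingArguments` roughly as follows. This is a hedged sketch, not the original training script: the output directory, evaluation strategy, and `predict_with_generate` flag are assumptions, and dataset loading plus the `Seq2SeqTrainer` call are omitted.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-finetuned-es-to-pua",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                    # native AMP mixed-precision training
    evaluation_strategy="epoch",  # assumed: the results table reports metrics once per epoch
    predict_with_generate=True,   # assumed: needed to compute BLEU and generation length
)
```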
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
No log | 1.0 | 36 | 3.4870 | 0.0863 | 17.9878 |
No log | 2.0 | 72 | 3.0772 | 0.2333 | 17.622 |
No log | 3.0 | 108 | 2.7865 | 0.2752 | 16.6829 |
No log | 4.0 | 144 | 2.5878 | 0.8782 | 17.9024 |
No log | 5.0 | 180 | 2.4639 | 1.584 | 17.1463 |
No log | 6.0 | 216 | 2.3559 | 0.9321 | 16.8049 |
No log | 7.0 | 252 | 2.2704 | 1.0018 | 17.3902 |
No log | 8.0 | 288 | 2.1956 | 1.2549 | 17.0732 |
No log | 9.0 | 324 | 2.1307 | 0.9709 | 17.4268 |
No log | 10.0 | 360 | 2.0866 | 0.7563 | 17.5 |
No log | 11.0 | 396 | 2.0392 | 0.707 | 17.2439 |
No log | 12.0 | 432 | 1.9920 | 0.8647 | 16.9756 |
No log | 13.0 | 468 | 1.9630 | 0.8724 | 17.8171 |
2.7137 | 14.0 | 504 | 1.9244 | 1.0593 | 17.4146 |
2.7137 | 15.0 | 540 | 1.9010 | 1.6823 | 17.061 |
2.7137 | 16.0 | 576 | 1.8711 | 1.6452 | 16.5732 |
2.7137 | 17.0 | 612 | 1.8475 | 1.6622 | 16.8659 |
2.7137 | 18.0 | 648 | 1.8265 | 2.2968 | 16.7195 |
2.7137 | 19.0 | 684 | 1.8056 | 2.2125 | 16.6098 |
2.7137 | 20.0 | 720 | 1.7962 | 2.3889 | 16.3049 |
2.7137 | 21.0 | 756 | 1.7778 | 2.341 | 16.3537 |
2.7137 | 22.0 | 792 | 1.7626 | 2.3187 | 16.1341 |
2.7137 | 23.0 | 828 | 1.7450 | 2.5281 | 16.0732 |
2.7137 | 24.0 | 864 | 1.7357 | 2.6768 | 15.9268 |
2.7137 | 25.0 | 900 | 1.7177 | 2.3932 | 15.9146 |
2.7137 | 26.0 | 936 | 1.7126 | 2.611 | 15.8537 |
2.7137 | 27.0 | 972 | 1.7088 | 2.2829 | 15.622 |
1.9301 | 28.0 | 1008 | 1.6868 | 2.4441 | 15.8293 |
1.9301 | 29.0 | 1044 | 1.6707 | 2.5402 | 16.0976 |
1.9301 | 30.0 | 1080 | 1.6790 | 2.0723 | 15.561 |
1.9301 | 31.0 | 1116 | 1.6600 | 1.4278 | 15.9146 |
1.9301 | 32.0 | 1152 | 1.6661 | 1.4274 | 15.7317 |
1.9301 | 33.0 | 1188 | 1.6474 | 1.4484 | 15.6463 |
1.9301 | 34.0 | 1224 | 1.6484 | 1.5172 | 15.7805 |
1.9301 | 35.0 | 1260 | 1.6389 | 1.5497 | 15.7561 |
1.9301 | 36.0 | 1296 | 1.6384 | 1.52 | 15.6341 |
1.9301 | 37.0 | 1332 | 1.6304 | 1.4572 | 15.8293 |
1.9301 | 38.0 | 1368 | 1.6163 | 1.4786 | 16.1341 |
1.9301 | 39.0 | 1404 | 1.6116 | 1.5765 | 15.9634 |
1.9301 | 40.0 | 1440 | 1.6020 | 1.5902 | 16.0244 |
1.9301 | 41.0 | 1476 | 1.6064 | 1.6992 | 15.8659 |
1.6368 | 42.0 | 1512 | 1.5949 | 1.5409 | 16.0 |
1.6368 | 43.0 | 1548 | 1.5811 | 1.4916 | 16.2439 |
1.6368 | 44.0 | 1584 | 1.5849 | 1.6047 | 16.2683 |
1.6368 | 45.0 | 1620 | 1.5843 | 1.521 | 15.7073 |
1.6368 | 46.0 | 1656 | 1.5805 | 1.7424 | 15.9878 |
1.6368 | 47.0 | 1692 | 1.5791 | 1.6066 | 15.9268 |
1.6368 | 48.0 | 1728 | 1.5734 | 1.602 | 15.7195 |
1.6368 | 49.0 | 1764 | 1.5649 | 1.5817 | 15.939 |
1.6368 | 50.0 | 1800 | 1.5654 | 1.6469 | 15.8293 |
1.6368 | 51.0 | 1836 | 1.5587 | 1.7048 | 15.6463 |
1.6368 | 52.0 | 1872 | 1.5553 | 1.5203 | 15.8415 |
1.6368 | 53.0 | 1908 | 1.5500 | 1.5646 | 15.6951 |
1.6368 | 54.0 | 1944 | 1.5532 | 1.5003 | 15.7195 |
1.6368 | 55.0 | 1980 | 1.5344 | 1.5359 | 16.1098 |
1.4554 | 56.0 | 2016 | 1.5370 | 1.6052 | 15.6951 |
1.4554 | 57.0 | 2052 | 1.5394 | 1.5299 | 15.9146 |
1.4554 | 58.0 | 2088 | 1.5399 | 1.6024 | 15.6829 |
1.4554 | 59.0 | 2124 | 1.5403 | 1.6342 | 15.6829 |
1.4554 | 60.0 | 2160 | 1.5361 | 1.609 | 15.7195 |
1.4554 | 61.0 | 2196 | 1.5308 | 1.6753 | 15.878 |
1.4554 | 62.0 | 2232 | 1.5211 | 1.6381 | 16.0976 |
1.4554 | 63.0 | 2268 | 1.5242 | 1.7172 | 15.622 |
1.4554 | 64.0 | 2304 | 1.5215 | 1.6888 | 15.9024 |
1.4554 | 65.0 | 2340 | 1.5146 | 1.6619 | 16.0 |
1.4554 | 66.0 | 2376 | 1.5173 | 1.7203 | 15.8537 |
1.4554 | 67.0 | 2412 | 1.5235 | 1.7363 | 15.7317 |
1.4554 | 68.0 | 2448 | 1.5125 | 1.7295 | 16.0366 |
1.4554 | 69.0 | 2484 | 1.5141 | 1.7005 | 15.8902 |
1.3341 | 70.0 | 2520 | 1.5162 | 1.8302 | 15.7927 |
1.3341 | 71.0 | 2556 | 1.5129 | 1.8278 | 15.9024 |
1.3341 | 72.0 | 2592 | 1.5123 | 1.7764 | 15.6829 |
1.3341 | 73.0 | 2628 | 1.5046 | 1.7259 | 15.9634 |
1.3341 | 74.0 | 2664 | 1.5069 | 1.6517 | 15.9024 |
1.3341 | 75.0 | 2700 | 1.5026 | 1.7334 | 15.9024 |
1.3341 | 76.0 | 2736 | 1.4923 | 1.7531 | 15.9268 |
1.3341 | 77.0 | 2772 | 1.4956 | 1.7338 | 15.7561 |
1.3341 | 78.0 | 2808 | 1.4996 | 1.6956 | 15.7805 |
1.3341 | 79.0 | 2844 | 1.5010 | 1.7299 | 15.9268 |
1.3341 | 80.0 | 2880 | 1.5012 | 1.7097 | 15.9024 |
1.3341 | 81.0 | 2916 | 1.5032 | 1.7689 | 15.8902 |
1.3341 | 82.0 | 2952 | 1.5025 | 1.7353 | 15.939 |
1.3341 | 83.0 | 2988 | 1.5004 | 1.7472 | 15.9512 |
1.2568 | 84.0 | 3024 | 1.4989 | 1.7171 | 15.9756 |
1.2568 | 85.0 | 3060 | 1.5015 | 1.7704 | 15.9024 |
1.2568 | 86.0 | 3096 | 1.5017 | 1.7838 | 15.9024 |
1.2568 | 87.0 | 3132 | 1.5022 | 1.7562 | 16.0366 |
1.2568 | 88.0 | 3168 | 1.5004 | 1.7633 | 16.0366 |
1.2568 | 89.0 | 3204 | 1.4995 | 1.7633 | 15.9756 |
1.2568 | 90.0 | 3240 | 1.5038 | 1.766 | 15.8537 |
1.2568 | 91.0 | 3276 | 1.5001 | 1.7764 | 16.0 |
1.2568 | 92.0 | 3312 | 1.5010 | 1.7707 | 15.878 |
1.2568 | 93.0 | 3348 | 1.4996 | 1.7633 | 15.9268 |
1.2568 | 94.0 | 3384 | 1.5011 | 1.7453 | 15.8171 |
1.2568 | 95.0 | 3420 | 1.5014 | 1.7385 | 15.7927 |
1.2568 | 96.0 | 3456 | 1.4996 | 1.7253 | 15.7927 |
1.2568 | 97.0 | 3492 | 1.4988 | 1.7459 | 15.8049 |
1.2103 | 98.0 | 3528 | 1.4978 | 1.7461 | 15.8171 |
1.2103 | 99.0 | 3564 | 1.4986 | 1.7461 | 15.8293 |
1.2103 | 100.0 | 3600 | 1.4986 | 1.7461 | 15.8171 |
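
The Bleu and Gen Len columns are the kind of figures produced by a `compute_metrics` hook when `predict_with_generate` is enabled. The snippet below is a hedged sketch of such a hook using the `evaluate` library's sacrebleu metric; the actual implementation used for this run is not documented, and the tokenizer checkpoint shown is an assumption.

```python
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-base")  # assumed; use the fine-tuned checkpoint's tokenizer
metric = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Label padding uses -100 for the loss; swap in the pad token before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    bleu = metric.compute(predictions=decoded_preds,
                          references=[[ref] for ref in decoded_labels])
    # Average generated length in tokens, ignoring padding.
    gen_len = np.mean([np.count_nonzero(p != tokenizer.pad_token_id) for p in preds])
    return {"bleu": bleu["score"], "gen_len": gen_len}
```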
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.0
- Datasets 2.10.1
- Tokenizers 0.13.2