---
tags:
- generated_from_trainer
---


# t5-small-finetuned-text2log-finetuned-nl-to-fol-finetuned-nl-to-fol-finetuned-nl-to-fol-version2

This model is a fine-tuned version of [anki08/t5-small-finetuned-text2log-finetuned-nl-to-fol-finetuned-nl-to-fol](https://huggingface.co/anki08/t5-small-finetuned-text2log-finetuned-nl-to-fol-finetuned-nl-to-fol) on an unknown dataset. It achieves the following results on the evaluation set (final epoch of the table below):
- Loss: 0.0069
- Bleu: 28.1311
- Gen Len: 18.7412

## Model description

More information needed

## Intended uses & limitations

More information needed
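
Judging by the model names (text2log, nl-to-fol), the checkpoint translates natural-language sentences into first-order-logic formulas. Pending fuller documentation, here is a minimal inference sketch, assuming the checkpoint is published under the same anki08 namespace as its base model and accepts plain sentences as input; the example sentence and generation length are placeholders:

```python
# Minimal inference sketch; the repository id, input format, and generation
# settings are assumptions, not documented behavior of this checkpoint.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "anki08/t5-small-finetuned-text2log-finetuned-nl-to-fol-finetuned-nl-to-fol-finetuned-nl-to-fol-version2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

sentence = "Every student reads a book."  # hypothetical input
inputs = tokenizer(sentence, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```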

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training were not preserved when this card was generated. From the results table below, training ran for 100 epochs at 22 optimizer steps per epoch.
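
As an illustration only, such a run is typically configured through `Seq2SeqTrainingArguments`; every value below is a placeholder except `num_train_epochs`, which matches the 100 epochs in the results table:

```python
# Illustrative configuration sketch; all values are placeholders except
# num_train_epochs. These are NOT the actual settings of this run.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="outputs",
    evaluation_strategy="epoch",     # the table logs one eval per epoch
    learning_rate=2e-5,              # placeholder
    per_device_train_batch_size=16,  # placeholder
    per_device_eval_batch_size=16,   # placeholder
    num_train_epochs=100,            # per the results table
    predict_with_generate=True,      # required for Bleu / Gen Len metrics
)
```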

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:----:|:-------:|
| No log | 1.0 | 22 | 0.0692 | 27.4908 | 18.7353 |
| No log | 2.0 | 44 | 0.0631 | 27.554 | 18.7294 |
| No log | 3.0 | 66 | 0.0533 | 27.6007 | 18.7294 |
| No log | 4.0 | 88 | 0.0484 | 27.6446 | 18.7294 |
| No log | 5.0 | 110 | 0.0439 | 27.6401 | 18.7294 |
| No log | 6.0 | 132 | 0.0404 | 27.5117 | 18.7294 |
| No log | 7.0 | 154 | 0.0389 | 27.6358 | 18.7294 |
| No log | 8.0 | 176 | 0.0362 | 27.6358 | 18.7294 |
| No log | 9.0 | 198 | 0.0339 | 27.5731 | 18.7294 |
| No log | 10.0 | 220 | 0.0319 | 27.2326 | 18.6882 |
| No log | 11.0 | 242 | 0.0298 | 27.2326 | 18.6882 |
| No log | 12.0 | 264 | 0.0293 | 27.5498 | 18.7294 |
| No log | 13.0 | 286 | 0.0276 | 27.6566 | 18.7294 |
| No log | 14.0 | 308 | 0.0268 | 27.6566 | 18.7294 |
| No log | 15.0 | 330 | 0.0251 | 27.6107 | 18.7294 |
| No log | 16.0 | 352 | 0.0239 | 27.7096 | 18.7294 |
| No log | 17.0 | 374 | 0.0228 | 27.6716 | 18.7294 |
| No log | 18.0 | 396 | 0.0231 | 27.8083 | 18.7294 |
| No log | 19.0 | 418 | 0.0218 | 27.4838 | 18.6882 |
| No log | 20.0 | 440 | 0.0212 | 27.4712 | 18.6882 |
| No log | 21.0 | 462 | 0.0197 | 27.8787 | 18.7353 |
| No log | 22.0 | 484 | 0.0207 | 27.6899 | 18.6941 |
| 0.1026 | 23.0 | 506 | 0.0186 | 27.6376 | 18.6941 |
| 0.1026 | 24.0 | 528 | 0.0202 | 27.6672 | 18.6941 |
| 0.1026 | 25.0 | 550 | 0.0174 | 28.0172 | 18.7412 |
| 0.1026 | 26.0 | 572 | 0.0170 | 27.8714 | 18.7412 |
| 0.1026 | 27.0 | 594 | 0.0164 | 27.7423 | 18.7412 |
| 0.1026 | 28.0 | 616 | 0.0164 | 27.8278 | 18.7412 |
| 0.1026 | 29.0 | 638 | 0.0163 | 27.8278 | 18.7412 |
| 0.1026 | 30.0 | 660 | 0.0158 | 27.907 | 18.7412 |
| 0.1026 | 31.0 | 682 | 0.0165 | 27.7752 | 18.7412 |
| 0.1026 | 32.0 | 704 | 0.0147 | 27.8284 | 18.7412 |
| 0.1026 | 33.0 | 726 | 0.0150 | 27.8862 | 18.7412 |
| 0.1026 | 34.0 | 748 | 0.0148 | 27.8402 | 18.7412 |
| 0.1026 | 35.0 | 770 | 0.0141 | 27.8353 | 18.7412 |
| 0.1026 | 36.0 | 792 | 0.0142 | 27.858 | 18.7412 |
| 0.1026 | 37.0 | 814 | 0.0143 | 27.858 | 18.7412 |
| 0.1026 | 38.0 | 836 | 0.0158 | 27.8353 | 18.7412 |
| 0.1026 | 39.0 | 858 | 0.0125 | 27.8913 | 18.7412 |
| 0.1026 | 40.0 | 880 | 0.0121 | 27.9167 | 18.7412 |
| 0.1026 | 41.0 | 902 | 0.0122 | 27.9569 | 18.7412 |
| 0.1026 | 42.0 | 924 | 0.0126 | 27.9569 | 18.7412 |
| 0.1026 | 43.0 | 946 | 0.0120 | 28.001 | 18.7412 |
| 0.1026 | 44.0 | 968 | 0.0125 | 28.0079 | 18.7412 |
| 0.1026 | 45.0 | 990 | 0.0115 | 28.0079 | 18.7412 |
| 0.072 | 46.0 | 1012 | 0.0113 | 27.9851 | 18.7412 |
| 0.072 | 47.0 | 1034 | 0.0113 | 28.0184 | 18.7412 |
| 0.072 | 48.0 | 1056 | 0.0110 | 28.0184 | 18.7412 |
| 0.072 | 49.0 | 1078 | 0.0108 | 28.0184 | 18.7412 |
| 0.072 | 50.0 | 1100 | 0.0107 | 28.0184 | 18.7412 |
| 0.072 | 51.0 | 1122 | 0.0101 | 28.0184 | 18.7412 |
| 0.072 | 52.0 | 1144 | 0.0102 | 28.0184 | 18.7412 |
| 0.072 | 53.0 | 1166 | 0.0099 | 28.0184 | 18.7412 |
| 0.072 | 54.0 | 1188 | 0.0100 | 28.0184 | 18.7412 |
| 0.072 | 55.0 | 1210 | 0.0102 | 28.0184 | 18.7412 |
| 0.072 | 56.0 | 1232 | 0.0095 | 28.0184 | 18.7412 |
| 0.072 | 57.0 | 1254 | 0.0098 | 28.0184 | 18.7412 |
| 0.072 | 58.0 | 1276 | 0.0092 | 28.0184 | 18.7412 |
| 0.072 | 59.0 | 1298 | 0.0090 | 28.0184 | 18.7412 |
| 0.072 | 60.0 | 1320 | 0.0095 | 28.0184 | 18.7412 |
| 0.072 | 61.0 | 1342 | 0.0092 | 27.9674 | 18.7412 |
| 0.072 | 62.0 | 1364 | 0.0091 | 27.9419 | 18.7412 |
| 0.072 | 63.0 | 1386 | 0.0100 | 27.9419 | 18.7412 |
| 0.072 | 64.0 | 1408 | 0.0084 | 28.0752 | 18.7412 |
| 0.072 | 65.0 | 1430 | 0.0086 | 28.0192 | 18.7412 |
| 0.072 | 66.0 | 1452 | 0.0084 | 28.0192 | 18.7412 |
| 0.072 | 67.0 | 1474 | 0.0085 | 28.0192 | 18.7412 |
| 0.072 | 68.0 | 1496 | 0.0087 | 28.0192 | 18.7412 |
| 0.0575 | 69.0 | 1518 | 0.0084 | 28.0192 | 18.7412 |
| 0.0575 | 70.0 | 1540 | 0.0080 | 28.0192 | 18.7412 |
| 0.0575 | 71.0 | 1562 | 0.0082 | 28.0192 | 18.7412 |
| 0.0575 | 72.0 | 1584 | 0.0080 | 28.0192 | 18.7412 |
| 0.0575 | 73.0 | 1606 | 0.0075 | 28.0192 | 18.7412 |
| 0.0575 | 74.0 | 1628 | 0.0079 | 28.0192 | 18.7412 |
| 0.0575 | 75.0 | 1650 | 0.0078 | 28.0752 | 18.7412 |
| 0.0575 | 76.0 | 1672 | 0.0076 | 28.1311 | 18.7412 |
| 0.0575 | 77.0 | 1694 | 0.0073 | 28.1311 | 18.7412 |
| 0.0575 | 78.0 | 1716 | 0.0074 | 28.1311 | 18.7412 |
| 0.0575 | 79.0 | 1738 | 0.0072 | 28.1311 | 18.7412 |
| 0.0575 | 80.0 | 1760 | 0.0078 | 28.1311 | 18.7412 |
| 0.0575 | 81.0 | 1782 | 0.0077 | 28.1311 | 18.7412 |
| 0.0575 | 82.0 | 1804 | 0.0071 | 28.1311 | 18.7412 |
| 0.0575 | 83.0 | 1826 | 0.0072 | 28.1311 | 18.7412 |
| 0.0575 | 84.0 | 1848 | 0.0075 | 28.1311 | 18.7412 |
| 0.0575 | 85.0 | 1870 | 0.0071 | 28.1311 | 18.7412 |
| 0.0575 | 86.0 | 1892 | 0.0070 | 28.1311 | 18.7412 |
| 0.0575 | 87.0 | 1914 | 0.0069 | 28.1311 | 18.7412 |
| 0.0575 | 88.0 | 1936 | 0.0069 | 28.1311 | 18.7412 |
| 0.0575 | 89.0 | 1958 | 0.0069 | 28.1311 | 18.7412 |
| 0.0575 | 90.0 | 1980 | 0.0069 | 28.1311 | 18.7412 |
| 0.0509 | 91.0 | 2002 | 0.0069 | 28.1311 | 18.7412 |
| 0.0509 | 92.0 | 2024 | 0.0070 | 28.1311 | 18.7412 |
| 0.0509 | 93.0 | 2046 | 0.0069 | 28.1311 | 18.7412 |
| 0.0509 | 94.0 | 2068 | 0.0070 | 28.1311 | 18.7412 |
| 0.0509 | 95.0 | 2090 | 0.0069 | 28.1311 | 18.7412 |
| 0.0509 | 96.0 | 2112 | 0.0069 | 28.1311 | 18.7412 |
| 0.0509 | 97.0 | 2134 | 0.0069 | 28.1311 | 18.7412 |
| 0.0509 | 98.0 | 2156 | 0.0069 | 28.1311 | 18.7412 |
| 0.0509 | 99.0 | 2178 | 0.0069 | 28.1311 | 18.7412 |
| 0.0509 | 100.0 | 2200 | 0.0069 | 28.1311 | 18.7412 |
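
The Bleu and Gen Len columns come from a `compute_metrics` hook passed to the `Seq2SeqTrainer`; the implementation used for this run is not recorded in this card. A typical sacrebleu-based hook, offered only as a sketch of how such numbers are usually produced:

```python
# Hypothetical compute_metrics hook; the actual implementation behind the
# Bleu / Gen Len columns above is not recorded in this card.
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")  # stand-in tokenizer
bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Labels are padded with -100 for the loss; restore pad tokens to decode.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = bleu.compute(
        predictions=[p.strip() for p in decoded_preds],
        references=[[l.strip()] for l in decoded_labels],
    )
    gen_len = np.mean([np.count_nonzero(p != tokenizer.pad_token_id) for p in preds])
    return {"bleu": result["score"], "gen_len": round(float(gen_len), 4)}
```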

### Framework versions