---
tags:
- generated_from_trainer
---


# text_shortening_model_v57

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset. It achieves the following results on the evaluation set (values from the final epoch of the training results table below):
- Loss: 0.8302
- Rouge1: 0.676
- Rouge2: 0.4816
- Rougel: 0.6172
- Rougelsum: 0.6179
- Bert precision: 0.9135
- Bert recall: 0.9082
- Bert f1-score: 0.9104
- Average word count: 8.1161
- Max word count: 16
- Min word count: 4
- Average token count: 12.2946
- % shortened texts with length > 12: 6.25
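
A minimal inference sketch follows. The repo id `text_shortening_model_v57` is an assumed placeholder (substitute the actual checkpoint path), and it is also assumed that no T5 task prefix is required; neither detail is confirmed by this card.

```python
# Hedged usage sketch; the model id below is a placeholder, not a confirmed
# Hugging Face Hub repo, and no task prefix is assumed.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "text_shortening_model_v57"  # hypothetical checkpoint path
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "The quick brown fox jumps over the lazy dog in the bright morning sun."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# max_length=20 roughly mirrors the short outputs in the results table
# (average word counts around 8-12); tune as needed.
outputs = model.generate(**inputs, max_length=20, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```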

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter list was not preserved in this card; only the epoch count (50, with 49 optimizer steps per epoch) can be read off the training results below.
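
For orientation, here is a hedged sketch of how a `t5-small` fine-tune of this shape is commonly set up with `Seq2SeqTrainer`. Every hyperparameter value is an assumed placeholder, not the configuration actually used for this model; only `num_train_epochs=50` is consistent with the results table, and the toy dataset stands in for the unspecified training data.

```python
# Hedged training sketch; all values are placeholders, NOT this model's
# actual configuration. Only num_train_epochs=50 matches the results table.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Toy stand-in for the (unspecified) training/evaluation data.
raw = Dataset.from_dict({
    "text": ["a very long sentence that should be made much shorter"],
    "short": ["a short sentence"],
})

def preprocess(batch):
    # Tokenize inputs and gold shortenings; labels go in "labels".
    model_inputs = tokenizer(batch["text"], truncation=True)
    labels = tokenizer(text_target=batch["short"], truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v57",
    learning_rate=5e-5,             # placeholder
    per_device_train_batch_size=8,  # placeholder
    num_train_epochs=50,            # matches the 50 epochs reported below
    evaluation_strategy="epoch",    # per-epoch eval matches the table below
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,         # placeholder: reuses the toy data
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```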

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bert precision | Bert recall | Bert f1-score | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------------:|:-----------:|:-------------:|:------------------:|:--------------:|:--------------:|:-------------------:|:----------------------------------:|
| 2.6374 | 1.0 | 49 | 2.0882 | 0.3209 | 0.1826 | 0.2886 | 0.2875 | 0.7596 | 0.789 | 0.7725 | 9.6562 | 17 | 0 | 15.9018 | 30.3571 |
| 2.1652 | 2.0 | 98 | 1.7029 | 0.3216 | 0.1848 | 0.2939 | 0.2932 | 0.7531 | 0.7906 | 0.77 | 9.7054 | 17 | 0 | 16.0402 | 25.0 |
| 1.8731 | 3.0 | 147 | 1.4874 | 0.4234 | 0.277 | 0.3972 | 0.3951 | 0.8117 | 0.8225 | 0.8158 | 8.7455 | 17 | 0 | 14.567 | 21.4286 |
| 1.6683 | 4.0 | 196 | 1.3346 | 0.5198 | 0.3525 | 0.4889 | 0.4888 | 0.8491 | 0.8338 | 0.8405 | 7.4598 | 17 | 0 | 12.7188 | 13.3929 |
| 1.5287 | 5.0 | 245 | 1.2321 | 0.5535 | 0.377 | 0.5101 | 0.5109 | 0.8713 | 0.8588 | 0.8643 | 7.7143 | 17 | 0 | 12.6384 | 12.0536 |
| 1.4303 | 6.0 | 294 | 1.1638 | 0.5773 | 0.3941 | 0.529 | 0.529 | 0.8848 | 0.8703 | 0.8769 | 7.5 | 17 | 0 | 12.2009 | 9.375 |
| 1.3696 | 7.0 | 343 | 1.1091 | 0.5945 | 0.4125 | 0.5426 | 0.5421 | 0.8946 | 0.8814 | 0.8874 | 7.6473 | 16 | 0 | 12.1875 | 9.375 |
| 1.305 | 8.0 | 392 | 1.0707 | 0.597 | 0.4082 | 0.5434 | 0.5432 | 0.8959 | 0.883 | 0.8889 | 7.6741 | 16 | 0 | 12.2143 | 8.0357 |
| 1.2575 | 9.0 | 441 | 1.0384 | 0.6094 | 0.4193 | 0.5521 | 0.553 | 0.8993 | 0.8868 | 0.8925 | 7.6786 | 17 | 0 | 12.125 | 7.1429 |
| 1.241 | 10.0 | 490 | 1.0125 | 0.617 | 0.423 | 0.5595 | 0.5601 | 0.9038 | 0.8918 | 0.8973 | 7.817 | 17 | 2 | 12.1027 | 7.5893 |
| 1.17 | 11.0 | 539 | 0.9912 | 0.6173 | 0.4236 | 0.5593 | 0.5591 | 0.9052 | 0.892 | 0.8981 | 7.7455 | 17 | 2 | 11.9911 | 6.25 |
| 1.1413 | 12.0 | 588 | 0.9750 | 0.6253 | 0.4322 | 0.5661 | 0.5663 | 0.9049 | 0.8935 | 0.8986 | 7.9286 | 17 | 2 | 12.1786 | 7.1429 |
| 1.1367 | 13.0 | 637 | 0.9586 | 0.63 | 0.4356 | 0.5704 | 0.5704 | 0.9068 | 0.8943 | 0.9 | 7.8705 | 17 | 2 | 12.0357 | 6.6964 |
| 1.1101 | 14.0 | 686 | 0.9458 | 0.6273 | 0.4355 | 0.5665 | 0.5671 | 0.9057 | 0.8949 | 0.8998 | 7.942 | 17 | 2 | 12.1384 | 6.6964 |
| 1.0711 | 15.0 | 735 | 0.9374 | 0.6357 | 0.4424 | 0.5718 | 0.5721 | 0.9068 | 0.8969 | 0.9013 | 8.0179 | 17 | 2 | 12.1875 | 6.6964 |
| 1.0553 | 16.0 | 784 | 0.9282 | 0.6378 | 0.4455 | 0.5752 | 0.5756 | 0.9084 | 0.8969 | 0.9022 | 7.8571 | 16 | 2 | 12.0536 | 4.9107 |
| 1.047 | 17.0 | 833 | 0.9188 | 0.6439 | 0.4525 | 0.5821 | 0.5825 | 0.9085 | 0.8996 | 0.9035 | 7.9955 | 16 | 2 | 12.1741 | 5.3571 |
| 1.0201 | 18.0 | 882 | 0.9104 | 0.643 | 0.4536 | 0.5832 | 0.5836 | 0.9083 | 0.8997 | 0.9035 | 8.0134 | 16 | 2 | 12.2098 | 5.8036 |
| 1.0228 | 19.0 | 931 | 0.9023 | 0.6471 | 0.4601 | 0.5862 | 0.5865 | 0.9101 | 0.902 | 0.9056 | 8.0 | 16 | 2 | 12.1652 | 4.9107 |
| 0.9896 | 20.0 | 980 | 0.8936 | 0.6497 | 0.463 | 0.5882 | 0.5888 | 0.9103 | 0.9017 | 0.9055 | 8.0491 | 16 | 2 | 12.1741 | 5.3571 |
| 0.9815 | 21.0 | 1029 | 0.8873 | 0.6555 | 0.4659 | 0.5937 | 0.5948 | 0.9106 | 0.9025 | 0.9061 | 8.0402 | 16 | 2 | 12.2411 | 5.8036 |
| 0.9877 | 22.0 | 1078 | 0.8828 | 0.6618 | 0.4728 | 0.6005 | 0.6007 | 0.9125 | 0.9047 | 0.9081 | 8.1205 | 16 | 2 | 12.308 | 6.25 |
| 0.9696 | 23.0 | 1127 | 0.8774 | 0.661 | 0.4679 | 0.6 | 0.5994 | 0.9128 | 0.9046 | 0.9082 | 8.0938 | 16 | 3 | 12.2902 | 6.25 |
| 0.9556 | 24.0 | 1176 | 0.8737 | 0.6613 | 0.4717 | 0.6023 | 0.6022 | 0.913 | 0.9052 | 0.9086 | 8.0893 | 16 | 3 | 12.3036 | 6.25 |
| 0.95 | 25.0 | 1225 | 0.8703 | 0.6636 | 0.4725 | 0.6044 | 0.6041 | 0.913 | 0.9055 | 0.9088 | 8.1384 | 16 | 3 | 12.3616 | 6.25 |
| 0.9464 | 26.0 | 1274 | 0.8660 | 0.6629 | 0.4723 | 0.6057 | 0.6052 | 0.9125 | 0.9053 | 0.9085 | 8.1562 | 16 | 3 | 12.3482 | 6.25 |
| 0.9189 | 27.0 | 1323 | 0.8605 | 0.6633 | 0.4746 | 0.6084 | 0.6079 | 0.9124 | 0.9052 | 0.9083 | 8.0848 | 16 | 3 | 12.2723 | 6.25 |
| 0.9277 | 28.0 | 1372 | 0.8583 | 0.662 | 0.4731 | 0.6059 | 0.6057 | 0.9118 | 0.9059 | 0.9084 | 8.1607 | 16 | 3 | 12.3304 | 6.25 |
| 0.9142 | 29.0 | 1421 | 0.8550 | 0.6663 | 0.4784 | 0.6106 | 0.6104 | 0.9126 | 0.9073 | 0.9095 | 8.1786 | 16 | 4 | 12.3482 | 6.6964 |
| 0.913 | 30.0 | 1470 | 0.8529 | 0.6656 | 0.477 | 0.6073 | 0.607 | 0.9123 | 0.9073 | 0.9093 | 8.2589 | 16 | 4 | 12.4241 | 7.1429 |
| 0.8984 | 31.0 | 1519 | 0.8507 | 0.6708 | 0.4804 | 0.6114 | 0.6116 | 0.9128 | 0.9083 | 0.9101 | 8.2098 | 16 | 4 | 12.3973 | 6.6964 |
| 0.903 | 32.0 | 1568 | 0.8479 | 0.6728 | 0.4777 | 0.6096 | 0.6096 | 0.9133 | 0.9081 | 0.9103 | 8.2232 | 16 | 4 | 12.3973 | 7.1429 |
| 0.8947 | 33.0 | 1617 | 0.8452 | 0.6741 | 0.4785 | 0.6119 | 0.6116 | 0.9132 | 0.9081 | 0.9101 | 8.1741 | 16 | 4 | 12.3616 | 7.1429 |
| 0.8883 | 34.0 | 1666 | 0.8424 | 0.6733 | 0.4766 | 0.6108 | 0.6107 | 0.9125 | 0.9072 | 0.9094 | 8.1607 | 16 | 4 | 12.3438 | 7.1429 |
| 0.877 | 35.0 | 1715 | 0.8403 | 0.6742 | 0.4799 | 0.6141 | 0.6145 | 0.9133 | 0.908 | 0.9102 | 8.1429 | 16 | 4 | 12.3304 | 6.6964 |
| 0.8612 | 36.0 | 1764 | 0.8393 | 0.6737 | 0.4808 | 0.6141 | 0.6143 | 0.9133 | 0.908 | 0.9102 | 8.1384 | 16 | 4 | 12.3259 | 6.6964 |
| 0.8848 | 37.0 | 1813 | 0.8363 | 0.673 | 0.478 | 0.6131 | 0.6133 | 0.9124 | 0.9074 | 0.9095 | 8.1384 | 16 | 4 | 12.3214 | 6.6964 |
| 0.8717 | 38.0 | 1862 | 0.8363 | 0.6729 | 0.478 | 0.613 | 0.6132 | 0.9129 | 0.9075 | 0.9097 | 8.0848 | 16 | 4 | 12.2545 | 5.8036 |
| 0.8739 | 39.0 | 1911 | 0.8355 | 0.6711 | 0.4775 | 0.6115 | 0.6118 | 0.913 | 0.9072 | 0.9096 | 8.0714 | 16 | 4 | 12.2366 | 5.8036 |
| 0.8569 | 40.0 | 1960 | 0.8343 | 0.672 | 0.4772 | 0.6125 | 0.6128 | 0.9132 | 0.9074 | 0.9098 | 8.0804 | 16 | 4 | 12.2366 | 5.8036 |
| 0.8601 | 41.0 | 2009 | 0.8342 | 0.675 | 0.4831 | 0.6163 | 0.6165 | 0.9139 | 0.9081 | 0.9105 | 8.0982 | 16 | 4 | 12.2634 | 5.8036 |
| 0.8519 | 42.0 | 2058 | 0.8330 | 0.6743 | 0.481 | 0.6147 | 0.6152 | 0.9137 | 0.908 | 0.9104 | 8.1027 | 16 | 4 | 12.2723 | 5.8036 |
| 0.8713 | 43.0 | 2107 | 0.8322 | 0.6757 | 0.4844 | 0.617 | 0.6172 | 0.9133 | 0.9079 | 0.9102 | 8.125 | 16 | 4 | 12.2857 | 6.25 |
| 0.8554 | 44.0 | 2156 | 0.8313 | 0.6746 | 0.4809 | 0.6151 | 0.6154 | 0.9132 | 0.9079 | 0.9101 | 8.1384 | 16 | 4 | 12.2857 | 6.25 |
| 0.8559 | 45.0 | 2205 | 0.8314 | 0.6773 | 0.4849 | 0.6184 | 0.6189 | 0.9137 | 0.9085 | 0.9106 | 8.125 | 16 | 4 | 12.2857 | 6.25 |
| 0.847 | 46.0 | 2254 | 0.8312 | 0.6767 | 0.4829 | 0.6175 | 0.6176 | 0.9136 | 0.9084 | 0.9105 | 8.125 | 16 | 4 | 12.2857 | 6.25 |
| 0.8588 | 47.0 | 2303 | 0.8306 | 0.6754 | 0.4814 | 0.6163 | 0.6163 | 0.9131 | 0.9082 | 0.9102 | 8.1473 | 16 | 4 | 12.3214 | 6.6964 |
| 0.8484 | 48.0 | 2352 | 0.8304 | 0.676 | 0.4816 | 0.6172 | 0.6179 | 0.9135 | 0.9082 | 0.9104 | 8.1161 | 16 | 4 | 12.2946 | 6.25 |
| 0.8514 | 49.0 | 2401 | 0.8303 | 0.676 | 0.4816 | 0.6172 | 0.6179 | 0.9135 | 0.9082 | 0.9104 | 8.1161 | 16 | 4 | 12.2946 | 6.25 |
| 0.8562 | 50.0 | 2450 | 0.8302 | 0.676 | 0.4816 | 0.6172 | 0.6179 | 0.9135 | 0.9082 | 0.9104 | 8.1161 | 16 | 4 | 12.2946 | 6.25 |
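
The table mixes overlap metrics (ROUGE, BERTScore) with simple length statistics. The exact evaluation code is not recorded in this card; the sketch below shows one plausible way to compute comparable numbers with the `evaluate` library (the `rouge` metric needs the `rouge_score` package, `bertscore` needs `bert_score`). The example predictions and references are placeholders.

```python
# Hedged metric sketch; mirrors the table's columns but is not the
# evaluation code actually used for this model.
import evaluate

predictions = ["a short fox sentence"]   # placeholder model outputs
references = ["the fox jumps the dog"]   # placeholder gold shortenings

rouge = evaluate.load("rouge")            # yields rouge1/rouge2/rougeL/rougeLsum
bertscore = evaluate.load("bertscore")    # yields precision/recall/f1 lists

rouge_scores = rouge.compute(predictions=predictions, references=references)
bert_scores = bertscore.compute(predictions=predictions,
                                references=references, lang="en")

# Length statistics analogous to the word-count columns.
word_counts = [len(p.split()) for p in predictions]
avg_words = sum(word_counts) / len(word_counts)
max_words, min_words = max(word_counts), min(word_counts)
pct_over_12 = 100.0 * sum(wc > 12 for wc in word_counts) / len(word_counts)

# "Average token count" is presumably tokenizer-based, e.g. with the model's
# tokenizer loaded:
#   avg_tokens = sum(len(tok(p)["input_ids"]) for p in predictions) / len(predictions)

print(rouge_scores["rouge1"], bert_scores["f1"][0],
      avg_words, max_words, min_words, pct_over_12)
```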

### Framework versions