---
tags:
- generated_from_trainer
---


# text_shortening_model_v63

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset (recorded by the Trainer as `None`). Per-epoch results on the evaluation set are reported in the training results table below.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bert precision | Bert recall | Bert f1-score | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------------:|:-----------:|:-------------:|:------------------:|:--------------:|:--------------:|:-------------------:|:----------------------------------:|
| 1.8123 | 1.0 | 57 | 1.0664 | 0.6167 | 0.412 | 0.5601 | 0.5595 | 0.8975 | 0.8952 | 0.8956 | 7.5908 | 15 | 0 | 11.6752 | 4.0921 |
| 1.1902 | 2.0 | 114 | 0.9103 | 0.6564 | 0.4627 | 0.6141 | 0.6137 | 0.9076 | 0.9085 | 0.9074 | 7.4476 | 17 | 2 | 11.578 | 1.7903 |
| 1.0198 | 3.0 | 171 | 0.8631 | 0.6718 | 0.481 | 0.6327 | 0.6328 | 0.907 | 0.9121 | 0.9089 | 7.8363 | 17 | 2 | 11.9744 | 3.8363 |
| 0.9149 | 4.0 | 228 | 0.8266 | 0.674 | 0.4873 | 0.6365 | 0.6361 | 0.9089 | 0.9123 | 0.9101 | 7.6061 | 14 | 2 | 11.8312 | 2.5575 |
| 0.8256 | 5.0 | 285 | 0.8033 | 0.68 | 0.4876 | 0.643 | 0.6432 | 0.9107 | 0.9121 | 0.9109 | 7.5013 | 14 | 2 | 11.6317 | 2.5575 |
| 0.773 | 6.0 | 342 | 0.8007 | 0.6839 | 0.4923 | 0.6546 | 0.6546 | 0.9105 | 0.9141 | 0.9117 | 7.6752 | 16 | 2 | 11.9105 | 2.8133 |
| 0.7243 | 7.0 | 399 | 0.7875 | 0.6775 | 0.4878 | 0.6457 | 0.6465 | 0.9101 | 0.9119 | 0.9104 | 7.5831 | 16 | 2 | 11.711 | 2.3018 |
| 0.6609 | 8.0 | 456 | 0.7780 | 0.6833 | 0.4876 | 0.6432 | 0.6432 | 0.9122 | 0.9136 | 0.9124 | 7.5575 | 16 | 2 | 11.6522 | 2.3018 |
| 0.6402 | 9.0 | 513 | 0.7823 | 0.6872 | 0.4871 | 0.6424 | 0.6421 | 0.9107 | 0.9143 | 0.912 | 7.7187 | 16 | 2 | 11.844 | 3.8363 |
| 0.5944 | 10.0 | 570 | 0.7878 | 0.6795 | 0.4827 | 0.6395 | 0.6387 | 0.9082 | 0.913 | 0.91 | 7.7161 | 16 | 2 | 11.9591 | 3.0691 |
| 0.5638 | 11.0 | 627 | 0.7889 | 0.6802 | 0.4805 | 0.64 | 0.6407 | 0.9111 | 0.9124 | 0.9112 | 7.6368 | 16 | 2 | 11.7749 | 2.5575 |
| 0.5474 | 12.0 | 684 | 0.7987 | 0.6736 | 0.4712 | 0.6295 | 0.6293 | 0.9082 | 0.9121 | 0.9096 | 7.6471 | 16 | 2 | 11.8389 | 2.3018 |
| 0.5249 | 13.0 | 741 | 0.7942 | 0.6859 | 0.4823 | 0.6432 | 0.643 | 0.9113 | 0.915 | 0.9126 | 7.7545 | 16 | 2 | 11.9488 | 3.3248 |
| 0.4936 | 14.0 | 798 | 0.8077 | 0.6777 | 0.4786 | 0.6369 | 0.6365 | 0.9097 | 0.9122 | 0.9104 | 7.6368 | 16 | 2 | 11.7852 | 2.3018 |
| 0.4705 | 15.0 | 855 | 0.8099 | 0.6809 | 0.4753 | 0.6388 | 0.639 | 0.9088 | 0.9123 | 0.91 | 7.6624 | 16 | 2 | 11.9156 | 2.8133 |
| 0.4558 | 16.0 | 912 | 0.8154 | 0.6813 | 0.4783 | 0.6389 | 0.6398 | 0.9092 | 0.9135 | 0.9108 | 7.7775 | 16 | 2 | 11.9335 | 3.8363 |
| 0.4352 | 17.0 | 969 | 0.8138 | 0.6897 | 0.4911 | 0.6493 | 0.6496 | 0.9121 | 0.9151 | 0.9131 | 7.6624 | 16 | 2 | 11.8747 | 2.8133 |
| 0.421 | 18.0 | 1026 | 0.8274 | 0.6902 | 0.4868 | 0.652 | 0.6517 | 0.9114 | 0.9161 | 0.9132 | 7.798 | 16 | 2 | 12.0256 | 3.3248 |
| 0.4137 | 19.0 | 1083 | 0.8238 | 0.6894 | 0.4902 | 0.6491 | 0.6494 | 0.9118 | 0.9164 | 0.9136 | 7.798 | 16 | 2 | 12.0332 | 3.0691 |
| 0.4026 | 20.0 | 1140 | 0.8385 | 0.6846 | 0.4841 | 0.6428 | 0.643 | 0.9098 | 0.9147 | 0.9117 | 7.821 | 16 | 2 | 12.0281 | 4.3478 |
| 0.3866 | 21.0 | 1197 | 0.8393 | 0.6894 | 0.4866 | 0.6469 | 0.6472 | 0.9117 | 0.9166 | 0.9136 | 7.8107 | 16 | 2 | 12.0281 | 4.3478 |
| 0.3762 | 22.0 | 1254 | 0.8501 | 0.691 | 0.4882 | 0.6484 | 0.649 | 0.9118 | 0.9175 | 0.9141 | 7.8951 | 16 | 2 | 12.133 | 4.3478 |
| 0.3592 | 23.0 | 1311 | 0.8486 | 0.6906 | 0.4834 | 0.6452 | 0.6458 | 0.9119 | 0.9166 | 0.9137 | 7.7647 | 16 | 2 | 11.9821 | 3.3248 |
| 0.3532 | 24.0 | 1368 | 0.8530 | 0.6858 | 0.4825 | 0.6425 | 0.6429 | 0.9124 | 0.9157 | 0.9135 | 7.7366 | 16 | 2 | 11.9974 | 3.0691 |
| 0.3318 | 25.0 | 1425 | 0.8625 | 0.6886 | 0.4867 | 0.6486 | 0.6486 | 0.9111 | 0.9175 | 0.9138 | 7.8414 | 16 | 2 | 12.1765 | 3.8363 |
| 0.3427 | 26.0 | 1482 | 0.8727 | 0.6879 | 0.4879 | 0.6459 | 0.6464 | 0.9118 | 0.9166 | 0.9137 | 7.7852 | 16 | 2 | 12.0614 | 3.3248 |
| 0.3245 | 27.0 | 1539 | 0.8885 | 0.6845 | 0.4808 | 0.6381 | 0.6384 | 0.9107 | 0.9152 | 0.9124 | 7.7775 | 16 | 2 | 11.9463 | 3.0691 |
| 0.3189 | 28.0 | 1596 | 0.8864 | 0.6828 | 0.4769 | 0.6392 | 0.6395 | 0.911 | 0.9137 | 0.9119 | 7.7059 | 16 | 2 | 11.8389 | 2.5575 |
| 0.3069 | 29.0 | 1653 | 0.8970 | 0.6806 | 0.4768 | 0.6374 | 0.6378 | 0.91 | 0.9132 | 0.9111 | 7.7289 | 16 | 2 | 11.9437 | 2.8133 |
| 0.3041 | 30.0 | 1710 | 0.8942 | 0.6802 | 0.4743 | 0.6354 | 0.6361 | 0.9107 | 0.9128 | 0.9113 | 7.6292 | 16 | 2 | 11.7954 | 2.8133 |
| 0.302 | 31.0 | 1767 | 0.9005 | 0.6801 | 0.4785 | 0.6373 | 0.6376 | 0.9095 | 0.9137 | 0.9111 | 7.7698 | 16 | 2 | 11.9923 | 3.3248 |
| 0.2912 | 32.0 | 1824 | 0.9060 | 0.6806 | 0.4792 | 0.6377 | 0.6374 | 0.9096 | 0.913 | 0.9107 | 7.6982 | 16 | 2 | 11.9156 | 3.3248 |
| 0.2843 | 33.0 | 1881 | 0.9129 | 0.6838 | 0.4801 | 0.6395 | 0.6394 | 0.9101 | 0.9142 | 0.9116 | 7.757 | 16 | 2 | 11.9079 | 4.3478 |
| 0.2833 | 34.0 | 1938 | 0.9175 | 0.6861 | 0.4846 | 0.6408 | 0.6413 | 0.9106 | 0.9142 | 0.9118 | 7.7494 | 16 | 2 | 11.9309 | 3.8363 |
| 0.2751 | 35.0 | 1995 | 0.9189 | 0.6886 | 0.4831 | 0.6442 | 0.6447 | 0.9121 | 0.9149 | 0.913 | 7.665 | 16 | 2 | 11.9028 | 2.5575 |
| 0.2713 | 36.0 | 2052 | 0.9234 | 0.6868 | 0.4882 | 0.6439 | 0.6437 | 0.9114 | 0.9155 | 0.9129 | 7.7903 | 16 | 2 | 12.023 | 2.8133 |
| 0.2587 | 37.0 | 2109 | 0.9345 | 0.6813 | 0.4829 | 0.6387 | 0.638 | 0.9102 | 0.914 | 0.9115 | 7.7673 | 16 | 2 | 11.9514 | 3.5806 |
| 0.2646 | 38.0 | 2166 | 0.9315 | 0.6841 | 0.4829 | 0.6387 | 0.6386 | 0.9106 | 0.9135 | 0.9115 | 7.7161 | 16 | 2 | 11.9182 | 3.5806 |
| 0.2583 | 39.0 | 2223 | 0.9359 | 0.6833 | 0.4799 | 0.6375 | 0.6379 | 0.9104 | 0.9137 | 0.9115 | 7.757 | 16 | 2 | 11.9591 | 2.5575 |
| 0.2518 | 40.0 | 2280 | 0.9392 | 0.6877 | 0.4851 | 0.6395 | 0.6403 | 0.9107 | 0.9141 | 0.9118 | 7.798 | 16 | 2 | 12.0051 | 3.3248 |
| 0.2453 | 41.0 | 2337 | 0.9420 | 0.6885 | 0.4835 | 0.6405 | 0.6412 | 0.9109 | 0.9141 | 0.912 | 7.7494 | 16 | 2 | 11.954 | 3.5806 |
| 0.251 | 42.0 | 2394 | 0.9427 | 0.6852 | 0.4798 | 0.636 | 0.6367 | 0.9108 | 0.9136 | 0.9116 | 7.7647 | 16 | 2 | 11.9488 | 3.5806 |
| 0.2495 | 43.0 | 2451 | 0.9445 | 0.6821 | 0.4792 | 0.6342 | 0.6351 | 0.9099 | 0.913 | 0.9109 | 7.7596 | 16 | 2 | 11.9565 | 3.5806 |
| 0.248 | 44.0 | 2508 | 0.9448 | 0.681 | 0.4782 | 0.6336 | 0.6342 | 0.9091 | 0.9132 | 0.9106 | 7.7928 | 16 | 2 | 12.0179 | 3.3248 |
| 0.2516 | 45.0 | 2565 | 0.9472 | 0.6839 | 0.4852 | 0.6387 | 0.6388 | 0.91 | 0.914 | 0.9114 | 7.8465 | 16 | 2 | 12.0537 | 3.8363 |
| 0.2475 | 46.0 | 2622 | 0.9523 | 0.6812 | 0.4814 | 0.6357 | 0.6361 | 0.909 | 0.9137 | 0.9108 | 7.867 | 16 | 2 | 12.0972 | 3.8363 |
| 0.241 | 47.0 | 2679 | 0.9518 | 0.6801 | 0.4809 | 0.6337 | 0.6338 | 0.909 | 0.9132 | 0.9106 | 7.8286 | 16 | 2 | 12.046 | 3.8363 |
| 0.2386 | 48.0 | 2736 | 0.9519 | 0.6801 | 0.4783 | 0.633 | 0.6332 | 0.9084 | 0.9129 | 0.9101 | 7.8363 | 16 | 2 | 12.0537 | 4.0921 |
| 0.2398 | 49.0 | 2793 | 0.9521 | 0.6816 | 0.48 | 0.6349 | 0.635 | 0.9093 | 0.9132 | 0.9107 | 7.7775 | 16 | 2 | 11.9847 | 3.3248 |
| 0.2323 | 50.0 | 2850 | 0.9523 | 0.6828 | 0.4806 | 0.6361 | 0.6363 | 0.9094 | 0.9135 | 0.9109 | 7.798 | 16 | 2 | 12.0077 | 3.3248 |
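The length-related columns (average/max/min word count and "% shortened texts with length > 12") are summary statistics over the generated shortenings. The exact evaluation code is not included in this card; the sketch below shows one plausible way such statistics could be computed, with illustrative function and variable names (`length_stats`, `max_len`) that are not taken from the original training script:

```python
def length_stats(texts, max_len=12):
    """Summary length statistics over a list of generated shortenings.

    Returns the average, max, and min whitespace-token word counts, plus the
    percentage of texts whose word count exceeds ``max_len`` (analogous to the
    "% shortened texts with length > 12" column above).
    """
    word_counts = [len(t.split()) for t in texts]
    n = len(word_counts)
    return {
        "avg_word_count": sum(word_counts) / n,
        "max_word_count": max(word_counts),
        "min_word_count": min(word_counts),
        "pct_over_max": 100.0 * sum(1 for c in word_counts if c > max_len) / n,
    }

# Toy example with two fake model outputs: 3 words and 13 words.
stats = length_stats([
    "short headline here",
    "a much longer generated shortening that exceeds the twelve word limit easily today",
])
print(stats)  # {'avg_word_count': 8.0, 'max_word_count': 13, 'min_word_count': 3, 'pct_over_max': 50.0}
```

Note that word counts based on `str.split()` differ from the "Average token count" column, which is presumably measured in tokenizer tokens rather than whitespace-separated words.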

### Framework versions