---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# text_shortening_model_v6

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset.
It achieves the results shown in the Training results table below on the evaluation set.
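Since this is a T5-based summarization/shortening checkpoint, it can be loaded with the standard Transformers `pipeline` API. A minimal usage sketch follows; the checkpoint path `text_shortening_model_v6` is an assumption (substitute wherever the fine-tuned weights are saved), and the snippet falls back to the `t5-small` base model so it runs standalone:

```python
# Illustrative inference sketch for this card's model family.
# Assumption: the fine-tuned weights live at "text_shortening_model_v6";
# the base "t5-small" is used here so the example is self-contained.
from transformers import pipeline

shortener = pipeline("summarization", model="t5-small")  # swap in the checkpoint path
result = shortener(
    "The committee announced on Tuesday that the annual conference will be "
    "postponed until further notice due to scheduling conflicts.",
    max_length=18,  # eval outputs average ~11 words / ~16 tokens (see table below)
    min_length=5,
)
print(result[0]["summary_text"])
```

Generation limits here mirror the word-count statistics reported in the evaluation table, but they are illustrative defaults, not settings recorded in this card.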

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bert precision | Bert recall | Average word count | Max word count | Min word count | Average token count |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------------:|:-----------:|:------------------:|:--------------:|:--------------:|:-------------------:|
| 1.2879        | 1.0   | 4    | 1.7189          | 0.5385 | 0.3175 | 0.4882 | 0.4875    | 0.8762         | 0.886       | 11.8071            | 18             | 5              | 17.1429             |
| 1.1303        | 2.0   | 8    | 1.6107          | 0.5599 | 0.337  | 0.5115 | 0.5117    | 0.8853         | 0.8916      | 11.2071            | 18             | 4              | 16.3071             |
| 1.0984        | 3.0   | 12   | 1.5545          | 0.5828 | 0.354  | 0.5254 | 0.5252    | 0.8885         | 0.8985      | 11.5286            | 17             | 4              | 16.5714             |
| 1.052         | 4.0   | 16   | 1.4943          | 0.5841 | 0.3631 | 0.5384 | 0.5372    | 0.8917         | 0.9004      | 11.3857            | 17             | 5              | 16.6143             |
| 0.9922        | 5.0   | 20   | 1.4517          | 0.5869 | 0.3671 | 0.5437 | 0.5432    | 0.8912         | 0.9011      | 11.5429            | 17             | 5              | 16.7929             |
| 0.9524        | 6.0   | 24   | 1.4308          | 0.5807 | 0.3571 | 0.5332 | 0.5333    | 0.8883         | 0.8994      | 11.6857            | 17             | 5              | 17.0357             |
| 0.9008        | 7.0   | 28   | 1.4152          | 0.5859 | 0.3585 | 0.5333 | 0.5319    | 0.8885         | 0.8974      | 11.4857            | 17             | 5              | 16.7786             |
| 0.8787        | 8.0   | 32   | 1.4089          | 0.5868 | 0.3592 | 0.5366 | 0.5363    | 0.8901         | 0.8991      | 11.4071            | 17             | 5              | 16.8071             |
| 0.857         | 9.0   | 36   | 1.4031          | 0.5974 | 0.3747 | 0.5496 | 0.5494    | 0.892          | 0.9015      | 11.5214            | 17             | 5              | 16.95               |
| 0.8122        | 10.0  | 40   | 1.3961          | 0.5965 | 0.3716 | 0.5487 | 0.5484    | 0.8917         | 0.9031      | 11.7071            | 17             | 6              | 17.1214             |
| 0.7943        | 11.0  | 44   | 1.3922          | 0.6068 | 0.3774 | 0.5572 | 0.5566    | 0.8947         | 0.9058      | 11.5929            | 17             | 6              | 16.9857             |
| 0.7632        | 12.0  | 48   | 1.3949          | 0.6011 | 0.371  | 0.55   | 0.549     | 0.8944         | 0.9039      | 11.4214            | 16             | 5              | 16.9                |
| 0.7464        | 13.0  | 52   | 1.3949          | 0.6007 | 0.3757 | 0.5506 | 0.5492    | 0.8938         | 0.9046      | 11.4357            | 16             | 5              | 16.8714             |
| 0.7235        | 14.0  | 56   | 1.3957          | 0.6113 | 0.3814 | 0.5609 | 0.5601    | 0.8965         | 0.9078      | 11.5429            | 16             | 6              | 16.8714             |
| 0.7293        | 15.0  | 60   | 1.3988          | 0.6102 | 0.3809 | 0.5615 | 0.56      | 0.8948         | 0.9079      | 11.7               | 16             | 6              | 17.15               |
| 0.7188        | 16.0  | 64   | 1.3954          | 0.6094 | 0.381  | 0.5603 | 0.5588    | 0.8965         | 0.9062      | 11.35              | 16             | 6              | 16.8071             |
| 0.7028        | 17.0  | 68   | 1.3969          | 0.6068 | 0.3846 | 0.5581 | 0.5568    | 0.896          | 0.9052      | 11.2571            | 16             | 6              | 16.65               |
| 0.6792        | 18.0  | 72   | 1.4056          | 0.6007 | 0.3777 | 0.5519 | 0.5508    | 0.895          | 0.9048      | 11.3214            | 16             | 6              | 16.6214             |
| 0.671         | 19.0  | 76   | 1.4142          | 0.6043 | 0.3779 | 0.5549 | 0.5541    | 0.8954         | 0.9046      | 11.2429            | 15             | 6              | 16.5429             |
| 0.6644        | 20.0  | 80   | 1.4202          | 0.6009 | 0.3767 | 0.5502 | 0.5496    | 0.8955         | 0.9028      | 11.1643            | 16             | 6              | 16.3643             |
| 0.6526        | 21.0  | 84   | 1.4256          | 0.6023 | 0.374  | 0.5485 | 0.5485    | 0.8958         | 0.9032      | 11.1857            | 17             | 6              | 16.35               |
| 0.6311        | 22.0  | 88   | 1.4356          | 0.6059 | 0.3768 | 0.5492 | 0.5488    | 0.8932         | 0.9042      | 11.5               | 17             | 6              | 16.7214             |
| 0.6448        | 23.0  | 92   | 1.4432          | 0.6071 | 0.3768 | 0.5519 | 0.5518    | 0.8935         | 0.9044      | 11.5357            | 17             | 6              | 16.7643             |
| 0.6344        | 24.0  | 96   | 1.4457          | 0.6088 | 0.3823 | 0.5583 | 0.5576    | 0.8985         | 0.9052      | 11.1214            | 16             | 6              | 16.3071             |
| 0.6299        | 25.0  | 100  | 1.4522          | 0.6049 | 0.3709 | 0.5488 | 0.5484    | 0.8976         | 0.9017      | 10.9               | 16             | 6              | 15.9643             |
| 0.6193        | 26.0  | 104  | 1.4616          | 0.6045 | 0.3701 | 0.5499 | 0.5495    | 0.8959         | 0.9032      | 11.1714            | 16             | 6              | 16.35               |
| 0.6247        | 27.0  | 108  | 1.4704          | 0.5993 | 0.3719 | 0.5515 | 0.5503    | 0.8949         | 0.9041      | 11.3429            | 17             | 7              | 16.6286             |
| 0.6062        | 28.0  | 112  | 1.4760          | 0.6017 | 0.3702 | 0.5537 | 0.5526    | 0.8949         | 0.903       | 11.2929            | 17             | 6              | 16.5143             |
| 0.5921        | 29.0  | 116  | 1.4816          | 0.5994 | 0.3734 | 0.5528 | 0.552     | 0.8959         | 0.9025      | 11.1429            | 17             | 6              | 16.3429             |
| 0.5859        | 30.0  | 120  | 1.4887          | 0.6027 | 0.3724 | 0.5523 | 0.5518    | 0.8956         | 0.9034      | 11.3357            | 17             | 7              | 16.5143             |
| 0.5911        | 31.0  | 124  | 1.4958          | 0.6065 | 0.3757 | 0.5523 | 0.5519    | 0.8971         | 0.9033      | 11.1857            | 17             | 6              | 16.3643             |
| 0.5936        | 32.0  | 128  | 1.5029          | 0.6008 | 0.3745 | 0.5508 | 0.5508    | 0.8973         | 0.9015      | 10.9714            | 16             | 6              | 16.1                |
| 0.584         | 33.0  | 132  | 1.5101          | 0.6087 | 0.3801 | 0.5582 | 0.5583    | 0.8969         | 0.9038      | 11.2214            | 16             | 6              | 16.4071             |
| 0.5741        | 34.0  | 136  | 1.5157          | 0.6054 | 0.3814 | 0.5575 | 0.5576    | 0.8961         | 0.9042      | 11.2643            | 16             | 7              | 16.4786             |
| 0.5793        | 35.0  | 140  | 1.5202          | 0.6079 | 0.3866 | 0.5621 | 0.5622    | 0.8968         | 0.9057      | 11.3214            | 16             | 7              | 16.5714             |
| 0.5803        | 36.0  | 144  | 1.5221          | 0.6081 | 0.3824 | 0.5601 | 0.5602    | 0.8966         | 0.9053      | 11.3357            | 16             | 7              | 16.6214             |
| 0.5719        | 37.0  | 148  | 1.5235          | 0.6025 | 0.3802 | 0.555  | 0.5542    | 0.898          | 0.9035      | 11.1357            | 16             | 7              | 16.3214             |
| 0.5567        | 38.0  | 152  | 1.5238          | 0.5987 | 0.3763 | 0.5524 | 0.5517    | 0.8974         | 0.9024      | 11.0357            | 16             | 7              | 16.2143             |
| 0.5535        | 39.0  | 156  | 1.5264          | 0.6023 | 0.3746 | 0.5547 | 0.5539    | 0.8977         | 0.9035      | 11.1357            | 16             | 7              | 16.3                |
| 0.5507        | 40.0  | 160  | 1.5315          | 0.6039 | 0.3757 | 0.5565 | 0.5559    | 0.8979         | 0.9045      | 11.2071            | 16             | 7              | 16.4143             |
| 0.5568        | 41.0  | 164  | 1.5389          | 0.6078 | 0.3819 | 0.5589 | 0.5579    | 0.8973         | 0.9045      | 11.4               | 17             | 7              | 16.5571             |
| 0.5659        | 42.0  | 168  | 1.5444          | 0.6037 | 0.3788 | 0.5567 | 0.5558    | 0.8959         | 0.9036      | 11.4286            | 17             | 7              | 16.5714             |
| 0.561         | 43.0  | 172  | 1.5475          | 0.5965 | 0.372  | 0.5494 | 0.548     | 0.8958         | 0.9024      | 11.3357            | 17             | 7              | 16.4929             |
| 0.5535        | 44.0  | 176  | 1.5493          | 0.597  | 0.3703 | 0.5495 | 0.5485    | 0.8967         | 0.9025      | 11.2214            | 17             | 7              | 16.3786             |
| 0.5542        | 45.0  | 180  | 1.5507          | 0.6001 | 0.3706 | 0.5529 | 0.5526    | 0.897          | 0.9034      | 11.2429            | 17             | 7              | 16.4214             |
| 0.542         | 46.0  | 184  | 1.5527          | 0.6001 | 0.3706 | 0.5529 | 0.5526    | 0.897          | 0.9034      | 11.2429            | 17             | 7              | 16.4214             |
| 0.5466        | 47.0  | 188  | 1.5539          | 0.6003 | 0.3702 | 0.5529 | 0.5526    | 0.8968         | 0.9033      | 11.2571            | 17             | 7              | 16.4357             |
| 0.5478        | 48.0  | 192  | 1.5550          | 0.5997 | 0.3699 | 0.5515 | 0.5508    | 0.8969         | 0.9029      | 11.2143            | 17             | 7              | 16.3857             |
| 0.5429        | 49.0  | 196  | 1.5552          | 0.5993 | 0.3696 | 0.551  | 0.5503    | 0.8968         | 0.9029      | 11.2357            | 17             | 7              | 16.4143             |
| 0.5443        | 50.0  | 200  | 1.5555          | 0.5993 | 0.3696 | 0.551  | 0.5503    | 0.8968         | 0.9029      | 11.2357            | 17             | 7              | 16.4143             |
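The Rouge1 column above is a unigram-overlap F-measure between generated and reference texts. As a quick intuition for what those numbers mean, here is a minimal self-contained sketch of ROUGE-1 F1; note this is an illustrative re-implementation, not the `rouge_score`/`evaluate` package the Trainer actually uses (which also applies stemming and other normalization):

```python
# Minimal ROUGE-1 F1 sketch: unigram overlap between prediction and reference.
# Illustrative only; the scores in the table come from the standard ROUGE package.
from collections import Counter

def rouge1_f(prediction: str, reference: str) -> float:
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    # Clipped unigram overlap: each reference token can be matched at most
    # as many times as it occurs in the reference.
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

# All 3 predicted unigrams appear in the reference (precision 1.0),
# 3 of 6 reference unigrams are covered (recall 0.5) -> F1 = 2/3.
print(rouge1_f("the cat sat", "the cat sat on the mat"))
```

The Bert precision/recall columns are the analogous embedding-based scores from BERTScore, which match tokens by contextual similarity rather than exact overlap.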

### Framework versions