---
tags:
- generated_from_trainer
---


# text_shortening_model_v23

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset. It achieves the following results on the evaluation set (final epoch):

- Loss: 1.5992
- Rouge1: 0.5244
- Rouge2: 0.3068
- Rougel: 0.4711
- Rougelsum: 0.4712
- Bert precision: 0.8806
- Bert recall: 0.8799
- Average word count: 9.7031
- Max word count: 15
- Min word count: 5
- Average token count: 14.5895
- % shortened texts with length > 12: 13.5371
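A minimal inference sketch with 🤗 Transformers; the repo id `text_shortening_model_v23` and the `summarize:` task prefix are assumptions (T5 checkpoints conventionally expect a task prefix), not details confirmed by this card:

```python
# Minimal inference sketch; the repo id and the "summarize:" prefix are
# assumptions, not details confirmed by this card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "text_shortening_model_v23"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "summarize: The committee met on Tuesday to discuss the new budget proposal in detail."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=20, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```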

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
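The actual hyperparameter list is not reproduced in this card. As an illustration only, a hypothetical `Seq2SeqTrainingArguments` setup consistent with the results table below (50 epochs of 100 steps each, evaluation every epoch); every other value is a placeholder, not the actual setting:

```python
# Hypothetical sketch only: the actual hyperparameters are not listed in this
# card. Epoch count and evaluation cadence are inferred from the results table;
# all other values are illustrative placeholders.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v23",
    num_train_epochs=50,            # matches the 50 epochs in the results table
    evaluation_strategy="epoch",    # one validation pass per epoch, as logged
    logging_strategy="epoch",
    save_strategy="epoch",
    learning_rate=2e-5,             # placeholder, not the actual value
    per_device_train_batch_size=8,  # placeholder, not the actual value
    predict_with_generate=True,
)
```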

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bert precision | Bert recall | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------------:|:-----------:|:------------------:|:--------------:|:--------------:|:-------------------:|:----------------------------------:|
| 2.2171 | 1.0 | 100 | 1.7694 | 0.514 | 0.2977 | 0.4697 | 0.4699 | 0.8711 | 0.8789 | 10.7598 | 17 | 3 | 15.5764 | 29.6943 |
| 1.8398 | 2.0 | 200 | 1.6351 | 0.5161 | 0.3041 | 0.4676 | 0.4683 | 0.8737 | 0.8815 | 10.655 | 17 | 4 | 15.5284 | 26.6376 |
| 1.6309 | 3.0 | 300 | 1.5497 | 0.5277 | 0.3192 | 0.4741 | 0.4749 | 0.8799 | 0.8836 | 10.179 | 17 | 6 | 15.0218 | 20.9607 |
| 1.5015 | 4.0 | 400 | 1.4933 | 0.5295 | 0.3196 | 0.4783 | 0.4787 | 0.8768 | 0.8838 | 10.5371 | 17 | 6 | 15.393 | 24.4541 |
| 1.3868 | 5.0 | 500 | 1.4555 | 0.5311 | 0.3235 | 0.4721 | 0.4726 | 0.8776 | 0.8839 | 10.5022 | 17 | 5 | 15.4323 | 24.4541 |
| 1.3147 | 6.0 | 600 | 1.4297 | 0.5312 | 0.3234 | 0.476 | 0.4768 | 0.8796 | 0.8826 | 10.0917 | 17 | 5 | 14.9563 | 19.214 |
| 1.2207 | 7.0 | 700 | 1.4147 | 0.5256 | 0.315 | 0.4747 | 0.4753 | 0.877 | 0.8837 | 10.4148 | 17 | 5 | 15.3537 | 23.1441 |
| 1.1465 | 8.0 | 800 | 1.3993 | 0.521 | 0.3112 | 0.4691 | 0.4698 | 0.8784 | 0.8819 | 10.179 | 17 | 5 | 15.0262 | 18.7773 |
| 1.1006 | 9.0 | 900 | 1.3868 | 0.5235 | 0.3122 | 0.4701 | 0.4707 | 0.8766 | 0.8819 | 10.3231 | 17 | 5 | 15.179 | 21.3974 |
| 1.0469 | 10.0 | 1000 | 1.3790 | 0.5174 | 0.3028 | 0.4644 | 0.4647 | 0.877 | 0.8811 | 10.1266 | 17 | 5 | 15.0306 | 17.0306 |
| 0.978 | 11.0 | 1100 | 1.3848 | 0.5226 | 0.3015 | 0.4697 | 0.4704 | 0.8779 | 0.8818 | 10.1528 | 17 | 5 | 15.1397 | 16.5939 |
| 0.9379 | 12.0 | 1200 | 1.3937 | 0.5129 | 0.2966 | 0.457 | 0.4575 | 0.8772 | 0.88 | 10.1048 | 17 | 6 | 14.9301 | 18.3406 |
| 0.8987 | 13.0 | 1300 | 1.3858 | 0.5203 | 0.3057 | 0.4673 | 0.4679 | 0.8798 | 0.8812 | 9.9738 | 17 | 5 | 14.8472 | 14.8472 |
| 0.8455 | 14.0 | 1400 | 1.3936 | 0.519 | 0.3028 | 0.4636 | 0.4639 | 0.8788 | 0.88 | 9.9476 | 17 | 5 | 14.8734 | 17.0306 |
| 0.8106 | 15.0 | 1500 | 1.3965 | 0.5293 | 0.3145 | 0.4771 | 0.4778 | 0.8819 | 0.8828 | 9.7773 | 17 | 5 | 14.6376 | 14.4105 |
| 0.7857 | 16.0 | 1600 | 1.4079 | 0.5239 | 0.3105 | 0.4698 | 0.4702 | 0.8792 | 0.8807 | 9.9127 | 17 | 5 | 14.8166 | 16.5939 |
| 0.7661 | 17.0 | 1700 | 1.4106 | 0.5192 | 0.3058 | 0.4657 | 0.4663 | 0.8787 | 0.8797 | 9.9214 | 17 | 5 | 14.6856 | 17.4672 |
| 0.7239 | 18.0 | 1800 | 1.4206 | 0.5226 | 0.307 | 0.4683 | 0.469 | 0.8797 | 0.8813 | 9.8646 | 17 | 5 | 14.8297 | 14.4105 |
| 0.7021 | 19.0 | 1900 | 1.4213 | 0.5183 | 0.3052 | 0.467 | 0.4669 | 0.8801 | 0.8796 | 9.6943 | 17 | 5 | 14.5066 | 11.7904 |
| 0.6752 | 20.0 | 2000 | 1.4283 | 0.5263 | 0.3102 | 0.4767 | 0.4777 | 0.8819 | 0.8815 | 9.6638 | 17 | 5 | 14.5415 | 11.7904 |
| 0.6642 | 21.0 | 2100 | 1.4261 | 0.5286 | 0.3132 | 0.4746 | 0.4753 | 0.8818 | 0.8808 | 9.607 | 17 | 5 | 14.4148 | 10.0437 |
| 0.6319 | 22.0 | 2200 | 1.4426 | 0.5343 | 0.315 | 0.4763 | 0.4765 | 0.8809 | 0.8819 | 10.0 | 17 | 5 | 14.821 | 16.1572 |
| 0.6149 | 23.0 | 2300 | 1.4537 | 0.5334 | 0.3182 | 0.4808 | 0.4807 | 0.8821 | 0.8811 | 9.6943 | 17 | 5 | 14.5066 | 13.5371 |
| 0.6063 | 24.0 | 2400 | 1.4483 | 0.528 | 0.3117 | 0.4712 | 0.4719 | 0.8808 | 0.8816 | 9.8035 | 17 | 5 | 14.607 | 15.2838 |
| 0.57 | 25.0 | 2500 | 1.4770 | 0.5234 | 0.3059 | 0.4644 | 0.4647 | 0.8814 | 0.8799 | 9.6288 | 17 | 5 | 14.3755 | 13.9738 |
| 0.5585 | 26.0 | 2600 | 1.4928 | 0.5232 | 0.3059 | 0.47 | 0.47 | 0.8795 | 0.8812 | 9.8865 | 17 | 5 | 14.6638 | 14.4105 |
| 0.5568 | 27.0 | 2700 | 1.4829 | 0.529 | 0.3059 | 0.4703 | 0.4704 | 0.8811 | 0.881 | 9.7773 | 17 | 5 | 14.5459 | 14.4105 |
| 0.5404 | 28.0 | 2800 | 1.5009 | 0.5196 | 0.3028 | 0.4664 | 0.4666 | 0.8788 | 0.8789 | 9.7598 | 15 | 5 | 14.6419 | 13.9738 |
| 0.5253 | 29.0 | 2900 | 1.5142 | 0.5168 | 0.2952 | 0.4614 | 0.4617 | 0.8797 | 0.8778 | 9.5502 | 15 | 5 | 14.262 | 12.2271 |
| 0.5176 | 30.0 | 3000 | 1.5150 | 0.523 | 0.3035 | 0.4658 | 0.4659 | 0.8788 | 0.881 | 10.0393 | 17 | 5 | 14.7904 | 19.214 |
| 0.5002 | 31.0 | 3100 | 1.5348 | 0.5291 | 0.3074 | 0.471 | 0.4713 | 0.8791 | 0.882 | 10.0262 | 17 | 5 | 14.8559 | 19.214 |
| 0.4944 | 32.0 | 3200 | 1.5343 | 0.5183 | 0.3028 | 0.4674 | 0.468 | 0.8798 | 0.8791 | 9.69 | 17 | 5 | 14.4279 | 13.9738 |
| 0.493 | 33.0 | 3300 | 1.5319 | 0.5245 | 0.3027 | 0.4685 | 0.4686 | 0.88 | 0.8803 | 9.7948 | 17 | 5 | 14.6594 | 14.4105 |
| 0.4617 | 34.0 | 3400 | 1.5453 | 0.5258 | 0.3052 | 0.4685 | 0.4691 | 0.8807 | 0.8815 | 9.7598 | 17 | 5 | 14.6026 | 13.1004 |
| 0.4642 | 35.0 | 3500 | 1.5520 | 0.532 | 0.3119 | 0.478 | 0.4785 | 0.8821 | 0.8825 | 9.8035 | 17 | 5 | 14.6157 | 15.2838 |
| 0.4559 | 36.0 | 3600 | 1.5570 | 0.5239 | 0.3109 | 0.4694 | 0.4703 | 0.8801 | 0.8815 | 9.8079 | 17 | 5 | 14.7205 | 13.9738 |
| 0.4435 | 37.0 | 3700 | 1.5606 | 0.5222 | 0.3058 | 0.4666 | 0.467 | 0.8792 | 0.8799 | 9.7729 | 17 | 5 | 14.6288 | 14.4105 |
| 0.4423 | 38.0 | 3800 | 1.5744 | 0.524 | 0.3089 | 0.4682 | 0.4687 | 0.881 | 0.88 | 9.7162 | 15 | 5 | 14.4803 | 13.9738 |
| 0.4399 | 39.0 | 3900 | 1.5732 | 0.5245 | 0.3127 | 0.4718 | 0.4721 | 0.8802 | 0.881 | 9.7729 | 15 | 5 | 14.6681 | 13.9738 |
| 0.4265 | 40.0 | 4000 | 1.5692 | 0.5306 | 0.3192 | 0.4784 | 0.4789 | 0.8831 | 0.8816 | 9.607 | 15 | 5 | 14.4061 | 11.7904 |
| 0.435 | 41.0 | 4100 | 1.5752 | 0.526 | 0.31 | 0.4734 | 0.474 | 0.8819 | 0.8803 | 9.6245 | 15 | 5 | 14.476 | 12.6638 |
| 0.414 | 42.0 | 4200 | 1.5803 | 0.5249 | 0.3091 | 0.4707 | 0.47 | 0.8813 | 0.8795 | 9.5939 | 15 | 5 | 14.4061 | 12.6638 |
| 0.4161 | 43.0 | 4300 | 1.5888 | 0.5237 | 0.3045 | 0.4685 | 0.4676 | 0.8808 | 0.8799 | 9.6638 | 15 | 5 | 14.5153 | 12.2271 |
| 0.3968 | 44.0 | 4400 | 1.5946 | 0.5214 | 0.3049 | 0.4677 | 0.4676 | 0.8801 | 0.8803 | 9.7511 | 15 | 5 | 14.6376 | 13.1004 |
| 0.405 | 45.0 | 4500 | 1.5967 | 0.5234 | 0.3066 | 0.4692 | 0.4692 | 0.8808 | 0.8808 | 9.7598 | 15 | 5 | 14.6026 | 13.1004 |
| 0.4063 | 46.0 | 4600 | 1.5984 | 0.5238 | 0.3077 | 0.47 | 0.4703 | 0.8807 | 0.8809 | 9.8297 | 15 | 5 | 14.7031 | 15.2838 |
| 0.4006 | 47.0 | 4700 | 1.5971 | 0.5231 | 0.3082 | 0.4702 | 0.4697 | 0.8807 | 0.8804 | 9.7118 | 15 | 5 | 14.607 | 13.9738 |
| 0.4045 | 48.0 | 4800 | 1.5988 | 0.5232 | 0.3054 | 0.4707 | 0.4707 | 0.881 | 0.8803 | 9.6812 | 15 | 5 | 14.5721 | 13.5371 |
| 0.397 | 49.0 | 4900 | 1.5991 | 0.5244 | 0.3068 | 0.471 | 0.4711 | 0.8806 | 0.8799 | 9.7031 | 15 | 5 | 14.5983 | 13.5371 |
| 0.3963 | 50.0 | 5000 | 1.5992 | 0.5244 | 0.3068 | 0.4711 | 0.4712 | 0.8806 | 0.8799 | 9.7031 | 15 | 5 | 14.5895 | 13.5371 |
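The ROUGE, BERTScore, and length columns above can be reproduced with the Hugging Face `evaluate` library. A minimal sketch follows; the exact `compute_metrics` function used during training is not shown in this card, so treat it as an approximation rather than the author's code:

```python
# Minimal sketch of computing the table's quality metrics with the `evaluate`
# library; this approximates, but is not, the card author's metric code.
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

predictions = ["short version of the text"]    # model outputs (placeholder)
references = ["the reference shortened text"]  # gold shortenings (placeholder)

rouge_scores = rouge.compute(predictions=predictions, references=references)
bert_scores = bertscore.compute(
    predictions=predictions, references=references, lang="en"
)

print(rouge_scores)  # rouge1, rouge2, rougeL, rougeLsum
print(sum(bert_scores["precision"]) / len(bert_scores["precision"]))
print(sum(bert_scores["recall"]) / len(bert_scores["recall"]))

# Length statistics analogous to the word-count columns in the table
word_counts = [len(p.split()) for p in predictions]
print(sum(word_counts) / len(word_counts), max(word_counts), min(word_counts))
print(100 * sum(c > 12 for c in word_counts) / len(word_counts))  # % over 12 words
```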

### Framework versions