
# text_shortening_model_v58

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset. It achieves the following results on the evaluation set (final epoch):

- Loss: 0.7969
- Rouge1: 0.6672
- Rouge2: 0.4657
- Rougel: 0.6067
- Rougelsum: 0.6067
- Bert precision: 0.9113
- Bert recall: 0.9013
- Bert f1-score: 0.9059
- Average word count: 8.058
- Max word count: 16
- Min word count: 3
- Average token count: 12.3438
- % shortened texts with length > 12: 4.4643
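For reference, a minimal inference sketch with 🤗 Transformers, assuming the checkpoint is available as `text_shortening_model_v58` (the published hub ID and any task prefix used during fine-tuning are not recorded in this card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed local path or hub ID; the published repository name is not recorded here.
model_id = "text_shortening_model_v58"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Any task prefix used during fine-tuning (e.g. "summarize: ") is unknown,
# so the raw input is passed as-is.
text = "The quick brown fox jumped over the extremely lazy dog lying in the sun."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# max_new_tokens=20 is an illustrative cap, consistent with the short outputs
# (average ~8 words) reported in the results below.
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```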

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameters used during training were not preserved in this card (more information needed).
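The sketch below shows how such a configuration is typically expressed with `Seq2SeqTrainingArguments`; only `num_train_epochs=50` and per-epoch evaluation are inferable from the results table, and every other value is a placeholder, not the one actually used:

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative sketch only: apart from num_train_epochs (50 epochs appear in
# the results table below) and per-epoch evaluation, these values are common
# defaults for fine-tuning t5-small, not the recorded hyperparameters.
training_args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v58",
    num_train_epochs=50,
    evaluation_strategy="epoch",    # metrics in the table are logged once per epoch
    learning_rate=2e-5,             # placeholder value
    per_device_train_batch_size=8,  # placeholder value
    per_device_eval_batch_size=8,   # placeholder value
    predict_with_generate=True,     # required to compute ROUGE/BERTScore on generated text
)
```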

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bert precision | Bert recall | Bert f1-score | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------------:|:-----------:|:-------------:|:------------------:|:--------------:|:--------------:|:-------------------:|:----------------------------------:|
| 1.4028 | 1.0 | 49 | 1.9552 | 0.313 | 0.162 | 0.275 | 0.2756 | 0.7453 | 0.7745 | 0.7581 | 9.7455 | 18 | 0 | 16.3661 | 25.4464 |
| 1.1908 | 2.0 | 98 | 1.6121 | 0.2964 | 0.1484 | 0.2626 | 0.2629 | 0.7359 | 0.78 | 0.7563 | 9.9464 | 18 | 0 | 17.0268 | 21.875 |
| 1.0226 | 3.0 | 147 | 1.3961 | 0.4308 | 0.2552 | 0.388 | 0.3874 | 0.8166 | 0.8275 | 0.8206 | 9.1339 | 18 | 0 | 14.9375 | 23.6607 |
| 0.9468 | 4.0 | 196 | 1.2672 | 0.5074 | 0.3277 | 0.4695 | 0.4688 | 0.8525 | 0.8443 | 0.8474 | 8.3482 | 18 | 0 | 13.5938 | 15.625 |
| 0.9222 | 5.0 | 245 | 1.1779 | 0.5567 | 0.3752 | 0.5141 | 0.5136 | 0.8764 | 0.8648 | 0.8698 | 8.192 | 18 | 0 | 12.9643 | 12.0536 |
| 0.9229 | 6.0 | 294 | 1.1157 | 0.5911 | 0.4085 | 0.5432 | 0.5422 | 0.8855 | 0.8739 | 0.8791 | 8.0089 | 18 | 0 | 12.75 | 7.5893 |
| 0.8773 | 7.0 | 343 | 1.0715 | 0.6099 | 0.4151 | 0.5546 | 0.5537 | 0.8969 | 0.8844 | 0.8901 | 7.9464 | 18 | 0 | 12.5312 | 8.4821 |
| 0.8911 | 8.0 | 392 | 1.0372 | 0.62 | 0.4184 | 0.5617 | 0.5606 | 0.9016 | 0.8905 | 0.8956 | 8.1652 | 18 | 3 | 12.6429 | 8.9286 |
| 0.8681 | 9.0 | 441 | 1.0105 | 0.6275 | 0.4279 | 0.571 | 0.5693 | 0.904 | 0.8932 | 0.8981 | 8.2723 | 18 | 3 | 12.5982 | 9.375 |
| 0.8661 | 10.0 | 490 | 0.9883 | 0.6266 | 0.4226 | 0.5687 | 0.5675 | 0.9038 | 0.8931 | 0.898 | 8.317 | 18 | 3 | 12.6161 | 10.7143 |
| 0.8606 | 11.0 | 539 | 0.9717 | 0.629 | 0.4283 | 0.5717 | 0.5702 | 0.9052 | 0.8934 | 0.8988 | 8.1875 | 17 | 3 | 12.4509 | 8.4821 |
| 0.8701 | 12.0 | 588 | 0.9535 | 0.635 | 0.436 | 0.581 | 0.5799 | 0.9081 | 0.8941 | 0.9006 | 7.9062 | 15 | 3 | 12.183 | 6.25 |
| 0.8449 | 13.0 | 637 | 0.9394 | 0.6381 | 0.4373 | 0.5846 | 0.5831 | 0.9088 | 0.8955 | 0.9016 | 7.9196 | 15 | 3 | 12.1696 | 5.3571 |
| 0.8328 | 14.0 | 686 | 0.9270 | 0.6405 | 0.4455 | 0.5868 | 0.586 | 0.9083 | 0.8959 | 0.9016 | 7.9554 | 15 | 3 | 12.183 | 5.3571 |
| 0.8448 | 15.0 | 735 | 0.9135 | 0.6449 | 0.4548 | 0.594 | 0.5926 | 0.909 | 0.8986 | 0.9033 | 8.0625 | 16 | 3 | 12.3616 | 5.8036 |
| 0.8107 | 16.0 | 784 | 0.9028 | 0.6435 | 0.4484 | 0.5876 | 0.5868 | 0.9092 | 0.8979 | 0.9031 | 7.9911 | 15 | 3 | 12.25 | 4.9107 |
| 0.831 | 17.0 | 833 | 0.8949 | 0.6458 | 0.4525 | 0.59 | 0.5887 | 0.9095 | 0.8989 | 0.9037 | 8.0491 | 15 | 3 | 12.308 | 5.3571 |
| 0.8324 | 18.0 | 882 | 0.8849 | 0.6477 | 0.4495 | 0.5888 | 0.5874 | 0.9103 | 0.8989 | 0.9041 | 8.0491 | 15 | 3 | 12.3259 | 5.3571 |
| 0.8404 | 19.0 | 931 | 0.8783 | 0.6522 | 0.4531 | 0.5915 | 0.5906 | 0.9109 | 0.8996 | 0.9048 | 8.0938 | 15 | 3 | 12.3795 | 5.8036 |
| 0.8152 | 20.0 | 980 | 0.8694 | 0.6523 | 0.4545 | 0.5926 | 0.5921 | 0.9119 | 0.8996 | 0.9053 | 7.9821 | 15 | 3 | 12.2321 | 4.9107 |
| 0.802 | 21.0 | 1029 | 0.8654 | 0.6559 | 0.4572 | 0.5954 | 0.5951 | 0.9117 | 0.9002 | 0.9055 | 8.0223 | 15 | 3 | 12.2455 | 4.9107 |
| 0.8094 | 22.0 | 1078 | 0.8579 | 0.659 | 0.4557 | 0.5984 | 0.5982 | 0.9123 | 0.9012 | 0.9063 | 8.0536 | 15 | 3 | 12.3393 | 5.3571 |
| 0.7734 | 23.0 | 1127 | 0.8541 | 0.6576 | 0.4564 | 0.5971 | 0.597 | 0.9116 | 0.9015 | 0.9061 | 8.0848 | 15 | 3 | 12.3705 | 4.9107 |
| 0.775 | 24.0 | 1176 | 0.8490 | 0.661 | 0.4586 | 0.5999 | 0.5993 | 0.912 | 0.9019 | 0.9065 | 8.0759 | 15 | 3 | 12.3125 | 4.9107 |
| 0.7897 | 25.0 | 1225 | 0.8448 | 0.66 | 0.457 | 0.6007 | 0.5997 | 0.9126 | 0.9011 | 0.9064 | 8.0357 | 15 | 3 | 12.2902 | 4.4643 |
| 0.7817 | 26.0 | 1274 | 0.8409 | 0.6584 | 0.4557 | 0.5987 | 0.5982 | 0.9122 | 0.9006 | 0.906 | 7.9955 | 15 | 3 | 12.25 | 4.4643 |
| 0.7839 | 27.0 | 1323 | 0.8362 | 0.6612 | 0.4595 | 0.6015 | 0.601 | 0.9128 | 0.901 | 0.9065 | 7.9911 | 15 | 3 | 12.2545 | 4.4643 |
| 0.7964 | 28.0 | 1372 | 0.8317 | 0.6611 | 0.465 | 0.6048 | 0.604 | 0.9128 | 0.9018 | 0.9069 | 8.067 | 15 | 3 | 12.3393 | 4.4643 |
| 0.7634 | 29.0 | 1421 | 0.8282 | 0.6632 | 0.466 | 0.6052 | 0.6045 | 0.9133 | 0.9022 | 0.9074 | 8.0714 | 16 | 3 | 12.3438 | 4.4643 |
| 0.7939 | 30.0 | 1470 | 0.8250 | 0.6605 | 0.4617 | 0.6025 | 0.6019 | 0.913 | 0.9022 | 0.9072 | 8.0446 | 16 | 3 | 12.3482 | 4.9107 |
| 0.776 | 31.0 | 1519 | 0.8209 | 0.6645 | 0.4668 | 0.6073 | 0.6065 | 0.9133 | 0.9029 | 0.9077 | 8.0938 | 16 | 3 | 12.4062 | 5.8036 |
| 0.7511 | 32.0 | 1568 | 0.8192 | 0.6636 | 0.4652 | 0.6068 | 0.606 | 0.9128 | 0.9029 | 0.9074 | 8.1071 | 16 | 3 | 12.4152 | 6.25 |
| 0.7523 | 33.0 | 1617 | 0.8165 | 0.6638 | 0.4658 | 0.6067 | 0.6063 | 0.9126 | 0.9029 | 0.9073 | 8.1205 | 16 | 3 | 12.4286 | 6.25 |
| 0.7534 | 34.0 | 1666 | 0.8142 | 0.664 | 0.4684 | 0.6087 | 0.6079 | 0.9122 | 0.903 | 0.9072 | 8.1071 | 15 | 3 | 12.4196 | 6.25 |
| 0.7578 | 35.0 | 1715 | 0.8118 | 0.6621 | 0.4633 | 0.6039 | 0.6033 | 0.9117 | 0.9011 | 0.906 | 8.0759 | 15 | 3 | 12.3571 | 5.8036 |
| 0.7687 | 36.0 | 1764 | 0.8094 | 0.6615 | 0.4612 | 0.6035 | 0.6026 | 0.9116 | 0.9008 | 0.9058 | 8.0625 | 15 | 3 | 12.3304 | 5.8036 |
| 0.7423 | 37.0 | 1813 | 0.8075 | 0.6607 | 0.4605 | 0.6028 | 0.6022 | 0.9114 | 0.9009 | 0.9057 | 8.0714 | 15 | 3 | 12.3482 | 5.8036 |
| 0.766 | 38.0 | 1862 | 0.8056 | 0.6593 | 0.4591 | 0.6027 | 0.6021 | 0.9111 | 0.9008 | 0.9055 | 8.0848 | 15 | 3 | 12.3705 | 6.25 |
| 0.7422 | 39.0 | 1911 | 0.8044 | 0.6616 | 0.4605 | 0.6021 | 0.6014 | 0.9109 | 0.901 | 0.9055 | 8.0893 | 16 | 3 | 12.3795 | 5.8036 |
| 0.754 | 40.0 | 1960 | 0.8029 | 0.6629 | 0.4595 | 0.6016 | 0.6012 | 0.9111 | 0.9009 | 0.9055 | 8.0446 | 16 | 3 | 12.3259 | 5.3571 |
| 0.7326 | 41.0 | 2009 | 0.8017 | 0.6637 | 0.4602 | 0.6024 | 0.6018 | 0.911 | 0.9011 | 0.9056 | 8.0625 | 16 | 3 | 12.3482 | 5.3571 |
| 0.7847 | 42.0 | 2058 | 0.8008 | 0.6637 | 0.4602 | 0.6024 | 0.6018 | 0.911 | 0.9011 | 0.9056 | 8.0625 | 16 | 3 | 12.3482 | 5.3571 |
| 0.7426 | 43.0 | 2107 | 0.7997 | 0.664 | 0.4604 | 0.603 | 0.6023 | 0.911 | 0.901 | 0.9055 | 8.0536 | 16 | 3 | 12.3393 | 4.9107 |
| 0.7476 | 44.0 | 2156 | 0.7990 | 0.6666 | 0.4628 | 0.6057 | 0.6051 | 0.9115 | 0.9014 | 0.906 | 8.0357 | 16 | 3 | 12.317 | 4.4643 |
| 0.752 | 45.0 | 2205 | 0.7983 | 0.6666 | 0.4629 | 0.6057 | 0.6053 | 0.9116 | 0.9014 | 0.906 | 8.0312 | 16 | 3 | 12.3125 | 4.4643 |
| 0.7256 | 46.0 | 2254 | 0.7979 | 0.6661 | 0.4623 | 0.6049 | 0.6047 | 0.9115 | 0.901 | 0.9058 | 8.0089 | 16 | 3 | 12.2902 | 4.4643 |
| 0.752 | 47.0 | 2303 | 0.7974 | 0.6642 | 0.4623 | 0.6044 | 0.604 | 0.9111 | 0.9008 | 0.9055 | 8.0312 | 16 | 3 | 12.317 | 4.4643 |
| 0.7503 | 48.0 | 2352 | 0.7971 | 0.6672 | 0.4657 | 0.6067 | 0.6067 | 0.9113 | 0.9013 | 0.9059 | 8.058 | 16 | 3 | 12.3438 | 4.4643 |
| 0.7515 | 49.0 | 2401 | 0.7970 | 0.6672 | 0.4657 | 0.6067 | 0.6067 | 0.9113 | 0.9013 | 0.9059 | 8.058 | 16 | 3 | 12.3438 | 4.4643 |
| 0.7312 | 50.0 | 2450 | 0.7969 | 0.6672 | 0.4657 | 0.6067 | 0.6067 | 0.9113 | 0.9013 | 0.9059 | 8.058 | 16 | 3 | 12.3438 | 4.4643 |
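
The ROUGE and BERTScore columns above can be reproduced with the 🤗 `evaluate` library. A minimal sketch, assuming `predictions` and `references` are lists of decoded strings (the BERTScore model variant and the exact length definition used for the table are not recorded):

```python
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

# Placeholder data; in practice these are the decoded model outputs and targets.
predictions = ["short version one", "short version two"]
references = ["the reference shortening one", "the reference shortening two"]

# rouge1 / rouge2 / rougeL / rougeLsum F-measures, as reported in the table.
rouge_scores = rouge.compute(predictions=predictions, references=references)

# BERTScore returns per-example precision/recall/f1 lists; the table reports means.
bs = bertscore.compute(predictions=predictions, references=references, lang="en")
bert_precision = sum(bs["precision"]) / len(predictions)
bert_recall = sum(bs["recall"]) / len(predictions)
bert_f1 = sum(bs["f1"]) / len(predictions)

# Length statistics; "length" is assumed here to mean whitespace-separated words.
word_counts = [len(p.split()) for p in predictions]
avg_word_count = sum(word_counts) / len(word_counts)
pct_longer_than_12 = 100 * sum(c > 12 for c in word_counts) / len(word_counts)
```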

### Framework versions