---
tags:
- generated_from_trainer
---


# text_shortening_model_v39

This model is a fine-tuned version of [facebook/bart-large-xsum](https://huggingface.co/facebook/bart-large-xsum) for text shortening. Per-epoch results on the evaluation set are reported in the [Training results](#training-results) table below.
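For orientation, here is a minimal usage sketch with the `transformers` summarization pipeline. The model identifier and the generation lengths are placeholders, since the card does not state where the checkpoint is published or how generation was configured.

```python
# Minimal usage sketch (assumes the fine-tuned checkpoint is available locally
# or on the Hugging Face Hub under the placeholder path below).
from transformers import pipeline

shortener = pipeline(
    "summarization",
    model="text_shortening_model_v39",  # placeholder: replace with the actual path or Hub repo id
)

text = (
    "The quick brown fox jumped over the lazy dog while the sun was setting "
    "behind the distant hills on a warm summer evening."
)

# BART-style seq2seq generation; max_length/min_length bound the shortened output
# and are illustrative values only.
result = shortener(text, max_length=20, min_length=4, do_sample=False)
print(result[0]["summary_text"])
```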

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
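(The concrete values are not reproduced here.) For orientation only, the sketch below shows what a `Seq2SeqTrainer` fine-tuning setup for `facebook/bart-large-xsum` might look like; every hyperparameter value in it is a placeholder assumption rather than a setting recorded for this model, and the tiny inline dataset merely stands in for the undocumented training data.

```python
# Illustrative fine-tuning sketch only: hyperparameter values are placeholders,
# not the values actually used for text_shortening_model_v39.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "facebook/bart-large-xsum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Tiny stand-in dataset; the real training/evaluation data is not described in this card.
raw = Dataset.from_dict({
    "text": ["A long, wordy sentence that should be compressed into a much shorter form."],
    "short": ["A wordy sentence to compress."],
})

def preprocess(batch):
    # Tokenize source texts and target shortenings; labels come from the targets.
    model_inputs = tokenizer(batch["text"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["short"], truncation=True, max_length=32)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v39",
    num_train_epochs=20,               # 20 epochs appear in the results table; other values are placeholders
    learning_rate=2e-5,                # placeholder
    per_device_train_batch_size=16,    # placeholder
    per_device_eval_batch_size=16,     # placeholder
    evaluation_strategy="epoch",
    save_strategy="epoch",
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,            # stand-in; a held-out split would be used in practice
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)

trainer.train()
```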

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bert precision | Bert recall | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------------:|:-----------:|:------------------:|:--------------:|:--------------:|:-------------------:|:----------------------------------:|
| 0.9582        | 1.0   | 73   | 1.4062          | 0.5229 | 0.2983 | 0.4739 | 0.4738    | 0.875          | 0.8853      | 8.9039             | 17             | 4              | 15.0811             | 9.009                              |
| 0.5598        | 2.0   | 146  | 1.4819          | 0.5053 | 0.2806 | 0.456  | 0.4561    | 0.8723         | 0.879       | 8.6486             | 14             | 5              | 14.2703             | 1.5015                             |
| 0.3791        | 3.0   | 219  | 1.7718          | 0.5174 | 0.2882 | 0.4532 | 0.4539    | 0.8705         | 0.8834      | 9.6456             | 18             | 5              | 17.7027             | 16.5165                            |
| 0.3748        | 4.0   | 292  | 2.1513          | 0.3078 | 0.1184 | 0.2773 | 0.278     | 0.8215         | 0.8336      | 9.5375             | 18             | 4              | 17.1441             | 9.9099                             |
| 0.2837        | 5.0   | 365  | 1.6757          | 0.4999 | 0.2661 | 0.4487 | 0.4489    | 0.8732         | 0.8766      | 8.3844             | 16             | 4              | 15.1892             | 6.6066                             |
| 0.1885        | 6.0   | 438  | 1.8005          | 0.4938 | 0.2619 | 0.4437 | 0.4439    | 0.8729         | 0.8763      | 8.5526             | 14             | 5              | 14.994              | 1.5015                             |
| 0.1799        | 7.0   | 511  | 1.8427          | 0.4986 | 0.2752 | 0.4455 | 0.4463    | 0.8664         | 0.8796      | 9.4384             | 20             | 5              | 15.6697             | 11.4114                            |
| 0.1638        | 8.0   | 584  | 2.0234          | 0.5206 | 0.2854 | 0.4632 | 0.4642    | 0.8774         | 0.8844      | 9.1682             | 18             | 4              | 16.2132             | 9.9099                             |
| 0.1247        | 9.0   | 657  | 1.9158          | 0.486  | 0.2628 | 0.4326 | 0.4339    | 0.8707         | 0.8758      | 8.7327             | 17             | 4              | 15.3093             | 6.6066                             |
| 0.1059        | 10.0  | 730  | 2.2355          | 0.5127 | 0.2825 | 0.4578 | 0.4577    | 0.875          | 0.8827      | 9.045              | 17             | 4              | 16.5586             | 8.7087                             |
| 0.1104        | 11.0  | 803  | 2.2555          | 0.5095 | 0.2698 | 0.4514 | 0.4511    | 0.8762         | 0.8815      | 8.7928             | 17             | 4              | 16.3123             | 8.7087                             |
| 0.1196        | 12.0  | 876  | 2.3329          | 0.507  | 0.2692 | 0.453  | 0.454     | 0.8746         | 0.8795      | 8.8228             | 15             | 5              | 16.1862             | 5.4054                             |
| 0.093         | 13.0  | 949  | 2.2657          | 0.5137 | 0.2748 | 0.4545 | 0.4543    | 0.8733         | 0.8801      | 8.7988             | 16             | 4              | 16.012              | 7.8078                             |
| 0.0626        | 14.0  | 1022 | 2.5004          | 0.5014 | 0.2677 | 0.4432 | 0.4435    | 0.8725         | 0.8775      | 8.7508             | 16             | 5              | 16.4535             | 6.9069                             |
| 0.0534        | 15.0  | 1095 | 2.4192          | 0.5031 | 0.27   | 0.4467 | 0.447     | 0.8711         | 0.8784      | 8.8438             | 19             | 4              | 16.1411             | 9.3093                             |
| 0.0475        | 16.0  | 1168 | 2.5800          | 0.4891 | 0.2553 | 0.4313 | 0.4315    | 0.8689         | 0.8753      | 8.8408             | 18             | 4              | 16.5045             | 8.7087                             |
| 0.0399        | 17.0  | 1241 | 2.6858          | 0.5021 | 0.2615 | 0.4452 | 0.445     | 0.8727         | 0.8782      | 8.7808             | 17             | 4              | 16.3844             | 7.2072                             |
| 0.0296        | 18.0  | 1314 | 2.6646          | 0.4992 | 0.2666 | 0.4466 | 0.4463    | 0.8726         | 0.8764      | 8.5706             | 17             | 4              | 16.1111             | 4.8048                             |
| 0.0286        | 19.0  | 1387 | 2.7496          | 0.5023 | 0.2648 | 0.4451 | 0.445     | 0.8721         | 0.8781      | 8.7868             | 17             | 4              | 16.3063             | 6.6066                             |
| 0.026         | 20.0  | 1460 | 2.8730          | 0.4929 | 0.2546 | 0.4351 | 0.4353    | 0.8698         | 0.8762      | 8.8348             | 17             | 4              | 16.5796             | 8.4084                             |
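As a rough sketch of how the columns above could be computed for a batch of generated shortenings: the use of the `evaluate` library, the English BERTScore setting, and the word/token counting below are assumptions, since the card does not document the evaluation code.

```python
# Sketch of computing ROUGE, BERTScore, and length statistics for generated
# shortenings against reference shortenings. Library choices are assumptions.
import evaluate
from statistics import mean
from transformers import AutoTokenizer

predictions = ["Short version of sentence one.", "Short version of sentence two."]
references = ["Reference shortening one.", "Reference shortening two."]

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

rouge_scores = rouge.compute(predictions=predictions, references=references)
bert_scores = bertscore.compute(predictions=predictions, references=references, lang="en")

# Word counts from whitespace splitting; token counts from the BART tokenizer.
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-xsum")
word_counts = [len(p.split()) for p in predictions]
token_counts = [len(tokenizer(p)["input_ids"]) for p in predictions]

metrics = {
    "rouge1": rouge_scores["rouge1"],
    "rouge2": rouge_scores["rouge2"],
    "rougeL": rouge_scores["rougeL"],
    "rougeLsum": rouge_scores["rougeLsum"],
    "bert_precision": mean(bert_scores["precision"]),
    "bert_recall": mean(bert_scores["recall"]),
    "average_word_count": mean(word_counts),
    "max_word_count": max(word_counts),
    "min_word_count": min(word_counts),
    "average_token_count": mean(token_counts),
    # Share of generated texts longer than 12 words, as a percentage.
    "pct_longer_than_12_words": 100 * sum(c > 12 for c in word_counts) / len(word_counts),
}
print(metrics)
```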

### Framework versions