---
base_model: facebook/bart-large-xsum
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# text_shortening_model_v44

This model is a fine-tuned version of [facebook/bart-large-xsum](https://huggingface.co/facebook/bart-large-xsum) on an unspecified dataset. Per-epoch results on the evaluation set are reported in the Training results table below.
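Since the card does not yet include a usage example, here is a minimal inference sketch using the 🤗 Transformers `pipeline` API. The model identifier and the generation length limits are assumptions (the repository path and the intended output length are not documented here) and should be adjusted accordingly.

```python
# Minimal inference sketch, assuming the checkpoint is available locally or on the Hub
# under "text_shortening_model_v44" (hypothetical identifier; replace with the actual path).
from transformers import pipeline

shortener = pipeline("summarization", model="text_shortening_model_v44")

text = (
    "Get free shipping on every order placed before midnight on Sunday, "
    "no coupon code required."
)
# max_length/min_length are token limits; these placeholder values roughly match the
# short outputs reported in the training results below.
result = shortener(text, max_length=16, min_length=4, do_sample=False)
print(result[0]["summary_text"])
```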

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training are not listed in this card.
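Purely as an illustration, the sketch below shows how such settings are typically declared with `Seq2SeqTrainingArguments` for the 🤗 `Seq2SeqTrainer`; every value is a placeholder assumption rather than the configuration actually used for this run, apart from the 20 epochs, which matches the training results table.

```python
# Hypothetical configuration sketch only: the real hyperparameters for this run were
# not recorded. All values below are placeholders except num_train_epochs, which
# mirrors the 20 epochs shown in the training results table.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v44",
    evaluation_strategy="epoch",        # the results table reports one validation row per epoch
    save_strategy="epoch",
    learning_rate=5e-5,                 # placeholder
    per_device_train_batch_size=16,     # placeholder
    per_device_eval_batch_size=16,      # placeholder
    weight_decay=0.01,                  # placeholder
    num_train_epochs=20,
    predict_with_generate=True,         # generate text during eval so ROUGE/BERTScore can be computed
)
```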

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | BERT precision | BERT recall | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------------:|:-----------:|:------------------:|:--------------:|:--------------:|:-------------------:|:----------------------------------:|
| 1.0083        | 1.0   | 83   | 1.4717          | 0.4904 | 0.2378 | 0.426  | 0.4266    | 0.8725         | 0.8732      | 8.5794             | 18             | 4              | 15.6164             | 6.3492                             |
| 0.5702        | 2.0   | 166  | 1.4852          | 0.4722 | 0.2421 | 0.414  | 0.4143    | 0.869          | 0.8653      | 7.9101             | 14             | 4              | 13.6455             | 1.5873                             |
| 0.4588        | 3.0   | 249  | 1.6283          | 0.5038 | 0.2733 | 0.4424 | 0.4422    | 0.8732         | 0.8794      | 9.0053             | 16             | 4              | 16.8386             | 8.9947                             |
| 0.3586        | 4.0   | 332  | 1.6017          | 0.4965 | 0.2762 | 0.4381 | 0.4383    | 0.8709         | 0.8787      | 9.2381             | 18             | 4              | 16.3042             | 12.1693                            |
| 0.2479        | 5.0   | 415  | 1.7497          | 0.4794 | 0.2613 | 0.4295 | 0.43      | 0.872          | 0.8702      | 8.3228             | 15             | 4              | 15.209              | 3.1746                             |
| 0.2296        | 6.0   | 498  | 1.8482          | 0.4935 | 0.2739 | 0.4442 | 0.4443    | 0.8737         | 0.8755      | 8.7963             | 17             | 5              | 16.2989             | 7.1429                             |
| 0.3065        | 7.0   | 581  | 1.9485          | 0.4765 | 0.2552 | 0.4213 | 0.4212    | 0.8698         | 0.8693      | 8.4683             | 17             | 5              | 15.6005             | 7.9365                             |
| 0.2598        | 8.0   | 664  | 2.1608          | 0.4871 | 0.2585 | 0.4316 | 0.4319    | 0.8707         | 0.8736      | 8.963              | 16             | 5              | 16.6481             | 9.5238                             |
| 0.2707        | 9.0   | 747  | 2.0966          | 0.4758 | 0.2603 | 0.4231 | 0.4246    | 0.8709         | 0.8717      | 8.4841             | 16             | 4              | 15.9312             | 7.1429                             |
| 0.2099        | 10.0  | 830  | 2.2721          | 0.4777 | 0.2604 | 0.4246 | 0.4246    | 0.8735         | 0.8724      | 8.4312             | 15             | 4              | 15.9471             | 5.5556                             |
| 0.1668        | 11.0  | 913  | 2.3536          | 0.4758 | 0.2541 | 0.4331 | 0.4328    | 0.8721         | 0.87        | 8.2857             | 14             | 4              | 15.7725             | 3.1746                             |
| 0.1552        | 12.0  | 996  | 2.4572          | 0.484  | 0.2562 | 0.4313 | 0.4304    | 0.8726         | 0.875       | 8.828              | 17             | 4              | 16.246              | 7.9365                             |
| 0.2141        | 13.0  | 1079 | 2.4485          | 0.4785 | 0.2631 | 0.4257 | 0.4252    | 0.8678         | 0.8736      | 9.1402             | 19             | 4              | 16.6561             | 11.3757                            |
| 0.1348        | 14.0  | 1162 | 2.5012          | 0.4821 | 0.2613 | 0.4292 | 0.4296    | 0.8706         | 0.8738      | 8.8783             | 17             | 4              | 16.5185             | 10.0529                            |
| 0.074         | 15.0  | 1245 | 2.5309          | 0.4915 | 0.2745 | 0.445  | 0.444     | 0.8764         | 0.8768      | 8.6667             | 16             | 4              | 16.2513             | 9.2593                             |
| 0.1822        | 16.0  | 1328 | 2.5735          | 0.4709 | 0.2566 | 0.4239 | 0.4232    | 0.872          | 0.8692      | 8.2063             | 15             | 3              | 15.7249             | 4.2328                             |
| 0.086         | 17.0  | 1411 | 2.8597          | 0.4831 | 0.2675 | 0.4373 | 0.4372    | 0.8722         | 0.8743      | 8.754              | 16             | 5              | 16.5476             | 8.7302                             |
| 0.0872        | 18.0  | 1494 | 2.7420          | 0.4831 | 0.2677 | 0.4367 | 0.4353    | 0.8724         | 0.873       | 8.664              | 17             | 5              | 16.3016             | 7.672                              |
| 0.1164        | 19.0  | 1577 | 2.8790          | 0.4867 | 0.269  | 0.4388 | 0.4381    | 0.8737         | 0.8755      | 8.7725             | 17             | 5              | 16.4418             | 8.9947                             |
| 0.1101        | 20.0  | 1660 | 2.8836          | 0.4921 | 0.2719 | 0.4429 | 0.4423    | 0.8746         | 0.8761      | 8.7063             | 17             | 5              | 16.2989             | 8.7302                             |
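The table reports ROUGE, BERTScore precision/recall, and output-length statistics. Below is a minimal sketch of how such numbers can be computed from generated shortenings and reference texts, assuming the `evaluate` library with its `rouge` and `bertscore` metrics; the exact evaluation code used for this card is not documented, so treat this as illustrative.

```python
# Illustrative sketch of the evaluation metrics reported above, using the `evaluate`
# library (an assumption; the exact metric setup for this card is not documented).
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

predictions = ["free shipping on sunday orders"]            # model outputs (example data)
references = ["free shipping for orders placed on sunday"]  # gold shortenings (example data)

rouge_scores = rouge.compute(predictions=predictions, references=references)
bert_scores = bertscore.compute(predictions=predictions, references=references, lang="en")

word_counts = [len(p.split()) for p in predictions]
print("ROUGE-1:", rouge_scores["rouge1"], "ROUGE-L:", rouge_scores["rougeL"])
print("BERT precision:", sum(bert_scores["precision"]) / len(bert_scores["precision"]))
print("Average word count:", sum(word_counts) / len(word_counts))
print("% texts longer than 12 words:",
      100 * sum(wc > 12 for wc in word_counts) / len(word_counts))
```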

### Framework versions