---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# meQ_model

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset.
It achieves the following results on the evaluation set (final epoch; see the full table below):
- Loss: 1.1869
- Rouge1: 0.4942
- Rouge2: 0.346
- Rougel: 0.4743
- Rougelsum: 0.474
- Gen Len: 13.61
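No usage example is included in the card yet. Below is a minimal inference sketch with the `transformers` library; the identifier `meQ_model` is a placeholder for the actual Hub repository id or local checkpoint path:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "meQ_model"  # placeholder; replace with the real Hub repo id or local path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Your input text here"
inputs = tokenizer(text, return_tensors="pt", truncation=True)
# The average generation length in the results table is roughly 13-14 tokens,
# so a small max_new_tokens budget is reasonable.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```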

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
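The specific values are not listed above. Purely as a hypothetical illustration (none of these numbers are taken from the actual run), a `Seq2SeqTrainingArguments` configuration for this kind of t5-small fine-tune might look like:

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical sketch only; the hyperparameters actually used are not recorded in this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="meQ_model",
    learning_rate=2e-5,              # placeholder value
    per_device_train_batch_size=16,  # placeholder value
    per_device_eval_batch_size=16,   # placeholder value
    num_train_epochs=50,             # matches the 50 epochs in the results table below
    evaluation_strategy="epoch",     # the table reports one evaluation per epoch
    predict_with_generate=True,      # required to compute ROUGE and Gen Len during evaluation
)
```

These arguments would then be passed to a `Seq2SeqTrainer` together with the model, the tokenized datasets, and a ROUGE-based `compute_metrics` function (see the sketch after the results table).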

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 2.835 | 1.0 | 57 | 2.0475 | 0.2345 | 0.0895 | 0.2074 | 0.2067 | 16.82 |
| 2.3806 | 2.0 | 114 | 1.7243 | 0.2854 | 0.126 | 0.2649 | 0.2667 | 15.83 |
| 2.1701 | 3.0 | 171 | 1.5659 | 0.3633 | 0.211 | 0.3445 | 0.3444 | 14.72 |
| 2.0194 | 4.0 | 228 | 1.4690 | 0.4301 | 0.2752 | 0.4115 | 0.4127 | 13.95 |
| 1.9218 | 5.0 | 285 | 1.4159 | 0.468 | 0.3125 | 0.4499 | 0.4493 | 13.26 |
| 1.8645 | 6.0 | 342 | 1.3811 | 0.487 | 0.3462 | 0.472 | 0.4715 | 13.22 |
| 1.8254 | 7.0 | 399 | 1.3494 | 0.4836 | 0.3424 | 0.4668 | 0.466 | 13.17 |
| 1.7843 | 8.0 | 456 | 1.3277 | 0.4818 | 0.3444 | 0.4676 | 0.4672 | 13.04 |
| 1.7543 | 9.0 | 513 | 1.3084 | 0.4775 | 0.3412 | 0.463 | 0.4627 | 13.15 |
| 1.7206 | 10.0 | 570 | 1.2961 | 0.476 | 0.3393 | 0.4602 | 0.4617 | 13.13 |
| 1.6977 | 11.0 | 627 | 1.2823 | 0.4762 | 0.3395 | 0.4603 | 0.462 | 13.18 |
| 1.6725 | 12.0 | 684 | 1.2701 | 0.4841 | 0.3437 | 0.4677 | 0.4694 | 13.28 |
| 1.6479 | 13.0 | 741 | 1.2649 | 0.4912 | 0.3505 | 0.4755 | 0.4778 | 13.37 |
| 1.6313 | 14.0 | 798 | 1.2546 | 0.4896 | 0.344 | 0.4724 | 0.4742 | 13.47 |
| 1.6154 | 15.0 | 855 | 1.2488 | 0.4898 | 0.3456 | 0.4738 | 0.476 | 13.48 |
| 1.5932 | 16.0 | 912 | 1.2433 | 0.4935 | 0.3506 | 0.4776 | 0.4806 | 13.47 |
| 1.5716 | 17.0 | 969 | 1.2347 | 0.4984 | 0.3529 | 0.4789 | 0.4815 | 13.46 |
| 1.5523 | 18.0 | 1026 | 1.2314 | 0.4881 | 0.3456 | 0.4713 | 0.4722 | 13.48 |
| 1.5393 | 19.0 | 1083 | 1.2277 | 0.4925 | 0.35 | 0.4754 | 0.4761 | 13.57 |
| 1.535 | 20.0 | 1140 | 1.2239 | 0.4866 | 0.3415 | 0.4693 | 0.4708 | 13.63 |
| 1.5389 | 21.0 | 1197 | 1.2178 | 0.4785 | 0.3359 | 0.463 | 0.4621 | 13.56 |
| 1.5203 | 22.0 | 1254 | 1.2132 | 0.4837 | 0.3362 | 0.4679 | 0.4682 | 13.75 |
| 1.4909 | 23.0 | 1311 | 1.2098 | 0.4877 | 0.3393 | 0.4716 | 0.4719 | 13.71 |
| 1.4957 | 24.0 | 1368 | 1.2102 | 0.4874 | 0.3393 | 0.4713 | 0.4714 | 13.66 |
| 1.4746 | 25.0 | 1425 | 1.2076 | 0.4881 | 0.3398 | 0.4725 | 0.4717 | 13.66 |
| 1.4745 | 26.0 | 1482 | 1.2041 | 0.496 | 0.3474 | 0.4799 | 0.4792 | 13.64 |
| 1.4605 | 27.0 | 1539 | 1.2040 | 0.4903 | 0.3416 | 0.4741 | 0.4733 | 13.65 |
| 1.4465 | 28.0 | 1596 | 1.2024 | 0.4961 | 0.3461 | 0.4793 | 0.4784 | 13.7 |
| 1.4398 | 29.0 | 1653 | 1.2006 | 0.4859 | 0.3385 | 0.4692 | 0.4698 | 13.65 |
| 1.4469 | 30.0 | 1710 | 1.1976 | 0.4887 | 0.3426 | 0.473 | 0.4718 | 13.69 |
| 1.4218 | 31.0 | 1767 | 1.1965 | 0.4934 | 0.3469 | 0.4778 | 0.4764 | 13.64 |
| 1.4315 | 32.0 | 1824 | 1.1966 | 0.488 | 0.3447 | 0.4726 | 0.472 | 13.51 |
| 1.4282 | 33.0 | 1881 | 1.1957 | 0.488 | 0.3447 | 0.4726 | 0.472 | 13.51 |
| 1.396 | 34.0 | 1938 | 1.1932 | 0.489 | 0.3459 | 0.4739 | 0.4729 | 13.52 |
| 1.4028 | 35.0 | 1995 | 1.1941 | 0.4892 | 0.3434 | 0.4723 | 0.4722 | 13.63 |
| 1.4068 | 36.0 | 2052 | 1.1922 | 0.4895 | 0.347 | 0.4733 | 0.4722 | 13.57 |
| 1.3831 | 37.0 | 2109 | 1.1911 | 0.4927 | 0.3451 | 0.4742 | 0.474 | 13.63 |
| 1.3781 | 38.0 | 2166 | 1.1903 | 0.4896 | 0.3434 | 0.4717 | 0.4714 | 13.57 |
| 1.3867 | 39.0 | 2223 | 1.1889 | 0.4915 | 0.3464 | 0.4736 | 0.4729 | 13.63 |
| 1.3694 | 40.0 | 2280 | 1.1893 | 0.492 | 0.3444 | 0.4728 | 0.4723 | 13.58 |
| 1.3912 | 41.0 | 2337 | 1.1891 | 0.4902 | 0.3448 | 0.4719 | 0.4713 | 13.46 |
| 1.3793 | 42.0 | 2394 | 1.1886 | 0.492 | 0.3444 | 0.4728 | 0.4723 | 13.58 |
| 1.3664 | 43.0 | 2451 | 1.1884 | 0.4907 | 0.3434 | 0.4717 | 0.4714 | 13.53 |
| 1.3787 | 44.0 | 2508 | 1.1874 | 0.4919 | 0.3442 | 0.4725 | 0.472 | 13.61 |
| 1.3692 | 45.0 | 2565 | 1.1871 | 0.4919 | 0.3442 | 0.4725 | 0.472 | 13.61 |
| 1.3732 | 46.0 | 2622 | 1.1875 | 0.492 | 0.3444 | 0.4728 | 0.4723 | 13.58 |
| 1.3752 | 47.0 | 2679 | 1.1872 | 0.4942 | 0.346 | 0.4743 | 0.474 | 13.61 |
| 1.3581 | 48.0 | 2736 | 1.1871 | 0.4942 | 0.346 | 0.4743 | 0.474 | 13.61 |
| 1.3509 | 49.0 | 2793 | 1.1869 | 0.4942 | 0.346 | 0.4743 | 0.474 | 13.61 |
| 1.3752 | 50.0 | 2850 | 1.1869 | 0.4942 | 0.346 | 0.4743 | 0.474 | 13.61 |
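The ROUGE and Gen Len columns above are the kind of metrics produced by a generation-aware `compute_metrics` function. Below is a minimal sketch, assuming the `evaluate` package (with `rouge_score`) is installed and that labels are masked with `-100` as `Seq2SeqTrainer` does by default; this is not code taken from the actual training script:

```python
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
rouge = evaluate.load("rouge")  # requires the `rouge_score` package

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # Replace the -100 used to mask the loss with the pad token id before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = rouge.compute(
        predictions=decoded_preds, references=decoded_labels, use_stemmer=True
    )
    # "Gen Len" is the average number of non-padding tokens in the generated sequences.
    gen_lens = [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in predictions]
    result["gen_len"] = float(np.mean(gen_lens))
    return {k: round(float(v), 4) for k, v in result.items()}
```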

### Framework versions