---
tags:
- generated_from_trainer
---


# kobart_8_4e-5_datav2_min30_lp5.0_temperature1.0

This model is a fine-tuned version of [gogamza/kobart-base-v2](https://huggingface.co/gogamza/kobart-base-v2) on an unspecified dataset. Evaluation results logged during training are reported in the table under "Training results" below.
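
The card does not include a usage snippet; the sketch below is a hedged example of loading the checkpoint for Korean summarization. It assumes the tokenizer/model classes of the gogamza/kobart-base-v2 base and a placeholder checkpoint path you must substitute yourself, and the generation settings (`min_length=30`, `length_penalty=5.0`) are only inferred from the model name, not documented values.

```python
# Hedged inference sketch (not part of the original card). Replace `checkpoint` with the actual
# path or Hub id of this fine-tuned model; the class choices follow gogamza/kobart-base-v2.
from transformers import PreTrainedTokenizerFast, BartForConditionalGeneration

checkpoint = "path/to/kobart_8_4e-5_datav2_min30_lp5.0_temperature1.0"  # placeholder path

tokenizer = PreTrainedTokenizerFast.from_pretrained(checkpoint)
model = BartForConditionalGeneration.from_pretrained(checkpoint)

text = "요약할 한국어 문서를 여기에 넣습니다."  # "Put the Korean document to summarize here."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)

# min_length=30 and length_penalty=5.0 mirror the "min30" / "lp5.0" hints in the model name;
# they are assumptions, not settings documented in this card.
summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,
    min_length=30,
    max_length=128,
    length_penalty=5.0,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```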

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
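
The generated list itself is not preserved in this card. As a hedged illustration only, a comparable run could be configured with `Seq2SeqTrainingArguments` roughly as below; the batch size (8) and learning rate (4e-5) are read off the model name, and the epoch count and evaluation interval are inferred from the results table. None of these are confirmed values.

```python
# Hedged configuration sketch, not the recorded hyperparameters.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="kobart_8_4e-5_datav2_min30_lp5.0_temperature1.0",
    learning_rate=4e-5,                # assumed from "4e-5" in the model name
    per_device_train_batch_size=8,     # assumed from "8" in the model name
    per_device_eval_batch_size=8,
    num_train_epochs=5,                # the results table ends near epoch 4.91
    evaluation_strategy="steps",
    eval_steps=1000,                   # the table logs an evaluation every 1000 steps
    logging_steps=1000,
    predict_with_generate=True,
)
```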

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | Rougel  | Bleu1   | Bleu2   | Bleu3   | Bleu4  | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:-------:|:-------:|:-------:|:------:|:-------:|
| 2.5571        | 0.19  | 1000  | 3.0256          | 30.6752 | 9.655   | 20.3793 | 24.9545 | 13.4562 | 7.0852  | 3.6167 | 47.2378 |
| 2.3748        | 0.38  | 2000  | 2.8633          | 33.6862 | 11.3467 | 21.6442 | 27.6602 | 15.5034 | 8.564   | 4.7708 | 52.5921 |
| 2.3327        | 0.57  | 3000  | 2.7965          | 34.1286 | 11.5936 | 22.3078 | 28.2895 | 15.9539 | 9.0344  | 5.0261 | 46.4336 |
| 2.2987        | 0.76  | 4000  | 2.7423          | 33.7844 | 11.4184 | 22.2715 | 27.9016 | 15.7678 | 8.887   | 4.9817 | 44.1305 |
| 2.2137        | 0.94  | 5000  | 2.6925          | 34.4899 | 12.4798 | 23.0933 | 28.5676 | 16.7234 | 9.854   | 5.4929 | 46.5431 |
| 2.0205        | 1.13  | 6000  | 2.6899          | 35.1651 | 12.2364 | 22.6918 | 29.561  | 16.9967 | 9.5871  | 5.4011 | 51.4126 |
| 1.9818        | 1.32  | 7000  | 2.7037          | 34.1708 | 12.01   | 22.3273 | 28.597  | 16.3676 | 9.6473  | 5.2881 | 48.0979 |
| 2.0085        | 1.51  | 8000  | 2.6568          | 35.1423 | 12.6615 | 23.3564 | 29.0896 | 16.9543 | 10.0793 | 5.8229 | 47.014  |
| 1.9972        | 1.7   | 9000  | 2.6399          | 35.3604 | 12.6992 | 23.3829 | 29.2344 | 17.0287 | 9.9469  | 5.5226 | 46.4336 |
| 1.963         | 1.89  | 10000 | 2.6225          | 34.992  | 12.3573 | 23.0134 | 29.0142 | 16.8063 | 9.6906  | 5.5045 | 51.4452 |
| 1.718         | 2.08  | 11000 | 2.6629          | 34.8932 | 12.2868 | 23.2794 | 28.7742 | 16.5584 | 9.6199  | 5.4499 | 47.5804 |
| 1.7171        | 2.27  | 12000 | 2.6648          | 35.4343 | 12.7376 | 23.4355 | 29.4051 | 17.1878 | 10.2903 | 5.824  | 46.4359 |
| 1.695         | 2.45  | 13000 | 2.6578          | 35.0225 | 12.1733 | 22.9686 | 28.8901 | 16.5961 | 9.3781  | 5.2049 | 49.0443 |
| 1.7282        | 2.64  | 14000 | 2.6435          | 33.9569 | 11.9783 | 22.9137 | 27.9425 | 16.0888 | 9.3867  | 5.3915 | 46.0886 |
| 1.7541        | 2.83  | 15000 | 2.6469          | 34.6347 | 12.1309 | 22.7496 | 28.9934 | 16.6886 | 9.7165  | 5.2098 | 49.62   |
| 1.4855        | 3.02  | 16000 | 2.7137          | 35.3936 | 12.7873 | 23.3762 | 29.4388 | 17.1262 | 10.0549 | 5.9223 | 50.0256 |
| 1.5382        | 3.21  | 17000 | 2.7161          | 35.211  | 12.7758 | 23.8604 | 29.1727 | 17.007  | 10.1639 | 6.0141 | 46.8159 |
| 1.5243        | 3.4   | 18000 | 2.7222          | 35.6339 | 12.683  | 23.5104 | 29.8071 | 17.3418 | 10.178  | 5.5185 | 49.5944 |
| 1.5265        | 3.59  | 19000 | 2.7210          | 35.4469 | 12.5754 | 23.3784 | 29.5035 | 17.1414 | 9.8427  | 5.5385 | 50.7762 |
| 1.5394        | 3.78  | 20000 | 2.7193          | 35.9595 | 12.9418 | 23.5227 | 30.0655 | 17.5487 | 10.115  | 5.6725 | 50.3357 |
| 1.5364        | 3.97  | 21000 | 2.7000          | 35.6398 | 12.9591 | 23.8267 | 29.9125 | 17.587  | 10.4197 | 5.985  | 48.4476 |
| 1.343         | 4.15  | 22000 | 2.7756          | 35.8172 | 12.7519 | 23.5584 | 29.7877 | 17.2715 | 10.219  | 5.9187 | 49.2984 |
| 1.3182        | 4.34  | 23000 | 2.7813          | 35.2382 | 12.7271 | 23.3914 | 29.5501 | 17.3306 | 10.3873 | 6.1428 | 50.8228 |
| 1.3771        | 4.53  | 24000 | 2.7716          | 35.4267 | 12.6279 | 23.3564 | 29.6336 | 17.245  | 10.2511 | 5.9128 | 51.8695 |
| 1.3522        | 4.72  | 25000 | 2.7700          | 35.8057 | 12.9656 | 23.6143 | 29.8501 | 17.475  | 10.2721 | 5.7671 | 50.6946 |
| 1.3508        | 4.91  | 26000 | 2.7690          | 35.7198 | 12.6777 | 23.5157 | 29.7798 | 17.2442 | 10.1198 | 5.5845 | 50.2914 |
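
How the metric columns were produced is not documented in this card; the sketch below shows one way the ROUGE-1/2/L and BLEU-1..4 columns could be computed with the `evaluate` library, taking BLEU-n as BLEU limited to `max_order=n` and Gen Len as the mean whitespace-token length of the predictions. These definitions are assumptions, and whitespace tokenization is a simplification for Korean.

```python
# Hedged metric sketch; the actual evaluation code behind the table is not included in this card.
import evaluate  # pip install evaluate rouge_score

rouge = evaluate.load("rouge")
bleu = evaluate.load("bleu")

predictions = ["생성된 요약 예시 문장"]   # model outputs
references = ["정답 요약 예시 문장"]      # gold summaries

# Whitespace tokenization so Korean text is not dropped by the default ROUGE tokenizer.
rouge_scores = rouge.compute(
    predictions=predictions,
    references=references,
    tokenizer=lambda text: text.split(),
)
print(rouge_scores["rouge1"], rouge_scores["rouge2"], rouge_scores["rougeL"])

# BLEU-n reported here as corpus BLEU restricted to n-grams of order <= n.
wrapped_refs = [[r] for r in references]
for n in range(1, 5):
    score = bleu.compute(predictions=predictions, references=wrapped_refs, max_order=n)
    print(f"bleu{n}:", score["bleu"])

gen_len = sum(len(p.split()) for p in predictions) / len(predictions)
print("gen_len:", gen_len)
```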

### Framework versions