

# clinico-xlm-roberta

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unspecified dataset. It achieves the following results on the evaluation set (final epoch of the table below):

- Loss: 1.2239
- Precision: 0.4333
- Recall: 0.5984
- F1: 0.5026
- Accuracy: 0.8446
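A minimal usage sketch: the repo id `clinico-xlm-roberta` and the token-classification task are assumptions inferred from the model name and the entity-level metrics below, so adjust the hub path and the example sentence to the actual model:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Assumption: the model was trained for token classification (NER-style),
# inferred from the precision/recall/F1 metrics reported in this card.
model_id = "clinico-xlm-roberta"  # hypothetical hub path; replace with the real one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Aggregate word pieces back into whole entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(ner("Patient presents with acute myocardial infarction."))
```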

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter list was not captured when this card was generated; from the results table one can only read off 100 training epochs at 25 optimizer steps per epoch.
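A minimal `TrainingArguments` sketch consistent with those observations, assuming the standard `Trainer` setup; the learning rate, batch size, and output path are placeholder assumptions, not recorded values:

```python
from transformers import TrainingArguments

# Only num_train_epochs, per-epoch evaluation, and logging_steps are inferred
# from the results table; the remaining values are placeholder assumptions.
training_args = TrainingArguments(
    output_dir="clinico-xlm-roberta",  # hypothetical output path
    num_train_epochs=100,              # table ends at epoch 100.0 / step 2500
    evaluation_strategy="epoch",       # one eval row per epoch (25 steps each)
    logging_steps=500,                 # matches the `No log` pattern in the table
    learning_rate=2e-5,                # assumption; not recorded in this card
    per_device_train_batch_size=16,    # assumption; not recorded in this card
)
```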

### Training results

`No log` in the Training Loss column means no training loss had been logged yet at that evaluation step; the loss was logged every 500 optimizer steps, so the column updates only at steps 500, 1000, 1500, 2000, and 2500.

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 25   | 1.4128          | 0.0015    | 0.0023 | 0.0018 | 0.5371   |
| No log        | 2.0   | 50   | 1.0675          | 0.0260    | 0.0595 | 0.0362 | 0.6283   |
| No log        | 3.0   | 75   | 0.9345          | 0.0456    | 0.1213 | 0.0663 | 0.6667   |
| No log        | 4.0   | 100  | 0.8709          | 0.0707    | 0.1465 | 0.0954 | 0.7060   |
| No log        | 5.0   | 125  | 0.8154          | 0.1112    | 0.2059 | 0.1444 | 0.7399   |
| No log        | 6.0   | 150  | 0.8001          | 0.1615    | 0.3066 | 0.2116 | 0.7347   |
| No log        | 7.0   | 175  | 0.6928          | 0.2129    | 0.3616 | 0.2680 | 0.7846   |
| No log        | 8.0   | 200  | 0.6576          | 0.2210    | 0.3753 | 0.2782 | 0.7988   |
| No log        | 9.0   | 225  | 0.6174          | 0.2263    | 0.4119 | 0.2921 | 0.8120   |
| No log        | 10.0  | 250  | 0.6232          | 0.2385    | 0.4268 | 0.3060 | 0.8150   |
| No log        | 11.0  | 275  | 0.6304          | 0.2528    | 0.4577 | 0.3257 | 0.8237   |
| No log        | 12.0  | 300  | 0.6562          | 0.2758    | 0.4714 | 0.3480 | 0.8204   |
| No log        | 13.0  | 325  | 0.6725          | 0.2726    | 0.4828 | 0.3485 | 0.8164   |
| No log        | 14.0  | 350  | 0.6959          | 0.2732    | 0.4943 | 0.3519 | 0.8216   |
| No log        | 15.0  | 375  | 0.6838          | 0.2919    | 0.5046 | 0.3698 | 0.8313   |
| No log        | 16.0  | 400  | 0.7033          | 0.3199    | 0.5252 | 0.3976 | 0.8307   |
| No log        | 17.0  | 425  | 0.7198          | 0.2848    | 0.4897 | 0.3601 | 0.8094   |
| No log        | 18.0  | 450  | 0.7319          | 0.3070    | 0.5149 | 0.3846 | 0.8293   |
| No log        | 19.0  | 475  | 0.7841          | 0.3184    | 0.5275 | 0.3971 | 0.8283   |
| 0.5744        | 20.0  | 500  | 0.8119          | 0.2995    | 0.5229 | 0.3808 | 0.8204   |
| 0.5744        | 21.0  | 525  | 0.7665          | 0.2914    | 0.5069 | 0.3701 | 0.8228   |
| 0.5744        | 22.0  | 550  | 0.8008          | 0.3062    | 0.5172 | 0.3847 | 0.8201   |
| 0.5744        | 23.0  | 575  | 0.7822          | 0.3008    | 0.5217 | 0.3816 | 0.8294   |
| 0.5744        | 24.0  | 600  | 0.8432          | 0.3148    | 0.5114 | 0.3897 | 0.8191   |
| 0.5744        | 25.0  | 625  | 0.8161          | 0.3387    | 0.5309 | 0.4135 | 0.8332   |
| 0.5744        | 26.0  | 650  | 0.8405          | 0.3289    | 0.5423 | 0.4095 | 0.8275   |
| 0.5744        | 27.0  | 675  | 0.8273          | 0.3465    | 0.5435 | 0.4232 | 0.8311   |
| 0.5744        | 28.0  | 700  | 0.8920          | 0.3326    | 0.5446 | 0.4130 | 0.8309   |
| 0.5744        | 29.0  | 725  | 0.8796          | 0.3303    | 0.5400 | 0.4099 | 0.8344   |
| 0.5744        | 30.0  | 750  | 0.8918          | 0.3319    | 0.5229 | 0.4060 | 0.8246   |
| 0.5744        | 31.0  | 775  | 0.8656          | 0.3613    | 0.5618 | 0.4398 | 0.8381   |
| 0.5744        | 32.0  | 800  | 0.9315          | 0.3375    | 0.5503 | 0.4184 | 0.8260   |
| 0.5744        | 33.0  | 825  | 0.9042          | 0.3644    | 0.5686 | 0.4441 | 0.8339   |
| 0.5744        | 34.0  | 850  | 0.9060          | 0.3865    | 0.5652 | 0.4591 | 0.8387   |
| 0.5744        | 35.0  | 875  | 0.9413          | 0.4021    | 0.5778 | 0.4742 | 0.8360   |
| 0.5744        | 36.0  | 900  | 0.9608          | 0.3634    | 0.5629 | 0.4417 | 0.8337   |
| 0.5744        | 37.0  | 925  | 0.8908          | 0.3536    | 0.5526 | 0.4313 | 0.8355   |
| 0.5744        | 38.0  | 950  | 0.9339          | 0.3543    | 0.5744 | 0.4382 | 0.8360   |
| 0.5744        | 39.0  | 975  | 0.9853          | 0.3751    | 0.5721 | 0.4531 | 0.8416   |
| 0.068         | 40.0  | 1000 | 0.9807          | 0.4005    | 0.5847 | 0.4753 | 0.8352   |
| 0.068         | 41.0  | 1025 | 1.0515          | 0.3953    | 0.5641 | 0.4649 | 0.8290   |
| 0.068         | 42.0  | 1050 | 0.9588          | 0.3912    | 0.5778 | 0.4665 | 0.8400   |
| 0.068         | 43.0  | 1075 | 0.9839          | 0.3888    | 0.5858 | 0.4674 | 0.8381   |
| 0.068         | 44.0  | 1100 | 1.0556          | 0.4092    | 0.5721 | 0.4771 | 0.8341   |
| 0.068         | 45.0  | 1125 | 0.9591          | 0.4097    | 0.5892 | 0.4833 | 0.8433   |
| 0.068         | 46.0  | 1150 | 1.0339          | 0.4057    | 0.5904 | 0.4809 | 0.8337   |
| 0.068         | 47.0  | 1175 | 1.0162          | 0.3871    | 0.5904 | 0.4676 | 0.8438   |
| 0.068         | 48.0  | 1200 | 1.0642          | 0.3864    | 0.5858 | 0.4657 | 0.8348   |
| 0.068         | 49.0  | 1225 | 1.0270          | 0.4257    | 0.5904 | 0.4947 | 0.8464   |
| 0.068         | 50.0  | 1250 | 1.0872          | 0.4126    | 0.6053 | 0.4907 | 0.8390   |
| 0.068         | 51.0  | 1275 | 1.0346          | 0.4086    | 0.5904 | 0.4829 | 0.8437   |
| 0.068         | 52.0  | 1300 | 1.0785          | 0.4131    | 0.6007 | 0.4895 | 0.8389   |
| 0.068         | 53.0  | 1325 | 1.0533          | 0.4380    | 0.5984 | 0.5058 | 0.8433   |
| 0.068         | 54.0  | 1350 | 1.0574          | 0.4109    | 0.5961 | 0.4865 | 0.8430   |
| 0.068         | 55.0  | 1375 | 1.1087          | 0.4166    | 0.5973 | 0.4908 | 0.8417   |
| 0.068         | 56.0  | 1400 | 1.0861          | 0.4140    | 0.5870 | 0.4856 | 0.8398   |
| 0.068         | 57.0  | 1425 | 1.0796          | 0.4085    | 0.6053 | 0.4878 | 0.8442   |
| 0.068         | 58.0  | 1450 | 1.1179          | 0.4208    | 0.6053 | 0.4965 | 0.8383   |
| 0.068         | 59.0  | 1475 | 1.1096          | 0.3950    | 0.5915 | 0.4737 | 0.8416   |
| 0.0173        | 60.0  | 1500 | 1.0741          | 0.4518    | 0.6167 | 0.5215 | 0.8440   |
| 0.0173        | 61.0  | 1525 | 1.0957          | 0.4536    | 0.6098 | 0.5203 | 0.8423   |
| 0.0173        | 62.0  | 1550 | 1.1131          | 0.4581    | 0.5881 | 0.5150 | 0.8455   |
| 0.0173        | 63.0  | 1575 | 1.0809          | 0.4367    | 0.6156 | 0.5109 | 0.8499   |
| 0.0173        | 64.0  | 1600 | 1.1138          | 0.4439    | 0.5927 | 0.5076 | 0.8419   |
| 0.0173        | 65.0  | 1625 | 1.1543          | 0.4100    | 0.5995 | 0.4870 | 0.8394   |
| 0.0173        | 66.0  | 1650 | 1.1292          | 0.4256    | 0.6087 | 0.5009 | 0.8432   |
| 0.0173        | 67.0  | 1675 | 1.1415          | 0.4542    | 0.6064 | 0.5194 | 0.8461   |
| 0.0173        | 68.0  | 1700 | 1.1804          | 0.4300    | 0.6007 | 0.5012 | 0.8436   |
| 0.0173        | 69.0  | 1725 | 1.1676          | 0.4356    | 0.5995 | 0.5046 | 0.8437   |
| 0.0173        | 70.0  | 1750 | 1.1806          | 0.4316    | 0.5961 | 0.5007 | 0.8420   |
| 0.0173        | 71.0  | 1775 | 1.1530          | 0.4350    | 0.5973 | 0.5034 | 0.8459   |
| 0.0173        | 72.0  | 1800 | 1.1691          | 0.4344    | 0.5984 | 0.5034 | 0.8435   |
| 0.0173        | 73.0  | 1825 | 1.1869          | 0.4242    | 0.5927 | 0.4945 | 0.8410   |
| 0.0173        | 74.0  | 1850 | 1.1868          | 0.4450    | 0.5927 | 0.5083 | 0.8395   |
| 0.0173        | 75.0  | 1875 | 1.1987          | 0.4458    | 0.6064 | 0.5138 | 0.8398   |
| 0.0173        | 76.0  | 1900 | 1.1936          | 0.4396    | 0.5870 | 0.5027 | 0.8392   |
| 0.0173        | 77.0  | 1925 | 1.1882          | 0.4433    | 0.5950 | 0.5081 | 0.8414   |
| 0.0173        | 78.0  | 1950 | 1.2038          | 0.4387    | 0.5938 | 0.5046 | 0.8413   |
| 0.0173        | 79.0  | 1975 | 1.2103          | 0.4417    | 0.5984 | 0.5083 | 0.8403   |
| 0.0056        | 80.0  | 2000 | 1.2062          | 0.4259    | 0.5915 | 0.4952 | 0.8394   |
| 0.0056        | 81.0  | 2025 | 1.1871          | 0.4536    | 0.5984 | 0.5160 | 0.8425   |
| 0.0056        | 82.0  | 2050 | 1.1944          | 0.4268    | 0.6007 | 0.4990 | 0.8416   |
| 0.0056        | 83.0  | 2075 | 1.1941          | 0.4549    | 0.6007 | 0.5178 | 0.8447   |
| 0.0056        | 84.0  | 2100 | 1.2032          | 0.4553    | 0.6007 | 0.5180 | 0.8436   |
| 0.0056        | 85.0  | 2125 | 1.2096          | 0.4420    | 0.6018 | 0.5097 | 0.8414   |
| 0.0056        | 86.0  | 2150 | 1.2011          | 0.4333    | 0.6018 | 0.5038 | 0.8401   |
| 0.0056        | 87.0  | 2175 | 1.2329          | 0.4511    | 0.5961 | 0.5136 | 0.8411   |
| 0.0056        | 88.0  | 2200 | 1.2134          | 0.4523    | 0.6018 | 0.5164 | 0.8429   |
| 0.0056        | 89.0  | 2225 | 1.2281          | 0.4410    | 0.5984 | 0.5078 | 0.8426   |
| 0.0056        | 90.0  | 2250 | 1.2284          | 0.4490    | 0.6041 | 0.5151 | 0.8416   |
| 0.0056        | 91.0  | 2275 | 1.2129          | 0.4350    | 0.5973 | 0.5034 | 0.8438   |
| 0.0056        | 92.0  | 2300 | 1.2164          | 0.4387    | 0.5973 | 0.5058 | 0.8428   |
| 0.0056        | 93.0  | 2325 | 1.2177          | 0.4429    | 0.6030 | 0.5107 | 0.8433   |
| 0.0056        | 94.0  | 2350 | 1.2297          | 0.4545    | 0.6053 | 0.5191 | 0.8434   |
| 0.0056        | 95.0  | 2375 | 1.2243          | 0.4579    | 0.6030 | 0.5205 | 0.8459   |
| 0.0056        | 96.0  | 2400 | 1.2241          | 0.4478    | 0.6041 | 0.5144 | 0.8457   |
| 0.0056        | 97.0  | 2425 | 1.2286          | 0.4496    | 0.6018 | 0.5147 | 0.8434   |
| 0.0056        | 98.0  | 2450 | 1.2279          | 0.4426    | 0.5995 | 0.5092 | 0.8431   |
| 0.0056        | 99.0  | 2475 | 1.2244          | 0.4328    | 0.6007 | 0.5031 | 0.8443   |
| 0.0029        | 100.0 | 2500 | 1.2239          | 0.4333    | 0.5984 | 0.5026 | 0.8446   |
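The precision, recall, F1, and accuracy columns are characteristic of entity-level evaluation with `seqeval`, as is typical for token-classification fine-tunes. A minimal sketch of the kind of `compute_metrics` function that produces such numbers; the label list here is hypothetical, since the actual entity types are not recorded in this card:

```python
import numpy as np
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Hypothetical label set; the actual entity types are not recorded in this card.
label_list = ["O", "B-ENT", "I-ENT"]

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    predictions = np.argmax(predictions, axis=2)
    # Drop special and padding tokens, which the Trainer marks with label id -100.
    true_preds = [
        [label_list[p] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    return {
        "precision": precision_score(true_labels, true_preds),
        "recall": recall_score(true_labels, true_preds),
        "f1": f1_score(true_labels, true_preds),
        "accuracy": accuracy_score(true_labels, true_preds),
    }
```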

### Framework versions