---
tags:
- generated_from_trainer
---

# clinico-xlm-roberta-large-finetuned

This model is a fine-tuned version of [joheras/xlm-roberta-large-finetuned-clinais](https://huggingface.co/joheras/xlm-roberta-large-finetuned-clinais) on an unspecified dataset. It achieves the following results on the evaluation set (final epoch of the results table below):
- Loss: 1.3788
- Precision: 0.5418
- Recall: 0.6453
- F1: 0.5890
- Accuracy: 0.8708
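Since usage is not yet documented, here is a minimal inference sketch. It assumes the checkpoint carries a token-classification head (suggested by the precision/recall/F1/accuracy metrics reported below) and is published under the hypothetical Hub id `joheras/clinico-xlm-roberta-large-finetuned`; adjust both if they differ.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Hypothetical Hub repo id; replace with the actual location of this checkpoint.
model_id = "joheras/clinico-xlm-roberta-large-finetuned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges sub-word pieces into word-level entity spans.
ner = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")
print(ner("El paciente presenta dolor abdominal agudo."))
```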

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameters used during training were not recorded in this card; a hedged reconstruction is sketched below.
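A minimal sketch of how such a run is typically launched with the `Trainer` API. Only `num_train_epochs` (100) and the once-per-epoch evaluation cadence are grounded in the results table; every other value is a placeholder, not the actual configuration.

```python
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="clinico-xlm-roberta-large-finetuned",
    num_train_epochs=100,           # matches the 100 epochs logged below
    evaluation_strategy="epoch",    # the table shows one validation pass per epoch
    learning_rate=2e-5,             # placeholder: actual value not recorded
    per_device_train_batch_size=8,  # placeholder: actual value not recorded
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```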

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 49 | 0.5993 | 0.2242 | 0.4577 | 0.3010 | 0.8168 |
| No log | 2.0 | 98 | 0.5762 | 0.2365 | 0.4668 | 0.3140 | 0.8280 |
| No log | 3.0 | 147 | 0.5627 | 0.2460 | 0.4989 | 0.3296 | 0.8326 |
| No log | 4.0 | 196 | 0.5260 | 0.3029 | 0.5435 | 0.3890 | 0.8534 |
| No log | 5.0 | 245 | 0.5838 | 0.3058 | 0.5343 | 0.3890 | 0.8559 |
| No log | 6.0 | 294 | 0.6085 | 0.3369 | 0.5686 | 0.4232 | 0.8545 |
| No log | 7.0 | 343 | 0.6481 | 0.3532 | 0.5824 | 0.4397 | 0.8591 |
| No log | 8.0 | 392 | 0.6809 | 0.3523 | 0.5744 | 0.4367 | 0.8580 |
| No log | 9.0 | 441 | 0.8040 | 0.3864 | 0.5778 | 0.4631 | 0.8568 |
| No log | 10.0 | 490 | 0.7505 | 0.3775 | 0.5870 | 0.4595 | 0.8635 |
| 0.3143 | 11.0 | 539 | 0.8028 | 0.4350 | 0.6007 | 0.5046 | 0.8610 |
| 0.3143 | 12.0 | 588 | 0.8103 | 0.4253 | 0.6190 | 0.5042 | 0.8672 |
| 0.3143 | 13.0 | 637 | 0.8302 | 0.4506 | 0.6362 | 0.5275 | 0.8637 |
| 0.3143 | 14.0 | 686 | 0.9385 | 0.4581 | 0.6247 | 0.5286 | 0.8583 |
| 0.3143 | 15.0 | 735 | 0.9407 | 0.4304 | 0.6156 | 0.5066 | 0.8639 |
| 0.3143 | 16.0 | 784 | 0.9105 | 0.4421 | 0.6201 | 0.5162 | 0.8679 |
| 0.3143 | 17.0 | 833 | 0.9616 | 0.4691 | 0.6247 | 0.5358 | 0.8654 |
| 0.3143 | 18.0 | 882 | 0.9695 | 0.4799 | 0.6281 | 0.5441 | 0.8680 |
| 0.3143 | 19.0 | 931 | 1.0195 | 0.4996 | 0.6396 | 0.5610 | 0.8735 |
| 0.3143 | 20.0 | 980 | 1.0073 | 0.4670 | 0.6224 | 0.5336 | 0.8642 |
| 0.0259 | 21.0 | 1029 | 1.0354 | 0.4783 | 0.6316 | 0.5444 | 0.8673 |
| 0.0259 | 22.0 | 1078 | 1.1327 | 0.5258 | 0.6419 | 0.5781 | 0.8646 |
| 0.0259 | 23.0 | 1127 | 1.0605 | 0.5055 | 0.6281 | 0.5602 | 0.8668 |
| 0.0259 | 24.0 | 1176 | 1.0120 | 0.5158 | 0.6350 | 0.5692 | 0.8657 |
| 0.0259 | 25.0 | 1225 | 1.0205 | 0.4920 | 0.6339 | 0.5540 | 0.8729 |
| 0.0259 | 26.0 | 1274 | 1.0583 | 0.4995 | 0.6259 | 0.5556 | 0.8688 |
| 0.0259 | 27.0 | 1323 | 1.1157 | 0.5066 | 0.6545 | 0.5711 | 0.8698 |
| 0.0259 | 28.0 | 1372 | 1.1049 | 0.5048 | 0.6568 | 0.5709 | 0.8694 |
| 0.0259 | 29.0 | 1421 | 1.1167 | 0.4978 | 0.6487 | 0.5633 | 0.8685 |
| 0.0259 | 30.0 | 1470 | 1.1614 | 0.5000 | 0.6625 | 0.5699 | 0.8644 |
| 0.0062 | 31.0 | 1519 | 1.1521 | 0.4991 | 0.6453 | 0.5629 | 0.8647 |
| 0.0062 | 32.0 | 1568 | 1.1951 | 0.4938 | 0.6419 | 0.5582 | 0.8661 |
| 0.0062 | 33.0 | 1617 | 1.2044 | 0.4815 | 0.6419 | 0.5503 | 0.8676 |
| 0.0062 | 34.0 | 1666 | 1.1952 | 0.5242 | 0.6556 | 0.5826 | 0.8712 |
| 0.0062 | 35.0 | 1715 | 1.1598 | 0.5283 | 0.6625 | 0.5878 | 0.8768 |
| 0.0062 | 36.0 | 1764 | 1.1716 | 0.5221 | 0.6613 | 0.5835 | 0.8720 |
| 0.0062 | 37.0 | 1813 | 1.2127 | 0.5236 | 0.6465 | 0.5786 | 0.8707 |
| 0.0062 | 38.0 | 1862 | 1.2747 | 0.5259 | 0.6499 | 0.5814 | 0.8692 |
| 0.0062 | 39.0 | 1911 | 1.2397 | 0.5363 | 0.6590 | 0.5914 | 0.8676 |
| 0.0062 | 40.0 | 1960 | 1.2358 | 0.5477 | 0.6568 | 0.5973 | 0.8746 |
| 0.0014 | 41.0 | 2009 | 1.2332 | 0.5367 | 0.6602 | 0.5921 | 0.8745 |
| 0.0014 | 42.0 | 2058 | 1.2239 | 0.5106 | 0.6602 | 0.5758 | 0.8685 |
| 0.0014 | 43.0 | 2107 | 1.2163 | 0.5224 | 0.6533 | 0.5806 | 0.8679 |
| 0.0014 | 44.0 | 2156 | 1.2335 | 0.5349 | 0.6568 | 0.5896 | 0.8694 |
| 0.0014 | 45.0 | 2205 | 1.3374 | 0.5348 | 0.6236 | 0.5758 | 0.8680 |
| 0.0014 | 46.0 | 2254 | 1.2287 | 0.5417 | 0.6533 | 0.5923 | 0.8730 |
| 0.0014 | 47.0 | 2303 | 1.2268 | 0.5536 | 0.6796 | 0.6102 | 0.8789 |
| 0.0014 | 48.0 | 2352 | 1.2153 | 0.4974 | 0.6568 | 0.5661 | 0.8737 |
| 0.0014 | 49.0 | 2401 | 1.2180 | 0.5222 | 0.6590 | 0.5827 | 0.8747 |
| 0.0014 | 50.0 | 2450 | 1.2906 | 0.5500 | 0.6476 | 0.5949 | 0.8698 |
| 0.0014 | 51.0 | 2499 | 1.2547 | 0.5386 | 0.6384 | 0.5843 | 0.8686 |
| 0.0018 | 52.0 | 2548 | 1.2792 | 0.5307 | 0.6430 | 0.5815 | 0.8681 |
| 0.0018 | 53.0 | 2597 | 1.1972 | 0.5040 | 0.6510 | 0.5681 | 0.8705 |
| 0.0018 | 54.0 | 2646 | 1.2189 | 0.5215 | 0.6533 | 0.5800 | 0.8782 |
| 0.0018 | 55.0 | 2695 | 1.2239 | 0.5602 | 0.6602 | 0.6061 | 0.8789 |
| 0.0018 | 56.0 | 2744 | 1.2620 | 0.5410 | 0.6648 | 0.5965 | 0.8773 |
| 0.0018 | 57.0 | 2793 | 1.2828 | 0.5513 | 0.6522 | 0.5975 | 0.8747 |
| 0.0018 | 58.0 | 2842 | 1.2633 | 0.5518 | 0.6522 | 0.5978 | 0.8749 |
| 0.0018 | 59.0 | 2891 | 1.2619 | 0.5356 | 0.6796 | 0.5991 | 0.8738 |
| 0.0018 | 60.0 | 2940 | 1.2076 | 0.5385 | 0.6716 | 0.5978 | 0.8775 |
| 0.0018 | 61.0 | 2989 | 1.2996 | 0.5357 | 0.6442 | 0.5849 | 0.8686 |
| 0.0011 | 62.0 | 3038 | 1.2614 | 0.5483 | 0.6693 | 0.6028 | 0.8773 |
| 0.0011 | 63.0 | 3087 | 1.2713 | 0.5524 | 0.6579 | 0.6005 | 0.8757 |
| 0.0011 | 64.0 | 3136 | 1.2920 | 0.5550 | 0.6579 | 0.6021 | 0.8739 |
| 0.0011 | 65.0 | 3185 | 1.3319 | 0.5623 | 0.6716 | 0.6121 | 0.8713 |
| 0.0011 | 66.0 | 3234 | 1.3345 | 0.5433 | 0.6533 | 0.5932 | 0.8720 |
| 0.0011 | 67.0 | 3283 | 1.3146 | 0.5305 | 0.6465 | 0.5828 | 0.8657 |
| 0.0011 | 68.0 | 3332 | 1.3354 | 0.5452 | 0.6556 | 0.5953 | 0.8691 |
| 0.0011 | 69.0 | 3381 | 1.3474 | 0.5519 | 0.6693 | 0.6050 | 0.8759 |
| 0.0011 | 70.0 | 3430 | 1.3498 | 0.5403 | 0.6590 | 0.5938 | 0.8686 |
| 0.0011 | 71.0 | 3479 | 1.3340 | 0.5387 | 0.6602 | 0.5933 | 0.8749 |
| 0.0005 | 72.0 | 3528 | 1.3475 | 0.5615 | 0.6636 | 0.6083 | 0.8745 |
| 0.0005 | 73.0 | 3577 | 1.3530 | 0.5425 | 0.6648 | 0.5974 | 0.8746 |
| 0.0005 | 74.0 | 3626 | 1.3494 | 0.5491 | 0.6648 | 0.6014 | 0.8738 |
| 0.0005 | 75.0 | 3675 | 1.3368 | 0.5620 | 0.6590 | 0.6066 | 0.8749 |
| 0.0005 | 76.0 | 3724 | 1.3382 | 0.5467 | 0.6625 | 0.5991 | 0.8752 |
| 0.0005 | 77.0 | 3773 | 1.3486 | 0.5377 | 0.6533 | 0.5899 | 0.8759 |
| 0.0005 | 78.0 | 3822 | 1.3485 | 0.5483 | 0.6499 | 0.5948 | 0.8731 |
| 0.0005 | 79.0 | 3871 | 1.3512 | 0.5340 | 0.6556 | 0.5886 | 0.8751 |
| 0.0005 | 80.0 | 3920 | 1.3486 | 0.5513 | 0.6636 | 0.6023 | 0.8772 |
| 0.0005 | 81.0 | 3969 | 1.3530 | 0.5481 | 0.6579 | 0.5980 | 0.8772 |
| 0.0001 | 82.0 | 4018 | 1.3940 | 0.5536 | 0.6499 | 0.5979 | 0.8751 |
| 0.0001 | 83.0 | 4067 | 1.3657 | 0.5296 | 0.6453 | 0.5817 | 0.8742 |
| 0.0001 | 84.0 | 4116 | 1.3538 | 0.5412 | 0.6384 | 0.5858 | 0.8719 |
| 0.0001 | 85.0 | 4165 | 1.3550 | 0.5418 | 0.6373 | 0.5857 | 0.8693 |
| 0.0001 | 86.0 | 4214 | 1.3810 | 0.5187 | 0.6362 | 0.5714 | 0.8685 |
| 0.0001 | 87.0 | 4263 | 1.3625 | 0.5370 | 0.6396 | 0.5838 | 0.8707 |
| 0.0001 | 88.0 | 4312 | 1.3605 | 0.5389 | 0.6419 | 0.5859 | 0.8712 |
| 0.0001 | 89.0 | 4361 | 1.3616 | 0.5388 | 0.6430 | 0.5863 | 0.8711 |
| 0.0001 | 90.0 | 4410 | 1.3560 | 0.5431 | 0.6419 | 0.5884 | 0.8719 |
| 0.0001 | 91.0 | 4459 | 1.3558 | 0.5399 | 0.6430 | 0.5869 | 0.8716 |
| 0.0001 | 92.0 | 4508 | 1.3586 | 0.5342 | 0.6430 | 0.5836 | 0.8717 |
| 0.0001 | 93.0 | 4557 | 1.3727 | 0.5349 | 0.6487 | 0.5863 | 0.8716 |
| 0.0001 | 94.0 | 4606 | 1.3810 | 0.5539 | 0.6465 | 0.5966 | 0.8707 |
| 0.0001 | 95.0 | 4655 | 1.3813 | 0.5540 | 0.6453 | 0.5962 | 0.8705 |
| 0.0001 | 96.0 | 4704 | 1.3879 | 0.5625 | 0.6487 | 0.6026 | 0.8699 |
| 0.0001 | 97.0 | 4753 | 1.3886 | 0.5614 | 0.6487 | 0.6019 | 0.8702 |
| 0.0001 | 98.0 | 4802 | 1.3811 | 0.5470 | 0.6453 | 0.5921 | 0.8704 |
| 0.0001 | 99.0 | 4851 | 1.3788 | 0.5407 | 0.6453 | 0.5884 | 0.8709 |
| 0.0001 | 100.0 | 4900 | 1.3788 | 0.5418 | 0.6453 | 0.5890 | 0.8708 |
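For reference, the Precision/Recall/F1/Accuracy columns above are the entity-level metrics typically computed with the seqeval-backed `evaluate` metric during token-classification training. A minimal sketch follows, with hypothetical IOB2 labels, since this model's actual label set is not documented:

```python
import evaluate

seqeval = evaluate.load("seqeval")  # requires: pip install evaluate seqeval

# Hypothetical IOB2-tagged sequences standing in for real model output.
predictions = [["B-DISEASE", "I-DISEASE", "O", "O"]]
references  = [["B-DISEASE", "O", "O", "B-DISEASE"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```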

### Framework versions