

# clinico-xlm-roberta-large

This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on an unspecified dataset. It achieves the following results on the evaluation set (values from the final epoch, 100.0, of the table below):

- Loss: 1.3753
- Precision: 0.5315
- Recall: 0.6465
- F1: 0.5834
- Accuracy: 0.8688

## Model description

More information needed

## Intended uses & limitations

More information needed
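
No usage notes were provided. Given the entity-level precision/recall/F1 metrics reported under Training results, the checkpoint is most plausibly a token-classification (e.g., clinical NER) fine-tune, so a minimal inference sketch would look like the following. The repo id and the NER task are assumptions, not confirmed by this card:

```python
from transformers import pipeline

# Assumption: this checkpoint is a token-classification (NER) model and is
# published under this (hypothetical) repo id.
ner = pipeline(
    "token-classification",
    model="clinico-xlm-roberta-large",
    aggregation_strategy="simple",  # merge word-piece predictions into entity spans
)

print(ner("The patient reports persistent fever and a dry cough."))
```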

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training were not recorded in this card. From the results table below one can at least read off that training ran for 100 epochs, with one evaluation pass per epoch (49 optimizer steps per epoch); a hedged launch sketch follows.
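
For orientation only, here is a sketch of how such a run is typically launched with the `transformers` `Trainer` API. Every numeric value below except the epoch count is an illustrative placeholder, and `train_ds`, `eval_ds`, and `compute_metrics` are hypothetical stand-ins for the unspecified data pipeline (see the metric sketch after the results table):

```python
from transformers import (AutoModelForTokenClassification, Trainer,
                          TrainingArguments)

# Assumption: a token-classification fine-tune; num_labels is a placeholder.
model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-large", num_labels=9)

args = TrainingArguments(
    output_dir="clinico-xlm-roberta-large",
    num_train_epochs=100,             # matches the 100 epochs in the table
    evaluation_strategy="epoch",      # one evaluation row per epoch
    learning_rate=2e-5,               # placeholder, not the recorded value
    per_device_train_batch_size=8,    # placeholder
    logging_steps=500,                # default; consistent with "No log" until step 539
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,           # placeholder: tokenized, label-aligned dataset
    eval_dataset=eval_ds,             # placeholder
    compute_metrics=compute_metrics,  # hypothetical; see sketch below the table
)
trainer.train()
```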

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 49   | 0.6619          | 0.1116    | 0.3455 | 0.1687 | 0.8029   |
| No log        | 2.0   | 98   | 0.5662          | 0.2196    | 0.4497 | 0.2950 | 0.8296   |
| No log        | 3.0   | 147  | 0.4912          | 0.2585    | 0.5069 | 0.3423 | 0.8469   |
| No log        | 4.0   | 196  | 0.5450          | 0.3165    | 0.5538 | 0.4028 | 0.8522   |
| No log        | 5.0   | 245  | 0.5523          | 0.3185    | 0.5309 | 0.3981 | 0.8578   |
| No log        | 6.0   | 294  | 0.6129          | 0.3057    | 0.5561 | 0.3945 | 0.8449   |
| No log        | 7.0   | 343  | 0.5760          | 0.3295    | 0.5595 | 0.4148 | 0.8523   |
| No log        | 8.0   | 392  | 0.6572          | 0.3749    | 0.5950 | 0.4600 | 0.8657   |
| No log        | 9.0   | 441  | 0.7340          | 0.3695    | 0.5915 | 0.4549 | 0.8603   |
| No log        | 10.0  | 490  | 0.8306          | 0.3494    | 0.5572 | 0.4295 | 0.8558   |
| 0.3262        | 11.0  | 539  | 0.8389          | 0.3714    | 0.5732 | 0.4507 | 0.8599   |
| 0.3262        | 12.0  | 588  | 0.8278          | 0.3880    | 0.5767 | 0.4639 | 0.8479   |
| 0.3262        | 13.0  | 637  | 0.8057          | 0.4038    | 0.6076 | 0.4852 | 0.8660   |
| 0.3262        | 14.0  | 686  | 0.8489          | 0.3847    | 0.5915 | 0.4662 | 0.8619   |
| 0.3262        | 15.0  | 735  | 0.8954          | 0.3868    | 0.5961 | 0.4692 | 0.8594   |
| 0.3262        | 16.0  | 784  | 0.8951          | 0.3926    | 0.5835 | 0.4694 | 0.8594   |
| 0.3262        | 17.0  | 833  | 0.9715          | 0.4080    | 0.5961 | 0.4844 | 0.8625   |
| 0.3262        | 18.0  | 882  | 0.9600          | 0.4317    | 0.6144 | 0.5071 | 0.8652   |
| 0.3262        | 19.0  | 931  | 0.9335          | 0.4369    | 0.6224 | 0.5134 | 0.8682   |
| 0.3262        | 20.0  | 980  | 0.8988          | 0.4178    | 0.6110 | 0.4963 | 0.8656   |
| 0.0323        | 21.0  | 1029 | 1.0445          | 0.4410    | 0.6110 | 0.5122 | 0.8637   |
| 0.0323        | 22.0  | 1078 | 0.9596          | 0.5078    | 0.6339 | 0.5639 | 0.8680   |
| 0.0323        | 23.0  | 1127 | 1.0240          | 0.4810    | 0.6384 | 0.5487 | 0.8643   |
| 0.0323        | 24.0  | 1176 | 1.0528          | 0.5367    | 0.6613 | 0.5925 | 0.8667   |
| 0.0323        | 25.0  | 1225 | 1.0788          | 0.5128    | 0.6648 | 0.5790 | 0.8713   |
| 0.0323        | 26.0  | 1274 | 1.0661          | 0.5268    | 0.6533 | 0.5832 | 0.8729   |
| 0.0323        | 27.0  | 1323 | 1.1575          | 0.5276    | 0.6568 | 0.5851 | 0.8733   |
| 0.0323        | 28.0  | 1372 | 1.2267          | 0.4929    | 0.6350 | 0.5550 | 0.8553   |
| 0.0323        | 29.0  | 1421 | 1.0935          | 0.5187    | 0.6499 | 0.5769 | 0.8718   |
| 0.0323        | 30.0  | 1470 | 1.2093          | 0.5162    | 0.6556 | 0.5776 | 0.8676   |
| 0.0074        | 31.0  | 1519 | 1.1556          | 0.5227    | 0.6590 | 0.5830 | 0.8750   |
| 0.0074        | 32.0  | 1568 | 1.2110          | 0.5632    | 0.6579 | 0.6069 | 0.8685   |
| 0.0074        | 33.0  | 1617 | 1.2201          | 0.5273    | 0.6304 | 0.5743 | 0.8645   |
| 0.0074        | 34.0  | 1666 | 1.1884          | 0.5167    | 0.6533 | 0.5771 | 0.8692   |
| 0.0074        | 35.0  | 1715 | 1.2731          | 0.5125    | 0.6327 | 0.5663 | 0.8691   |
| 0.0074        | 36.0  | 1764 | 1.2366          | 0.5054    | 0.6396 | 0.5646 | 0.8622   |
| 0.0074        | 37.0  | 1813 | 1.2428          | 0.5257    | 0.6545 | 0.5831 | 0.8697   |
| 0.0074        | 38.0  | 1862 | 1.2853          | 0.5299    | 0.6281 | 0.5749 | 0.8612   |
| 0.0074        | 39.0  | 1911 | 1.2748          | 0.5260    | 0.6362 | 0.5759 | 0.8604   |
| 0.0074        | 40.0  | 1960 | 1.3006          | 0.5387    | 0.6533 | 0.5905 | 0.8625   |
| 0.0022        | 41.0  | 2009 | 1.3935          | 0.5217    | 0.6339 | 0.5723 | 0.8566   |
| 0.0022        | 42.0  | 2058 | 1.2644          | 0.5154    | 0.6510 | 0.5753 | 0.8646   |
| 0.0022        | 43.0  | 2107 | 1.3069          | 0.5160    | 0.6259 | 0.5657 | 0.8658   |
| 0.0022        | 44.0  | 2156 | 1.3047          | 0.5161    | 0.6419 | 0.5722 | 0.8665   |
| 0.0022        | 45.0  | 2205 | 1.3570          | 0.5352    | 0.6350 | 0.5808 | 0.8620   |
| 0.0022        | 46.0  | 2254 | 1.2924          | 0.5239    | 0.6384 | 0.5756 | 0.8662   |
| 0.0022        | 47.0  | 2303 | 1.3362          | 0.5247    | 0.6568 | 0.5833 | 0.8622   |
| 0.0022        | 48.0  | 2352 | 1.3201          | 0.5301    | 0.6545 | 0.5858 | 0.8651   |
| 0.0022        | 49.0  | 2401 | 1.3418          | 0.5318    | 0.6407 | 0.5812 | 0.8674   |
| 0.0022        | 50.0  | 2450 | 1.3468          | 0.5005    | 0.6304 | 0.5580 | 0.8658   |
| 0.0022        | 51.0  | 2499 | 1.4094          | 0.5403    | 0.6373 | 0.5848 | 0.8573   |
| 0.0011        | 52.0  | 2548 | 1.3697          | 0.5307    | 0.6430 | 0.5815 | 0.8648   |
| 0.0011        | 53.0  | 2597 | 1.3840          | 0.5519    | 0.6384 | 0.5920 | 0.8609   |
| 0.0011        | 54.0  | 2646 | 1.3421          | 0.5415    | 0.6487 | 0.5903 | 0.8660   |
| 0.0011        | 55.0  | 2695 | 1.3011          | 0.5416    | 0.6556 | 0.5932 | 0.8696   |
| 0.0011        | 56.0  | 2744 | 1.3487          | 0.5491    | 0.6522 | 0.5962 | 0.8672   |
| 0.0011        | 57.0  | 2793 | 1.3309          | 0.5627    | 0.6465 | 0.6017 | 0.8641   |
| 0.0011        | 58.0  | 2842 | 1.3432          | 0.5376    | 0.6384 | 0.5837 | 0.8658   |
| 0.0011        | 59.0  | 2891 | 1.3824          | 0.5547    | 0.6327 | 0.5911 | 0.8660   |
| 0.0011        | 60.0  | 2940 | 1.3315          | 0.5135    | 0.6327 | 0.5669 | 0.8639   |
| 0.0011        | 61.0  | 2989 | 1.3656          | 0.5272    | 0.6327 | 0.5751 | 0.8637   |
| 0.0009        | 62.0  | 3038 | 1.3466          | 0.5369    | 0.6327 | 0.5809 | 0.8626   |
| 0.0009        | 63.0  | 3087 | 1.3103          | 0.5198    | 0.6453 | 0.5758 | 0.8645   |
| 0.0009        | 64.0  | 3136 | 1.4302          | 0.5304    | 0.6396 | 0.5799 | 0.8559   |
| 0.0009        | 65.0  | 3185 | 1.4510          | 0.5350    | 0.6476 | 0.5859 | 0.8598   |
| 0.0009        | 66.0  | 3234 | 1.3478          | 0.5196    | 0.6384 | 0.5729 | 0.8656   |
| 0.0009        | 67.0  | 3283 | 1.4041          | 0.5436    | 0.6350 | 0.5858 | 0.8636   |
| 0.0009        | 68.0  | 3332 | 1.3659          | 0.5673    | 0.6362 | 0.5998 | 0.8702   |
| 0.0009        | 69.0  | 3381 | 1.3418          | 0.5473    | 0.6419 | 0.5908 | 0.8702   |
| 0.0009        | 70.0  | 3430 | 1.3634          | 0.5402    | 0.6384 | 0.5852 | 0.8657   |
| 0.0009        | 71.0  | 3479 | 1.4288          | 0.5523    | 0.6465 | 0.5957 | 0.8613   |
| 0.0008        | 72.0  | 3528 | 1.3958          | 0.5413    | 0.6304 | 0.5825 | 0.8643   |
| 0.0008        | 73.0  | 3577 | 1.4010          | 0.5344    | 0.6316 | 0.5789 | 0.8683   |
| 0.0008        | 74.0  | 3626 | 1.3712          | 0.5361    | 0.6453 | 0.5857 | 0.8663   |
| 0.0008        | 75.0  | 3675 | 1.3434          | 0.5325    | 0.6465 | 0.5840 | 0.8708   |
| 0.0008        | 76.0  | 3724 | 1.3502          | 0.5140    | 0.6304 | 0.5663 | 0.8682   |
| 0.0008        | 77.0  | 3773 | 1.3639          | 0.5330    | 0.6373 | 0.5805 | 0.8691   |
| 0.0008        | 78.0  | 3822 | 1.3515          | 0.5167    | 0.6373 | 0.5707 | 0.8697   |
| 0.0008        | 79.0  | 3871 | 1.3677          | 0.5228    | 0.6430 | 0.5767 | 0.8691   |
| 0.0008        | 80.0  | 3920 | 1.4069          | 0.5401    | 0.6396 | 0.5856 | 0.8672   |
| 0.0008        | 81.0  | 3969 | 1.3813          | 0.5307    | 0.6522 | 0.5852 | 0.8672   |
| 0.0002        | 82.0  | 4018 | 1.3773          | 0.5355    | 0.6476 | 0.5862 | 0.8678   |
| 0.0002        | 83.0  | 4067 | 1.4004          | 0.5279    | 0.6281 | 0.5737 | 0.8674   |
| 0.0002        | 84.0  | 4116 | 1.4027          | 0.5532    | 0.6487 | 0.5972 | 0.8696   |
| 0.0002        | 85.0  | 4165 | 1.3544          | 0.5351    | 0.6362 | 0.5813 | 0.8672   |
| 0.0002        | 86.0  | 4214 | 1.3582          | 0.5367    | 0.6362 | 0.5822 | 0.8664   |
| 0.0002        | 87.0  | 4263 | 1.3594          | 0.5300    | 0.6362 | 0.5783 | 0.8666   |
| 0.0002        | 88.0  | 4312 | 1.3737          | 0.5371    | 0.6384 | 0.5834 | 0.8690   |
| 0.0002        | 89.0  | 4361 | 1.3991          | 0.5368    | 0.6339 | 0.5813 | 0.8688   |
| 0.0002        | 90.0  | 4410 | 1.3819          | 0.5400    | 0.6407 | 0.5861 | 0.8679   |
| 0.0002        | 91.0  | 4459 | 1.3900          | 0.5495    | 0.6419 | 0.5921 | 0.8699   |
| 0.0001        | 92.0  | 4508 | 1.3890          | 0.5509    | 0.6442 | 0.5939 | 0.8703   |
| 0.0001        | 93.0  | 4557 | 1.3825          | 0.5336    | 0.6442 | 0.5837 | 0.8686   |
| 0.0001        | 94.0  | 4606 | 1.3821          | 0.5370    | 0.6476 | 0.5871 | 0.8686   |
| 0.0001        | 95.0  | 4655 | 1.3803          | 0.5437    | 0.6476 | 0.5911 | 0.8685   |
| 0.0001        | 96.0  | 4704 | 1.3747          | 0.5395    | 0.6487 | 0.5891 | 0.8688   |
| 0.0001        | 97.0  | 4753 | 1.3752          | 0.5223    | 0.6442 | 0.5768 | 0.8683   |
| 0.0001        | 98.0  | 4802 | 1.3775          | 0.5280    | 0.6465 | 0.5813 | 0.8685   |
| 0.0001        | 99.0  | 4851 | 1.3784          | 0.5300    | 0.6465 | 0.5825 | 0.8685   |
| 0.0001        | 100.0 | 4900 | 1.3753          | 0.5315    | 0.6465 | 0.5834 | 0.8688   |
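
The Precision, Recall, F1, and Accuracy columns match the overall scores produced by the standard seqeval-based `compute_metrics` callback for token classification. Below is a self-contained sketch of that callback, with a hypothetical label set, assuming (but not confirmed by this card) that this is how the numbers above were obtained:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # hypothetical label set, not the real one

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Drop positions labeled -100 (special tokens and padding are masked out).
    true_preds = [[label_list[p] for p, l in zip(pred, lab) if l != -100]
                  for pred, lab in zip(predictions, labels)]
    true_labels = [[label_list[l] for p, l in zip(pred, lab) if l != -100]
                   for pred, lab in zip(predictions, labels)]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```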

### Framework versions