---
tags:
- generated_from_trainer
---

# clinico-roberta-biomedical-finetuned-augmented1

This model is a fine-tuned version of [joheras/roberta-base-biomedical-clinical-es-finetuned-clinais](https://huggingface.co/joheras/roberta-base-biomedical-clinical-es-finetuned-clinais) on an unspecified dataset. It achieves the following results on the evaluation set (final epoch, from the results table below):
- Loss: 1.2547
- Precision: 0.5202
- Recall: 0.6368
- F1: 0.5726
- Accuracy: 0.8678
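No usage snippet was preserved in this card. The seqeval-style metric set (precision/recall/F1 plus accuracy) and the ClinAIS lineage of the base checkpoint suggest a token-classification (sequence-labeling) model; below is a minimal inference sketch under that assumption. The repo id is a guess inferred from the model name and the author of the base checkpoint, not confirmed by this card.

```python
from transformers import pipeline

# Hypothetical repo id, inferred from the model name; adjust to the real Hub path.
ner = pipeline(
    "token-classification",
    model="joheras/clinico-roberta-biomedical-finetuned-augmented1",
    aggregation_strategy="simple",  # merge sub-word pieces into labeled spans
)

# Spanish clinical text, since the base model is roberta-base-biomedical-clinical-es.
print(ner("Paciente de 54 años que acude por dolor torácico y disnea de esfuerzo."))
```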

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter list was not preserved in this card. From the results table below, training ran for 100 epochs with 90 optimization steps per epoch, evaluating once per epoch.
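As a placeholder until the real values are recovered, here is a minimal sketch of a `TrainingArguments` configuration consistent with the results table. Only `num_train_epochs` and the per-epoch evaluation cadence are taken from the table; every other value is an illustrative assumption, not the actual configuration.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="clinico-roberta-biomedical-finetuned-augmented1",
    num_train_epochs=100,            # 100 epochs appear in the results table
    evaluation_strategy="epoch",     # one eval row per epoch in the table
                                     # (named eval_strategy in transformers >= 4.46)
    save_strategy="epoch",           # assumption: not recorded in this card
    per_device_train_batch_size=16,  # assumption: not recorded in this card
    learning_rate=2e-5,              # assumption: not recorded in this card
    weight_decay=0.01,               # assumption: not recorded in this card
)
```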

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
| No log | 1.0 | 90 | 0.5952 | 0.3894 | 0.4968 | 0.4366 | 0.8389 |
| No log | 2.0 | 180 | 0.5302 | 0.3351 | 0.4653 | 0.3896 | 0.8506 |
| No log | 3.0 | 270 | 0.5888 | 0.3925 | 0.5863 | 0.4702 | 0.8587 |
| No log | 4.0 | 360 | 0.6183 | 0.4100 | 0.6116 | 0.4909 | 0.8644 |
| No log | 5.0 | 450 | 0.6412 | 0.4702 | 0.6232 | 0.5360 | 0.8696 |
| 0.3362 | 6.0 | 540 | 0.7014 | 0.4808 | 0.6326 | 0.5464 | 0.8693 |
| 0.3362 | 7.0 | 630 | 0.7379 | 0.4500 | 0.6305 | 0.5252 | 0.8682 |
| 0.3362 | 8.0 | 720 | 0.7744 | 0.4722 | 0.6358 | 0.5419 | 0.8653 |
| 0.3362 | 9.0 | 810 | 0.7712 | 0.4896 | 0.6432 | 0.5560 | 0.8716 |
| 0.3362 | 10.0 | 900 | 0.7924 | 0.4904 | 0.6484 | 0.5585 | 0.8687 |
| 0.3362 | 11.0 | 990 | 0.8283 | 0.4984 | 0.6463 | 0.5628 | 0.8691 |
| 0.0237 | 12.0 | 1080 | 0.8286 | 0.5131 | 0.64 | 0.5696 | 0.8710 |
| 0.0237 | 13.0 | 1170 | 0.8492 | 0.5098 | 0.6558 | 0.5737 | 0.8687 |
| 0.0237 | 14.0 | 1260 | 0.8649 | 0.5137 | 0.6516 | 0.5745 | 0.8676 |
| 0.0237 | 15.0 | 1350 | 0.8748 | 0.5232 | 0.6526 | 0.5808 | 0.8702 |
| 0.0237 | 16.0 | 1440 | 0.8653 | 0.5183 | 0.6421 | 0.5736 | 0.8685 |
| 0.0086 | 17.0 | 1530 | 0.8938 | 0.5219 | 0.6526 | 0.5800 | 0.8722 |
| 0.0086 | 18.0 | 1620 | 0.9006 | 0.5083 | 0.6432 | 0.5678 | 0.8682 |
| 0.0086 | 19.0 | 1710 | 0.9220 | 0.5238 | 0.6484 | 0.5795 | 0.8693 |
| 0.0086 | 20.0 | 1800 | 0.8676 | 0.5151 | 0.6463 | 0.5733 | 0.8724 |
| 0.0086 | 21.0 | 1890 | 0.9404 | 0.5185 | 0.6495 | 0.5766 | 0.8643 |
| 0.0086 | 22.0 | 1980 | 0.9477 | 0.5409 | 0.6537 | 0.5920 | 0.8678 |
| 0.0052 | 23.0 | 2070 | 0.9441 | 0.5342 | 0.6568 | 0.5892 | 0.8736 |
| 0.0052 | 24.0 | 2160 | 0.9786 | 0.5373 | 0.6368 | 0.5829 | 0.8685 |
| 0.0052 | 25.0 | 2250 | 0.9510 | 0.5243 | 0.6463 | 0.5790 | 0.8722 |
| 0.0052 | 26.0 | 2340 | 0.9876 | 0.5261 | 0.6463 | 0.5801 | 0.8683 |
| 0.0052 | 27.0 | 2430 | 1.0049 | 0.5265 | 0.6484 | 0.5811 | 0.8652 |
| 0.0033 | 28.0 | 2520 | 1.0204 | 0.5347 | 0.6495 | 0.5865 | 0.8630 |
| 0.0033 | 29.0 | 2610 | 1.0027 | 0.5101 | 0.6411 | 0.5681 | 0.8647 |
| 0.0033 | 30.0 | 2700 | 1.0345 | 0.5243 | 0.6347 | 0.5743 | 0.8649 |
| 0.0033 | 31.0 | 2790 | 1.0199 | 0.5222 | 0.6316 | 0.5717 | 0.8663 |
| 0.0033 | 32.0 | 2880 | 1.0424 | 0.5243 | 0.6368 | 0.5751 | 0.8669 |
| 0.0033 | 33.0 | 2970 | 1.0341 | 0.5294 | 0.6453 | 0.5816 | 0.8662 |
| 0.0025 | 34.0 | 3060 | 1.0367 | 0.5419 | 0.6474 | 0.5899 | 0.8667 |
| 0.0025 | 35.0 | 3150 | 1.0629 | 0.5225 | 0.6484 | 0.5787 | 0.8660 |
| 0.0025 | 36.0 | 3240 | 1.0406 | 0.5227 | 0.6432 | 0.5767 | 0.8672 |
| 0.0025 | 37.0 | 3330 | 1.0168 | 0.5324 | 0.6495 | 0.5851 | 0.8701 |
| 0.0025 | 38.0 | 3420 | 1.0375 | 0.5332 | 0.6505 | 0.5861 | 0.8693 |
| 0.0015 | 39.0 | 3510 | 1.0921 | 0.5378 | 0.6442 | 0.5862 | 0.8649 |
| 0.0015 | 40.0 | 3600 | 1.0742 | 0.5330 | 0.6453 | 0.5838 | 0.8657 |
| 0.0015 | 41.0 | 3690 | 1.1234 | 0.5189 | 0.6347 | 0.5710 | 0.8619 |
| 0.0015 | 42.0 | 3780 | 1.0940 | 0.5407 | 0.6505 | 0.5905 | 0.8659 |
| 0.0015 | 43.0 | 3870 | 1.0612 | 0.5493 | 0.6505 | 0.5957 | 0.8704 |
| 0.0015 | 44.0 | 3960 | 1.0730 | 0.5445 | 0.6505 | 0.5928 | 0.8696 |
| 0.0008 | 45.0 | 4050 | 1.0834 | 0.5484 | 0.6558 | 0.5973 | 0.8675 |
| 0.0008 | 46.0 | 4140 | 1.1115 | 0.5487 | 0.6463 | 0.5935 | 0.8688 |
| 0.0008 | 47.0 | 4230 | 1.1153 | 0.5491 | 0.6474 | 0.5942 | 0.8661 |
| 0.0008 | 48.0 | 4320 | 1.1142 | 0.5456 | 0.6421 | 0.5899 | 0.8674 |
| 0.0008 | 49.0 | 4410 | 1.0922 | 0.5285 | 0.6537 | 0.5845 | 0.8704 |
| 0.0007 | 50.0 | 4500 | 1.0873 | 0.5448 | 0.6463 | 0.5912 | 0.8700 |
| 0.0007 | 51.0 | 4590 | 1.1141 | 0.5342 | 0.6337 | 0.5797 | 0.8686 |
| 0.0007 | 52.0 | 4680 | 1.2066 | 0.5116 | 0.6263 | 0.5632 | 0.8617 |
| 0.0007 | 53.0 | 4770 | 1.0850 | 0.5224 | 0.6379 | 0.5744 | 0.8704 |
| 0.0007 | 54.0 | 4860 | 1.1132 | 0.5308 | 0.6526 | 0.5855 | 0.8728 |
| 0.0007 | 55.0 | 4950 | 1.1540 | 0.5118 | 0.64 | 0.5688 | 0.8667 |
| 0.001 | 56.0 | 5040 | 1.1314 | 0.5314 | 0.6495 | 0.5846 | 0.8683 |
| 0.001 | 57.0 | 5130 | 1.0893 | 0.5456 | 0.6547 | 0.5952 | 0.8713 |
| 0.001 | 58.0 | 5220 | 1.0910 | 0.5354 | 0.6453 | 0.5852 | 0.8685 |
| 0.001 | 59.0 | 5310 | 1.1131 | 0.5527 | 0.6568 | 0.6003 | 0.8742 |
| 0.001 | 60.0 | 5400 | 1.1434 | 0.5339 | 0.6463 | 0.5848 | 0.8694 |
| 0.001 | 61.0 | 5490 | 1.1186 | 0.5313 | 0.6516 | 0.5853 | 0.8731 |
| 0.0007 | 62.0 | 5580 | 1.1584 | 0.5381 | 0.6474 | 0.5877 | 0.8684 |
| 0.0007 | 63.0 | 5670 | 1.1687 | 0.5429 | 0.6463 | 0.5901 | 0.8662 |
| 0.0007 | 64.0 | 5760 | 1.1296 | 0.5223 | 0.6421 | 0.5760 | 0.8756 |
| 0.0007 | 65.0 | 5850 | 1.1499 | 0.5345 | 0.6516 | 0.5873 | 0.8710 |
| 0.0007 | 66.0 | 5940 | 1.1771 | 0.5318 | 0.6516 | 0.5856 | 0.8713 |
| 0.0005 | 67.0 | 6030 | 1.1531 | 0.5219 | 0.6526 | 0.5800 | 0.8741 |
| 0.0005 | 68.0 | 6120 | 1.1781 | 0.5383 | 0.6358 | 0.5830 | 0.8713 |
| 0.0005 | 69.0 | 6210 | 1.1989 | 0.5164 | 0.6316 | 0.5682 | 0.8684 |
| 0.0005 | 70.0 | 6300 | 1.1986 | 0.5389 | 0.6495 | 0.5890 | 0.8695 |
| 0.0005 | 71.0 | 6390 | 1.1720 | 0.5603 | 0.6411 | 0.5979 | 0.8720 |
| 0.0005 | 72.0 | 6480 | 1.1699 | 0.5308 | 0.6432 | 0.5816 | 0.8725 |
| 0.0005 | 73.0 | 6570 | 1.1781 | 0.5541 | 0.6411 | 0.5944 | 0.8708 |
| 0.0005 | 74.0 | 6660 | 1.2327 | 0.5304 | 0.6337 | 0.5775 | 0.8664 |
| 0.0005 | 75.0 | 6750 | 1.2070 | 0.5537 | 0.6463 | 0.5964 | 0.8718 |
| 0.0005 | 76.0 | 6840 | 1.2032 | 0.5502 | 0.6463 | 0.5944 | 0.8728 |
| 0.0005 | 77.0 | 6930 | 1.2100 | 0.5525 | 0.6484 | 0.5966 | 0.8713 |
| 0.0003 | 78.0 | 7020 | 1.2171 | 0.5336 | 0.6442 | 0.5837 | 0.8715 |
| 0.0003 | 79.0 | 7110 | 1.2256 | 0.5241 | 0.64 | 0.5763 | 0.8704 |
| 0.0003 | 80.0 | 7200 | 1.2238 | 0.5323 | 0.6421 | 0.5821 | 0.8696 |
| 0.0003 | 81.0 | 7290 | 1.2219 | 0.5342 | 0.6326 | 0.5793 | 0.8693 |
| 0.0003 | 82.0 | 7380 | 1.2251 | 0.5325 | 0.6379 | 0.5805 | 0.8694 |
| 0.0003 | 83.0 | 7470 | 1.2187 | 0.5468 | 0.6389 | 0.5893 | 0.8681 |
| 0.0003 | 84.0 | 7560 | 1.2309 | 0.5365 | 0.6421 | 0.5846 | 0.8683 |
| 0.0003 | 85.0 | 7650 | 1.2445 | 0.5350 | 0.6432 | 0.5841 | 0.8676 |
| 0.0003 | 86.0 | 7740 | 1.2561 | 0.5288 | 0.6474 | 0.5821 | 0.8680 |
| 0.0003 | 87.0 | 7830 | 1.2567 | 0.5263 | 0.6421 | 0.5785 | 0.8678 |
| 0.0003 | 88.0 | 7920 | 1.2470 | 0.5346 | 0.6421 | 0.5835 | 0.8679 |
| 0.0002 | 89.0 | 8010 | 1.2458 | 0.5468 | 0.6453 | 0.5920 | 0.8684 |
| 0.0002 | 90.0 | 8100 | 1.2448 | 0.5484 | 0.6442 | 0.5924 | 0.8689 |
| 0.0002 | 91.0 | 8190 | 1.2439 | 0.5469 | 0.6379 | 0.5889 | 0.8684 |
| 0.0002 | 92.0 | 8280 | 1.2453 | 0.5338 | 0.64 | 0.5821 | 0.8695 |
| 0.0002 | 93.0 | 8370 | 1.2462 | 0.5315 | 0.64 | 0.5807 | 0.8692 |
| 0.0002 | 94.0 | 8460 | 1.2472 | 0.5328 | 0.6411 | 0.5819 | 0.8691 |
| 0.0002 | 95.0 | 8550 | 1.2502 | 0.5311 | 0.6379 | 0.5796 | 0.8686 |
| 0.0002 | 96.0 | 8640 | 1.2464 | 0.5330 | 0.6368 | 0.5803 | 0.8691 |
| 0.0002 | 97.0 | 8730 | 1.2526 | 0.5185 | 0.6337 | 0.5703 | 0.8681 |
| 0.0002 | 98.0 | 8820 | 1.2543 | 0.5167 | 0.6347 | 0.5697 | 0.8679 |
| 0.0002 | 99.0 | 8910 | 1.2544 | 0.5211 | 0.6358 | 0.5728 | 0.8680 |
| 0.0002 | 100.0 | 9000 | 1.2547 | 0.5202 | 0.6368 | 0.5726 | 0.8678 |
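Note that validation loss reaches its minimum early (0.5302 at epoch 2) and then climbs steadily to about 1.25, while F1 plateaus near 0.59 (best value 0.6003 at epoch 59), so an earlier checkpoint is likely preferable to the final one. The metric columns match the standard entity-level seqeval evaluation used in the Trainer token-classification examples; a minimal sketch of such a `compute_metrics` function, under that assumption (the label set is not recorded in this card):

```python
import numpy as np
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred, label_list):
    """Entity-level metrics of the kind reported in the table above.

    `label_list` (e.g. ["O", "B-...", "I-..."]) is hypothetical; the card
    does not record the actual label set.
    """
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=2)
    # Drop positions labeled -100 (special tokens / padding masked by the collator).
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    return {
        "precision": precision_score(true_labels, true_predictions),
        "recall": recall_score(true_labels, true_predictions),
        "f1": f1_score(true_labels, true_predictions),
        "accuracy": accuracy_score(true_labels, true_predictions),
    }
```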

### Framework versions