---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# clinico-roberta-biomedical-finetuned

This model is a fine-tuned version of [joheras/roberta-base-biomedical-clinical-es-finetuned-clinais](https://huggingface.co/joheras/roberta-base-biomedical-clinical-es-finetuned-clinais) on the None dataset.
It achieves the following results on the evaluation set (values from the final epoch of the table below):
- Loss: 0.9272
- Precision: 0.5095
- Recall: 0.6463
- F1: 0.5698
- Accuracy: 0.8623
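This card does not document the task, but the precision/recall/F1/accuracy columns in the results below match the standard seqeval token-classification (NER) setup. Assuming that, here is a minimal usage sketch; the Hub repo id is an assumption inferred from the card title and the base model's owner, so adjust it to the actual repository:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Assumed Hub id, inferred from the card title; adjust if the repo lives elsewhere.
model_id = "joheras/clinico-roberta-biomedical-finetuned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges sub-word tokens into labeled entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# The base model is a Spanish clinical RoBERTa, so a Spanish clinical sentence is a natural test input.
print(ner("Paciente de 54 años con dolor torácico de inicio súbito."))
```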

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
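The hyperparameter list itself was not carried over to this card. As a placeholder, the following is a minimal sketch of a `TrainingArguments`/`Trainer` setup consistent with the results table: only the epoch count (100) and per-epoch evaluation are recoverable from the table; `learning_rate`, the batch size, `label_list`, and both datasets are assumptions or placeholders, not documented values.

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

base_id = "joheras/roberta-base-biomedical-clinical-es-finetuned-clinais"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForTokenClassification.from_pretrained(
    base_id,
    num_labels=len(label_list),  # label_list: placeholder, defined by the (undocumented) dataset
)

args = TrainingArguments(
    output_dir="clinico-roberta-biomedical-finetuned",
    evaluation_strategy="epoch",     # the results table logs one evaluation per epoch
    num_train_epochs=100,            # recoverable from the table (epochs 1.0-100.0)
    learning_rate=2e-5,              # assumption: a common fine-tuning default
    per_device_train_batch_size=16,  # assumption; 25 optimizer steps/epoch would imply ~400 training examples
    logging_steps=500,               # consistent with the table: training loss first appears at step 500
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,     # placeholder: the dataset is not documented in this card
    eval_dataset=eval_dataset,       # placeholder
    data_collator=DataCollatorForTokenClassification(tokenizer),
    compute_metrics=compute_metrics, # see the sketch after the results table
)
trainer.train()
```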

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 25   | 1.2199          | 0.0033    | 0.0053 | 0.0040 | 0.5756   |
| No log        | 2.0   | 50   | 0.7306          | 0.2031    | 0.2642 | 0.2296 | 0.8021   |
| No log        | 3.0   | 75   | 0.6366          | 0.2967    | 0.3811 | 0.3336 | 0.8235   |
| No log        | 4.0   | 100  | 0.6135          | 0.3497    | 0.4653 | 0.3993 | 0.8304   |
| No log        | 5.0   | 125  | 0.5845          | 0.3421    | 0.4537 | 0.3900 | 0.8331   |
| No log        | 6.0   | 150  | 0.5697          | 0.3307    | 0.4421 | 0.3784 | 0.8390   |
| No log        | 7.0   | 175  | 0.5415          | 0.3211    | 0.4495 | 0.3746 | 0.8471   |
| No log        | 8.0   | 200  | 0.5430          | 0.3589    | 0.5179 | 0.4240 | 0.8567   |
| No log        | 9.0   | 225  | 0.5513          | 0.3342    | 0.5474 | 0.4150 | 0.8604   |
| No log        | 10.0  | 250  | 0.5681          | 0.3769    | 0.5768 | 0.4559 | 0.8582   |
| No log        | 11.0  | 275  | 0.5813          | 0.3756    | 0.5863 | 0.4579 | 0.8553   |
| No log        | 12.0  | 300  | 0.6096          | 0.4181    | 0.5968 | 0.4918 | 0.8574   |
| No log        | 13.0  | 325  | 0.6318          | 0.3978    | 0.6042 | 0.4797 | 0.8539   |
| No log        | 14.0  | 350  | 0.6309          | 0.3892    | 0.5968 | 0.4711 | 0.8553   |
| No log        | 15.0  | 375  | 0.6559          | 0.3987    | 0.5968 | 0.4781 | 0.8565   |
| No log        | 16.0  | 400  | 0.6391          | 0.4275    | 0.6021 | 0.5000 | 0.8560   |
| No log        | 17.0  | 425  | 0.6812          | 0.4388    | 0.6074 | 0.5095 | 0.8584   |
| No log        | 18.0  | 450  | 0.6901          | 0.4287    | 0.6137 | 0.5048 | 0.8563   |
| No log        | 19.0  | 475  | 0.6834          | 0.4572    | 0.6074 | 0.5217 | 0.8581   |
| 0.3478        | 20.0  | 500  | 0.7050          | 0.4397    | 0.6179 | 0.5138 | 0.8573   |
| 0.3478        | 21.0  | 525  | 0.7004          | 0.4462    | 0.6242 | 0.5204 | 0.8591   |
| 0.3478        | 22.0  | 550  | 0.7038          | 0.4264    | 0.6126 | 0.5028 | 0.8599   |
| 0.3478        | 23.0  | 575  | 0.7384          | 0.4416    | 0.6284 | 0.5187 | 0.8576   |
| 0.3478        | 24.0  | 600  | 0.7197          | 0.4479    | 0.6200 | 0.5201 | 0.8619   |
| 0.3478        | 25.0  | 625  | 0.7412          | 0.4381    | 0.6221 | 0.5141 | 0.8559   |
| 0.3478        | 26.0  | 650  | 0.7535          | 0.4489    | 0.6242 | 0.5222 | 0.8566   |
| 0.3478        | 27.0  | 675  | 0.7534          | 0.4657    | 0.6432 | 0.5402 | 0.8586   |
| 0.3478        | 28.0  | 700  | 0.7672          | 0.4525    | 0.6168 | 0.5220 | 0.8567   |
| 0.3478        | 29.0  | 725  | 0.7680          | 0.4637    | 0.6316 | 0.5348 | 0.8599   |
| 0.3478        | 30.0  | 750  | 0.7590          | 0.4611    | 0.6242 | 0.5304 | 0.8607   |
| 0.3478        | 31.0  | 775  | 0.7671          | 0.4732    | 0.6326 | 0.5414 | 0.8625   |
| 0.3478        | 32.0  | 800  | 0.7921          | 0.4674    | 0.6337 | 0.5380 | 0.8590   |
| 0.3478        | 33.0  | 825  | 0.8037          | 0.4828    | 0.6358 | 0.5488 | 0.8574   |
| 0.3478        | 34.0  | 850  | 0.8376          | 0.4644    | 0.6242 | 0.5326 | 0.8534   |
| 0.3478        | 35.0  | 875  | 0.8346          | 0.4815    | 0.6284 | 0.5452 | 0.8552   |
| 0.3478        | 36.0  | 900  | 0.8249          | 0.4750    | 0.6305 | 0.5418 | 0.8567   |
| 0.3478        | 37.0  | 925  | 0.8420          | 0.4580    | 0.6305 | 0.5306 | 0.8548   |
| 0.3478        | 38.0  | 950  | 0.8341          | 0.4773    | 0.6305 | 0.5433 | 0.8550   |
| 0.3478        | 39.0  | 975  | 0.8085          | 0.4792    | 0.6316 | 0.5450 | 0.8653   |
| 0.0274        | 40.0  | 1000 | 0.7954          | 0.4992    | 0.6474 | 0.5637 | 0.8651   |
| 0.0274        | 41.0  | 1025 | 0.8145          | 0.4923    | 0.6421 | 0.5573 | 0.8635   |
| 0.0274        | 42.0  | 1050 | 0.8290          | 0.4911    | 0.6368 | 0.5545 | 0.8610   |
| 0.0274        | 43.0  | 1075 | 0.8468          | 0.4821    | 0.6379 | 0.5492 | 0.8571   |
| 0.0274        | 44.0  | 1100 | 0.8274          | 0.4791    | 0.6389 | 0.5476 | 0.8625   |
| 0.0274        | 45.0  | 1125 | 0.8583          | 0.4831    | 0.6305 | 0.5470 | 0.8551   |
| 0.0274        | 46.0  | 1150 | 0.8420          | 0.4726    | 0.6347 | 0.5418 | 0.8589   |
| 0.0274        | 47.0  | 1175 | 0.8631          | 0.5029    | 0.6400 | 0.5632 | 0.8564   |
| 0.0274        | 48.0  | 1200 | 0.8421          | 0.4911    | 0.6400 | 0.5558 | 0.8617   |
| 0.0274        | 49.0  | 1225 | 0.8564          | 0.5071    | 0.6411 | 0.5662 | 0.8631   |
| 0.0274        | 50.0  | 1250 | 0.8659          | 0.4845    | 0.6263 | 0.5464 | 0.8603   |
| 0.0274        | 51.0  | 1275 | 0.8596          | 0.4860    | 0.6400 | 0.5525 | 0.8632   |
| 0.0274        | 52.0  | 1300 | 0.8713          | 0.4856    | 0.6368 | 0.5510 | 0.8593   |
| 0.0274        | 53.0  | 1325 | 0.8888          | 0.4868    | 0.6400 | 0.5530 | 0.8585   |
| 0.0274        | 54.0  | 1350 | 0.8591          | 0.4816    | 0.6337 | 0.5473 | 0.8610   |
| 0.0274        | 55.0  | 1375 | 0.8755          | 0.4996    | 0.6400 | 0.5611 | 0.8615   |
| 0.0274        | 56.0  | 1400 | 0.8749          | 0.5095    | 0.6484 | 0.5706 | 0.8583   |
| 0.0274        | 57.0  | 1425 | 0.8867          | 0.5025    | 0.6453 | 0.5650 | 0.8580   |
| 0.0274        | 58.0  | 1450 | 0.8905          | 0.4947    | 0.6337 | 0.5556 | 0.8579   |
| 0.0274        | 59.0  | 1475 | 0.8911          | 0.4881    | 0.6495 | 0.5574 | 0.8596   |
| 0.0099        | 60.0  | 1500 | 0.9220          | 0.4914    | 0.6347 | 0.5540 | 0.8570   |
| 0.0099        | 61.0  | 1525 | 0.8687          | 0.4786    | 0.6368 | 0.5465 | 0.8594   |
| 0.0099        | 62.0  | 1550 | 0.9080          | 0.4906    | 0.6337 | 0.5531 | 0.8575   |
| 0.0099        | 63.0  | 1575 | 0.9004          | 0.4831    | 0.6337 | 0.5483 | 0.8583   |
| 0.0099        | 64.0  | 1600 | 0.8906          | 0.4778    | 0.6337 | 0.5448 | 0.8619   |
| 0.0099        | 65.0  | 1625 | 0.8870          | 0.4959    | 0.6368 | 0.5576 | 0.8618   |
| 0.0099        | 66.0  | 1650 | 0.8843          | 0.4851    | 0.6358 | 0.5503 | 0.8611   |
| 0.0099        | 67.0  | 1675 | 0.8923          | 0.4912    | 0.6453 | 0.5578 | 0.8618   |
| 0.0099        | 68.0  | 1700 | 0.8864          | 0.4898    | 0.6337 | 0.5525 | 0.8615   |
| 0.0099        | 69.0  | 1725 | 0.8974          | 0.4943    | 0.6411 | 0.5582 | 0.8615   |
| 0.0099        | 70.0  | 1750 | 0.8851          | 0.4821    | 0.6379 | 0.5492 | 0.8611   |
| 0.0099        | 71.0  | 1775 | 0.8958          | 0.4920    | 0.6453 | 0.5583 | 0.8593   |
| 0.0099        | 72.0  | 1800 | 0.8880          | 0.4988    | 0.6411 | 0.5610 | 0.8618   |
| 0.0099        | 73.0  | 1825 | 0.8959          | 0.4852    | 0.6379 | 0.5512 | 0.8606   |
| 0.0099        | 74.0  | 1850 | 0.9036          | 0.4773    | 0.6305 | 0.5433 | 0.8598   |
| 0.0099        | 75.0  | 1875 | 0.9031          | 0.4864    | 0.6389 | 0.5523 | 0.8615   |
| 0.0099        | 76.0  | 1900 | 0.9243          | 0.4907    | 0.6368 | 0.5543 | 0.8590   |
| 0.0099        | 77.0  | 1925 | 0.9285          | 0.4877    | 0.6453 | 0.5555 | 0.8590   |
| 0.0099        | 78.0  | 1950 | 0.9261          | 0.5074    | 0.6516 | 0.5705 | 0.8598   |
| 0.0099        | 79.0  | 1975 | 0.9374          | 0.5037    | 0.6400 | 0.5637 | 0.8580   |
| 0.0061        | 80.0  | 2000 | 0.9165          | 0.5021    | 0.6316 | 0.5594 | 0.8621   |
| 0.0061        | 81.0  | 2025 | 0.9307          | 0.5162    | 0.6368 | 0.5702 | 0.8582   |
| 0.0061        | 82.0  | 2050 | 0.9369          | 0.4911    | 0.6358 | 0.5541 | 0.8574   |
| 0.0061        | 83.0  | 2075 | 0.9293          | 0.5191    | 0.6421 | 0.5741 | 0.8584   |
| 0.0061        | 84.0  | 2100 | 0.9187          | 0.5004    | 0.6453 | 0.5637 | 0.8629   |
| 0.0061        | 85.0  | 2125 | 0.9293          | 0.4927    | 0.6379 | 0.5560 | 0.8623   |
| 0.0061        | 86.0  | 2150 | 0.9200          | 0.5041    | 0.6453 | 0.5660 | 0.8634   |
| 0.0061        | 87.0  | 2175 | 0.9273          | 0.4992    | 0.6421 | 0.5617 | 0.8631   |
| 0.0061        | 88.0  | 2200 | 0.9325          | 0.5021    | 0.6442 | 0.5643 | 0.8623   |
| 0.0061        | 89.0  | 2225 | 0.9245          | 0.4844    | 0.6389 | 0.5511 | 0.8630   |
| 0.0061        | 90.0  | 2250 | 0.9291          | 0.4979    | 0.6368 | 0.5589 | 0.8593   |
| 0.0061        | 91.0  | 2275 | 0.9264          | 0.5083    | 0.6432 | 0.5678 | 0.8622   |
| 0.0061        | 92.0  | 2300 | 0.9283          | 0.5025    | 0.6411 | 0.5634 | 0.8619   |
| 0.0061        | 93.0  | 2325 | 0.9264          | 0.5008    | 0.6442 | 0.5635 | 0.8613   |
| 0.0061        | 94.0  | 2350 | 0.9205          | 0.5079    | 0.6463 | 0.5688 | 0.8626   |
| 0.0061        | 95.0  | 2375 | 0.9223          | 0.5121    | 0.6484 | 0.5722 | 0.8625   |
| 0.0061        | 96.0  | 2400 | 0.9244          | 0.5045    | 0.6421 | 0.5651 | 0.8620   |
| 0.0061        | 97.0  | 2425 | 0.9248          | 0.5062    | 0.6463 | 0.5677 | 0.8622   |
| 0.0061        | 98.0  | 2450 | 0.9277          | 0.5037    | 0.6453 | 0.5658 | 0.8621   |
| 0.0061        | 99.0  | 2475 | 0.9272          | 0.5083    | 0.6463 | 0.5690 | 0.8623   |
| 0.0046        | 100.0 | 2500 | 0.9272          | 0.5095    | 0.6463 | 0.5698 | 0.8623   |
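The Precision/Recall/F1/Accuracy columns above match seqeval's overall scores for token classification. Below is a minimal sketch of the `compute_metrics` function referenced in the training sketch, following the standard token-classification recipe; `label_list` (the id-to-tag mapping) is a placeholder defined by the undocumented dataset:

```python
import evaluate
import numpy as np

seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred):
    """Convert logits and labels to BIO tag strings and score with seqeval.

    Positions labeled -100 (special and sub-word tokens) are skipped,
    as in the standard token-classification recipe.
    """
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    true_predictions = [
        [label_list[p] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```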

### Framework versions