---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# clinico-bsc-bio-ehr-es

This model is a fine-tuned version of [PlanTL-GOB-ES/bsc-bio-ehr-es](https://huggingface.co/PlanTL-GOB-ES/bsc-bio-ehr-es) on an unspecified dataset (the dataset name was not recorded by the Trainer). Per-epoch results on the evaluation set are listed in the training results table below; the final evaluation (epoch 100) reached a validation loss of 0.9988 with precision 0.4916, recall 0.6526, F1 0.5608, and accuracy 0.8586.
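
The per-epoch precision, recall, F1, and accuracy columns reported below suggest a token-classification (NER) head, although the card does not state the task. A minimal inference sketch under that assumption follows; the hub id `your-username/clinico-bsc-bio-ehr-es` is a placeholder, not the actual repository path.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Placeholder repository id; replace with the actual hub path of this checkpoint.
model_id = "your-username/clinico-bsc-bio-ehr-es"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group sub-word predictions into whole entity spans.
ner = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")

print(ner("Paciente con antecedentes de diabetes mellitus tipo 2."))
```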

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter list was not propagated into this card; only the training schedule can be recovered from the results table below.
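
Concretely, the table shows 100 epochs at 25 optimization steps each (2,500 steps total), one evaluation per epoch, and training-loss logging every 500 steps (hence the "No log" entries before step 500). A hypothetical sketch of how such a run would be configured with the Hugging Face `Trainer` is shown below; the learning rate, batch size, and weight decay are placeholders, not values taken from this card.

```python
from transformers import TrainingArguments, Trainer

# Hypothetical configuration sketch: only num_train_epochs, logging_steps, and
# the per-epoch evaluation schedule are inferred from the results table below;
# every other value is a placeholder, not taken from this card.
training_args = TrainingArguments(
    output_dir="clinico-bsc-bio-ehr-es",
    num_train_epochs=100,              # 100 epochs, 2,500 steps total
    evaluation_strategy="epoch",       # one evaluation row per epoch
    logging_steps=500,                 # training loss logged every 500 steps
    learning_rate=2e-5,                # placeholder
    per_device_train_batch_size=16,    # placeholder
    weight_decay=0.01,                 # placeholder
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```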

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 25   | 1.2185          | 0.0189    | 0.0359 | 0.0247 | 0.6197   |
| No log        | 2.0   | 50   | 0.7442          | 0.1562    | 0.1975 | 0.1744 | 0.7996   |
| No log        | 3.0   | 75   | 0.6502          | 0.2108    | 0.2640 | 0.2344 | 0.8180   |
| No log        | 4.0   | 100  | 0.6404          | 0.3453    | 0.4572 | 0.3935 | 0.8258   |
| No log        | 5.0   | 125  | 0.6131          | 0.3639    | 0.4657 | 0.4085 | 0.8303   |
| No log        | 6.0   | 150  | 0.6123          | 0.3356    | 0.4256 | 0.3752 | 0.8341   |
| No log        | 7.0   | 175  | 0.6093          | 0.3411    | 0.4498 | 0.3880 | 0.8370   |
| No log        | 8.0   | 200  | 0.6198          | 0.3840    | 0.4931 | 0.4318 | 0.8379   |
| No log        | 9.0   | 225  | 0.6490          | 0.3878    | 0.5037 | 0.4382 | 0.8378   |
| No log        | 10.0  | 250  | 0.6653          | 0.3810    | 0.5005 | 0.4327 | 0.8371   |
| No log        | 11.0  | 275  | 0.6456          | 0.3223    | 0.4847 | 0.3872 | 0.8387   |
| No log        | 12.0  | 300  | 0.6475          | 0.3377    | 0.4847 | 0.3981 | 0.8474   |
| No log        | 13.0  | 325  | 0.6620          | 0.4004    | 0.5734 | 0.4716 | 0.8506   |
| No log        | 14.0  | 350  | 0.6798          | 0.3914    | 0.5649 | 0.4624 | 0.8533   |
| No log        | 15.0  | 375  | 0.6880          | 0.3969    | 0.5671 | 0.4670 | 0.8520   |
| No log        | 16.0  | 400  | 0.7012          | 0.4192    | 0.5913 | 0.4906 | 0.8551   |
| No log        | 17.0  | 425  | 0.7224          | 0.4143    | 0.5924 | 0.4876 | 0.8517   |
| No log        | 18.0  | 450  | 0.7510          | 0.4302    | 0.6051 | 0.5029 | 0.8553   |
| No log        | 19.0  | 475  | 0.7388          | 0.4271    | 0.6030 | 0.5    | 0.8532   |
| 0.3652        | 20.0  | 500  | 0.7524          | 0.4374    | 0.6125 | 0.5103 | 0.8569   |
| 0.3652        | 21.0  | 525  | 0.7408          | 0.4427    | 0.6082 | 0.5125 | 0.8580   |
| 0.3652        | 22.0  | 550  | 0.7430          | 0.4448    | 0.6125 | 0.5153 | 0.8610   |
| 0.3652        | 23.0  | 575  | 0.7726          | 0.4193    | 0.6093 | 0.4968 | 0.8582   |
| 0.3652        | 24.0  | 600  | 0.7876          | 0.4316    | 0.6061 | 0.5042 | 0.8562   |
| 0.3652        | 25.0  | 625  | 0.7777          | 0.4620    | 0.6294 | 0.5329 | 0.8595   |
| 0.3652        | 26.0  | 650  | 0.8009          | 0.4521    | 0.6272 | 0.5254 | 0.8570   |
| 0.3652        | 27.0  | 675  | 0.8153          | 0.4583    | 0.6378 | 0.5333 | 0.8572   |
| 0.3652        | 28.0  | 700  | 0.8215          | 0.4611    | 0.6262 | 0.5311 | 0.8580   |
| 0.3652        | 29.0  | 725  | 0.8296          | 0.4699    | 0.6336 | 0.5396 | 0.8595   |
| 0.3652        | 30.0  | 750  | 0.8174          | 0.4597    | 0.6378 | 0.5343 | 0.8603   |
| 0.3652        | 31.0  | 775  | 0.8442          | 0.4765    | 0.6410 | 0.5466 | 0.8599   |
| 0.3652        | 32.0  | 800  | 0.8281          | 0.4646    | 0.6315 | 0.5354 | 0.8610   |
| 0.3652        | 33.0  | 825  | 0.8322          | 0.4583    | 0.6389 | 0.5337 | 0.8591   |
| 0.3652        | 34.0  | 850  | 0.8153          | 0.4559    | 0.6272 | 0.528  | 0.8623   |
| 0.3652        | 35.0  | 875  | 0.8529          | 0.4861    | 0.6294 | 0.5486 | 0.8589   |
| 0.3652        | 36.0  | 900  | 0.8826          | 0.4699    | 0.6272 | 0.5373 | 0.8559   |
| 0.3652        | 37.0  | 925  | 0.8856          | 0.4654    | 0.6325 | 0.5363 | 0.8571   |
| 0.3652        | 38.0  | 950  | 0.8983          | 0.4819    | 0.6315 | 0.5466 | 0.8560   |
| 0.3652        | 39.0  | 975  | 0.8723          | 0.4641    | 0.6272 | 0.5335 | 0.8556   |
| 0.0269        | 40.0  | 1000 | 0.8788          | 0.4662    | 0.6399 | 0.5394 | 0.8550   |
| 0.0269        | 41.0  | 1025 | 0.8952          | 0.4805    | 0.6378 | 0.5481 | 0.8611   |
| 0.0269        | 42.0  | 1050 | 0.8901          | 0.4657    | 0.6304 | 0.5357 | 0.8574   |
| 0.0269        | 43.0  | 1075 | 0.9015          | 0.4746    | 0.6410 | 0.5454 | 0.8574   |
| 0.0269        | 44.0  | 1100 | 0.8838          | 0.4655    | 0.6420 | 0.5397 | 0.8591   |
| 0.0269        | 45.0  | 1125 | 0.9093          | 0.4718    | 0.6441 | 0.5446 | 0.8598   |
| 0.0269        | 46.0  | 1150 | 0.9154          | 0.4826    | 0.6441 | 0.5518 | 0.8553   |
| 0.0269        | 47.0  | 1175 | 0.9214          | 0.4614    | 0.6315 | 0.5332 | 0.8538   |
| 0.0269        | 48.0  | 1200 | 0.9313          | 0.4639    | 0.6315 | 0.5349 | 0.8546   |
| 0.0269        | 49.0  | 1225 | 0.9137          | 0.4807    | 0.6431 | 0.5501 | 0.8582   |
| 0.0269        | 50.0  | 1250 | 0.9235          | 0.4939    | 0.6463 | 0.5599 | 0.8571   |
| 0.0269        | 51.0  | 1275 | 0.9263          | 0.4900    | 0.6441 | 0.5566 | 0.8580   |
| 0.0269        | 52.0  | 1300 | 0.9190          | 0.4787    | 0.6420 | 0.5485 | 0.8613   |
| 0.0269        | 53.0  | 1325 | 0.9159          | 0.4700    | 0.6441 | 0.5434 | 0.8616   |
| 0.0269        | 54.0  | 1350 | 0.9302          | 0.4806    | 0.6399 | 0.5489 | 0.8614   |
| 0.0269        | 55.0  | 1375 | 0.9391          | 0.4877    | 0.6515 | 0.5579 | 0.8581   |
| 0.0269        | 56.0  | 1400 | 0.9392          | 0.4959    | 0.6452 | 0.5608 | 0.8580   |
| 0.0269        | 57.0  | 1425 | 0.9444          | 0.4798    | 0.6410 | 0.5488 | 0.8570   |
| 0.0269        | 58.0  | 1450 | 0.9394          | 0.4777    | 0.6441 | 0.5486 | 0.8596   |
| 0.0269        | 59.0  | 1475 | 0.9562          | 0.4833    | 0.6420 | 0.5515 | 0.8586   |
| 0.0098        | 60.0  | 1500 | 0.9485          | 0.4801    | 0.6484 | 0.5517 | 0.8582   |
| 0.0098        | 61.0  | 1525 | 0.9521          | 0.4679    | 0.6463 | 0.5428 | 0.8582   |
| 0.0098        | 62.0  | 1550 | 0.9603          | 0.4759    | 0.6463 | 0.5481 | 0.8563   |
| 0.0098        | 63.0  | 1575 | 0.9663          | 0.4831    | 0.6473 | 0.5532 | 0.8561   |
| 0.0098        | 64.0  | 1600 | 0.9641          | 0.4780    | 0.6526 | 0.5518 | 0.8580   |
| 0.0098        | 65.0  | 1625 | 0.9607          | 0.4767    | 0.6494 | 0.5498 | 0.8606   |
| 0.0098        | 66.0  | 1650 | 0.9782          | 0.4849    | 0.6463 | 0.5541 | 0.8563   |
| 0.0098        | 67.0  | 1675 | 0.9806          | 0.4916    | 0.6484 | 0.5592 | 0.8562   |
| 0.0098        | 68.0  | 1700 | 0.9728          | 0.4889    | 0.6494 | 0.5578 | 0.8578   |
| 0.0098        | 69.0  | 1725 | 0.9766          | 0.4885    | 0.6494 | 0.5576 | 0.8584   |
| 0.0098        | 70.0  | 1750 | 0.9738          | 0.4862    | 0.6526 | 0.5573 | 0.8575   |
| 0.0098        | 71.0  | 1775 | 0.9788          | 0.4916    | 0.6505 | 0.56   | 0.8571   |
| 0.0098        | 72.0  | 1800 | 0.9845          | 0.4845    | 0.6452 | 0.5534 | 0.8563   |
| 0.0098        | 73.0  | 1825 | 0.9729          | 0.4876    | 0.6463 | 0.5559 | 0.8573   |
| 0.0098        | 74.0  | 1850 | 0.9854          | 0.4846    | 0.6494 | 0.5551 | 0.8569   |
| 0.0098        | 75.0  | 1875 | 0.9903          | 0.4885    | 0.6505 | 0.5580 | 0.8562   |
| 0.0098        | 76.0  | 1900 | 0.9825          | 0.4886    | 0.6558 | 0.5600 | 0.8568   |
| 0.0098        | 77.0  | 1925 | 0.9994          | 0.4876    | 0.6463 | 0.5559 | 0.8554   |
| 0.0098        | 78.0  | 1950 | 0.9922          | 0.4905    | 0.6515 | 0.5596 | 0.8546   |
| 0.0098        | 79.0  | 1975 | 1.0084          | 0.4928    | 0.6484 | 0.5600 | 0.8578   |
| 0.0057        | 80.0  | 2000 | 0.9931          | 0.4976    | 0.6526 | 0.5646 | 0.8580   |
| 0.0057        | 81.0  | 2025 | 0.9864          | 0.4826    | 0.6452 | 0.5522 | 0.8595   |
| 0.0057        | 82.0  | 2050 | 0.9929          | 0.4900    | 0.6484 | 0.5582 | 0.8595   |
| 0.0057        | 83.0  | 2075 | 0.9902          | 0.4916    | 0.6473 | 0.5588 | 0.8588   |
| 0.0057        | 84.0  | 2100 | 1.0021          | 0.4872    | 0.6431 | 0.5544 | 0.8573   |
| 0.0057        | 85.0  | 2125 | 1.0013          | 0.4964    | 0.6473 | 0.5619 | 0.8582   |
| 0.0057        | 86.0  | 2150 | 0.9814          | 0.4865    | 0.6484 | 0.5559 | 0.8625   |
| 0.0057        | 87.0  | 2175 | 0.9841          | 0.4932    | 0.6558 | 0.5630 | 0.8622   |
| 0.0057        | 88.0  | 2200 | 0.9888          | 0.4866    | 0.6515 | 0.5571 | 0.8610   |
| 0.0057        | 89.0  | 2225 | 0.9898          | 0.4924    | 0.6515 | 0.5609 | 0.8610   |
| 0.0057        | 90.0  | 2250 | 0.9860          | 0.4870    | 0.6526 | 0.5578 | 0.8607   |
| 0.0057        | 91.0  | 2275 | 0.9925          | 0.4912    | 0.6484 | 0.5589 | 0.8589   |
| 0.0057        | 92.0  | 2300 | 0.9904          | 0.4956    | 0.6536 | 0.5638 | 0.8599   |
| 0.0057        | 93.0  | 2325 | 0.9902          | 0.4980    | 0.6526 | 0.5649 | 0.8602   |
| 0.0057        | 94.0  | 2350 | 0.9925          | 0.5041    | 0.6547 | 0.5696 | 0.8602   |
| 0.0057        | 95.0  | 2375 | 0.9959          | 0.4897    | 0.6515 | 0.5591 | 0.8589   |
| 0.0057        | 96.0  | 2400 | 0.9951          | 0.4901    | 0.6505 | 0.5590 | 0.8591   |
| 0.0057        | 97.0  | 2425 | 0.9962          | 0.4924    | 0.6505 | 0.5605 | 0.8588   |
| 0.0057        | 98.0  | 2450 | 0.9972          | 0.5008    | 0.6505 | 0.5659 | 0.8585   |
| 0.0057        | 99.0  | 2475 | 0.9988          | 0.4920    | 0.6526 | 0.5611 | 0.8588   |
| 0.0045        | 100.0 | 2500 | 0.9988          | 0.4916    | 0.6526 | 0.5608 | 0.8586   |
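
Validation loss reaches its minimum around epoch 7 (0.6093) while entity-level F1 keeps improving until roughly epoch 94 (0.5696), so checkpoint selection depends on which metric is prioritized. The precision, recall, and F1 columns look like entity-level scores of the kind produced by the `seqeval` metric commonly used in Hugging Face token-classification fine-tuning, with accuracy computed per token; the card does not confirm this, so it is an assumption. A minimal sketch of how such numbers are computed, with hypothetical IOB2 labels:

```python
import evaluate

# Minimal sketch with hypothetical IOB2 labels; the actual label set and
# metric implementation used for this model are not documented in the card.
seqeval = evaluate.load("seqeval")

predictions = [["O", "B-ENFERMEDAD", "I-ENFERMEDAD", "O", "B-FARMACO"]]
references  = [["O", "B-ENFERMEDAD", "O",            "O", "B-FARMACO"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```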

### Framework versions