---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# bsc-bio-ehr-es-finetuned-clinais-augmented1

This model is a fine-tuned version of [joheras/bsc-bio-ehr-es-finetuned-clinais](https://huggingface.co/joheras/bsc-bio-ehr-es-finetuned-clinais) on an unspecified dataset. Its results on the evaluation set are reported in the training results table below.

## Model description

More information needed

## Intended uses & limitations

More information needed
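
Until this section is completed, here is a minimal usage sketch. It assumes the checkpoint is published under a repo id matching the model name and that it carries a token-classification head (suggested by the precision/recall/F1/accuracy metrics reported below); neither assumption is confirmed by this card.

```python
# Minimal usage sketch (not an official example from this card).
from transformers import pipeline

# Assumption: the fine-tuned checkpoint is published under a repo id matching the model name.
model_id = "joheras/bsc-bio-ehr-es-finetuned-clinais-augmented1"

# Assumption: token-classification head, inferred from the seqeval-style metrics below.
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

print(ner("El paciente presenta fiebre y tos desde hace tres días."))
```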

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training were not recorded in this card.
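
Purely as an illustration of how such settings are passed to the `Trainer`, the sketch below uses placeholder values; only the epoch count (100) and the per-epoch evaluation cadence are taken from the results table, and every other value is hypothetical.

```python
# Hypothetical reconstruction for illustration only; the real hyperparameters are unknown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bsc-bio-ehr-es-finetuned-clinais-augmented1",
    num_train_epochs=100,            # matches the 100 evaluation rows in the table below
    evaluation_strategy="epoch",     # one validation row per epoch (90 steps each)
    learning_rate=2e-5,              # placeholder, not the recorded value
    per_device_train_batch_size=16,  # placeholder, not the recorded value
    weight_decay=0.01,               # placeholder, not the recorded value
)
```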

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 90   | 0.6246          | 0.2759    | 0.3485 | 0.3080 | 0.8286   |
| No log        | 2.0   | 180  | 0.5890          | 0.3511    | 0.4520 | 0.3952 | 0.8402   |
| No log        | 3.0   | 270  | 0.5973          | 0.3690    | 0.5322 | 0.4358 | 0.8539   |
| No log        | 4.0   | 360  | 0.6676          | 0.3693    | 0.5713 | 0.4486 | 0.8554   |
| No log        | 5.0   | 450  | 0.7173          | 0.4227    | 0.6061 | 0.4980 | 0.8544   |
| 0.3579        | 6.0   | 540  | 0.7854          | 0.4438    | 0.6082 | 0.5131 | 0.8552   |
| 0.3579        | 7.0   | 630  | 0.8437          | 0.4530    | 0.6103 | 0.5200 | 0.8522   |
| 0.3579        | 8.0   | 720  | 0.8716          | 0.4349    | 0.6103 | 0.5079 | 0.8513   |
| 0.3579        | 9.0   | 810  | 0.8868          | 0.4500    | 0.6030 | 0.5153 | 0.8544   |
| 0.3579        | 10.0  | 900  | 0.8917          | 0.4680    | 0.6251 | 0.5353 | 0.8574   |
| 0.3579        | 11.0  | 990  | 0.9175          | 0.4769    | 0.6336 | 0.5442 | 0.8548   |
| 0.0229        | 12.0  | 1080 | 0.9081          | 0.4767    | 0.6473 | 0.5490 | 0.8651   |
| 0.0229        | 13.0  | 1170 | 0.9692          | 0.4854    | 0.6336 | 0.5497 | 0.8532   |
| 0.0229        | 14.0  | 1260 | 0.9568          | 0.4947    | 0.6431 | 0.5592 | 0.8592   |
| 0.0229        | 15.0  | 1350 | 1.0028          | 0.4848    | 0.6241 | 0.5457 | 0.8505   |
| 0.0229        | 16.0  | 1440 | 1.0302          | 0.4821    | 0.6251 | 0.5444 | 0.8557   |
| 0.0076        | 17.0  | 1530 | 0.9892          | 0.4918    | 0.6325 | 0.5533 | 0.8584   |
| 0.0076        | 18.0  | 1620 | 1.0339          | 0.4755    | 0.6135 | 0.5357 | 0.8480   |
| 0.0076        | 19.0  | 1710 | 1.0066          | 0.4935    | 0.6399 | 0.5572 | 0.8570   |
| 0.0076        | 20.0  | 1800 | 1.0403          | 0.4959    | 0.6410 | 0.5592 | 0.8564   |
| 0.0076        | 21.0  | 1890 | 1.0374          | 0.4979    | 0.6336 | 0.5576 | 0.8561   |
| 0.0076        | 22.0  | 1980 | 1.0758          | 0.4821    | 0.6262 | 0.5448 | 0.8528   |
| 0.0044        | 23.0  | 2070 | 1.0818          | 0.4876    | 0.6230 | 0.5471 | 0.8524   |
| 0.0044        | 24.0  | 2160 | 1.0668          | 0.5096    | 0.6431 | 0.5686 | 0.8569   |
| 0.0044        | 25.0  | 2250 | 1.1033          | 0.4873    | 0.6294 | 0.5493 | 0.8541   |
| 0.0044        | 26.0  | 2340 | 1.0936          | 0.4880    | 0.6209 | 0.5465 | 0.8544   |
| 0.0044        | 27.0  | 2430 | 1.0802          | 0.4856    | 0.6399 | 0.5522 | 0.8583   |
| 0.0028        | 28.0  | 2520 | 1.1245          | 0.5034    | 0.6346 | 0.5614 | 0.8542   |
| 0.0028        | 29.0  | 2610 | 1.1293          | 0.4874    | 0.6336 | 0.5510 | 0.8521   |
| 0.0028        | 30.0  | 2700 | 1.0883          | 0.4984    | 0.6494 | 0.5640 | 0.8591   |
| 0.0028        | 31.0  | 2790 | 1.1434          | 0.5055    | 0.6315 | 0.5615 | 0.8565   |
| 0.0028        | 32.0  | 2880 | 1.1394          | 0.5041    | 0.6505 | 0.5680 | 0.8558   |
| 0.0028        | 33.0  | 2970 | 1.1473          | 0.5083    | 0.6452 | 0.5686 | 0.8550   |
| 0.0026        | 34.0  | 3060 | 1.2286          | 0.4996    | 0.6177 | 0.5524 | 0.8437   |
| 0.0026        | 35.0  | 3150 | 1.1982          | 0.4996    | 0.6251 | 0.5553 | 0.8521   |
| 0.0026        | 36.0  | 3240 | 1.1878          | 0.4987    | 0.6294 | 0.5565 | 0.8491   |
| 0.0026        | 37.0  | 3330 | 1.1633          | 0.4935    | 0.6399 | 0.5572 | 0.8511   |
| 0.0026        | 38.0  | 3420 | 1.1619          | 0.5097    | 0.6410 | 0.5678 | 0.8587   |
| 0.0021        | 39.0  | 3510 | 1.1438          | 0.5021    | 0.6420 | 0.5635 | 0.8575   |
| 0.0021        | 40.0  | 3600 | 1.1511          | 0.5087    | 0.6494 | 0.5705 | 0.8575   |
| 0.0021        | 41.0  | 3690 | 1.1631          | 0.5128    | 0.6558 | 0.5755 | 0.8576   |
| 0.0021        | 42.0  | 3780 | 1.1639          | 0.5137    | 0.6526 | 0.5749 | 0.8612   |
| 0.0021        | 43.0  | 3870 | 1.1946          | 0.5174    | 0.6452 | 0.5742 | 0.8568   |
| 0.0021        | 44.0  | 3960 | 1.1822          | 0.5132    | 0.6378 | 0.5687 | 0.8556   |
| 0.0012        | 45.0  | 4050 | 1.1533          | 0.5379    | 0.6441 | 0.5863 | 0.8617   |
| 0.0012        | 46.0  | 4140 | 1.1584          | 0.5242    | 0.6410 | 0.5767 | 0.8602   |
| 0.0012        | 47.0  | 4230 | 1.2217          | 0.5159    | 0.6357 | 0.5695 | 0.8567   |
| 0.0012        | 48.0  | 4320 | 1.2451          | 0.5265    | 0.6399 | 0.5777 | 0.8533   |
| 0.0012        | 49.0  | 4410 | 1.2191          | 0.5281    | 0.6357 | 0.5769 | 0.8563   |
| 0.0009        | 50.0  | 4500 | 1.2092          | 0.5320    | 0.6505 | 0.5853 | 0.8548   |
| 0.0009        | 51.0  | 4590 | 1.2168          | 0.5310    | 0.6431 | 0.5817 | 0.8607   |
| 0.0009        | 52.0  | 4680 | 1.2273          | 0.5068    | 0.6251 | 0.5598 | 0.8530   |
| 0.0009        | 53.0  | 4770 | 1.1903          | 0.5254    | 0.6441 | 0.5787 | 0.8618   |
| 0.0009        | 54.0  | 4860 | 1.1939          | 0.5354    | 0.6473 | 0.5860 | 0.8635   |
| 0.0009        | 55.0  | 4950 | 1.2311          | 0.5025    | 0.6357 | 0.5613 | 0.8581   |
| 0.001         | 56.0  | 5040 | 1.2224          | 0.5097    | 0.6389 | 0.5670 | 0.8606   |
| 0.001         | 57.0  | 5130 | 1.2298          | 0.5017    | 0.6410 | 0.5628 | 0.8586   |
| 0.001         | 58.0  | 5220 | 1.2278          | 0.5114    | 0.6389 | 0.5681 | 0.8584   |
| 0.001         | 59.0  | 5310 | 1.2703          | 0.5146    | 0.6505 | 0.5746 | 0.8586   |
| 0.001         | 60.0  | 5400 | 1.2709          | 0.5445    | 0.6336 | 0.5857 | 0.8549   |
| 0.001         | 61.0  | 5490 | 1.2691          | 0.5094    | 0.6283 | 0.5626 | 0.8554   |
| 0.0006        | 62.0  | 5580 | 1.2777          | 0.5076    | 0.6315 | 0.5628 | 0.8523   |
| 0.0006        | 63.0  | 5670 | 1.2472          | 0.5271    | 0.6357 | 0.5764 | 0.8563   |
| 0.0006        | 64.0  | 5760 | 1.2709          | 0.5220    | 0.6515 | 0.5796 | 0.8572   |
| 0.0006        | 65.0  | 5850 | 1.2792          | 0.5306    | 0.6410 | 0.5806 | 0.8613   |
| 0.0006        | 66.0  | 5940 | 1.2403          | 0.5058    | 0.6399 | 0.5650 | 0.8583   |
| 0.0005        | 67.0  | 6030 | 1.2778          | 0.5219    | 0.6410 | 0.5754 | 0.8564   |
| 0.0005        | 68.0  | 6120 | 1.3046          | 0.5431    | 0.6515 | 0.5924 | 0.8595   |
| 0.0005        | 69.0  | 6210 | 1.3002          | 0.5236    | 0.6452 | 0.5781 | 0.8547   |
| 0.0005        | 70.0  | 6300 | 1.3068          | 0.5179    | 0.6410 | 0.5729 | 0.8575   |
| 0.0005        | 71.0  | 6390 | 1.3123          | 0.5259    | 0.6431 | 0.5786 | 0.8572   |
| 0.0005        | 72.0  | 6480 | 1.3205          | 0.5395    | 0.6484 | 0.5890 | 0.8576   |
| 0.0004        | 73.0  | 6570 | 1.3281          | 0.5420    | 0.6473 | 0.5900 | 0.8578   |
| 0.0004        | 74.0  | 6660 | 1.3326          | 0.5381    | 0.6484 | 0.5881 | 0.8575   |
| 0.0004        | 75.0  | 6750 | 1.3532          | 0.5393    | 0.6452 | 0.5875 | 0.8553   |
| 0.0004        | 76.0  | 6840 | 1.3562          | 0.5215    | 0.6283 | 0.5699 | 0.8537   |
| 0.0004        | 77.0  | 6930 | 1.3385          | 0.5144    | 0.6420 | 0.5712 | 0.8569   |
| 0.0003        | 78.0  | 7020 | 1.3435          | 0.5303    | 0.6463 | 0.5826 | 0.8570   |
| 0.0003        | 79.0  | 7110 | 1.3402          | 0.5366    | 0.6505 | 0.5881 | 0.8568   |
| 0.0003        | 80.0  | 7200 | 1.3415          | 0.5469    | 0.6526 | 0.5951 | 0.8569   |
| 0.0003        | 81.0  | 7290 | 1.3335          | 0.5181    | 0.6505 | 0.5768 | 0.8578   |
| 0.0003        | 82.0  | 7380 | 1.3433          | 0.5258    | 0.6452 | 0.5794 | 0.8569   |
| 0.0003        | 83.0  | 7470 | 1.3351          | 0.5247    | 0.6515 | 0.5813 | 0.8566   |
| 0.0002        | 84.0  | 7560 | 1.3912          | 0.5187    | 0.6431 | 0.5743 | 0.8515   |
| 0.0002        | 85.0  | 7650 | 1.3507          | 0.5147    | 0.6463 | 0.5730 | 0.8566   |
| 0.0002        | 86.0  | 7740 | 1.3594          | 0.5221    | 0.6494 | 0.5788 | 0.8556   |
| 0.0002        | 87.0  | 7830 | 1.3647          | 0.5262    | 0.6463 | 0.5801 | 0.8547   |
| 0.0002        | 88.0  | 7920 | 1.3629          | 0.5263    | 0.6441 | 0.5793 | 0.8550   |
| 0.0002        | 89.0  | 8010 | 1.3769          | 0.5277    | 0.6441 | 0.5801 | 0.8535   |
| 0.0002        | 90.0  | 8100 | 1.3733          | 0.5268    | 0.6431 | 0.5792 | 0.8556   |
| 0.0002        | 91.0  | 8190 | 1.3648          | 0.5240    | 0.6452 | 0.5783 | 0.8562   |
| 0.0002        | 92.0  | 8280 | 1.3666          | 0.5228    | 0.6410 | 0.5759 | 0.8561   |
| 0.0002        | 93.0  | 8370 | 1.3577          | 0.5231    | 0.6452 | 0.5778 | 0.8580   |
| 0.0002        | 94.0  | 8460 | 1.3514          | 0.5340    | 0.6547 | 0.5882 | 0.8580   |
| 0.0002        | 95.0  | 8550 | 1.3564          | 0.5328    | 0.6526 | 0.5866 | 0.8582   |
| 0.0002        | 96.0  | 8640 | 1.3563          | 0.5342    | 0.6515 | 0.5871 | 0.8584   |
| 0.0002        | 97.0  | 8730 | 1.3567          | 0.5347    | 0.6505 | 0.5869 | 0.8584   |
| 0.0002        | 98.0  | 8820 | 1.3576          | 0.5347    | 0.6505 | 0.5869 | 0.8583   |
| 0.0002        | 99.0  | 8910 | 1.3583          | 0.5339    | 0.6494 | 0.5860 | 0.8582   |
| 0.0001        | 100.0 | 9000 | 1.3581          | 0.5320    | 0.6494 | 0.5849 | 0.8583   |

### Framework versions