# Bio_ClinicalBERT_2e5_top10_20testset
This model is a fine-tuned version of [emilyalsentzer/Bio_ClinicalBERT](https://huggingface.co/emilyalsentzer/Bio_ClinicalBERT) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.8119
- Precision Macro: 0.1337
- Recall Macro: 0.1573
- F1 Macro: 0.1429
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.3
- num_epochs: 20
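With `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.3`, the learning rate ramps from 0 to the peak of 2e-05 over the first 30% of training steps, then decays linearly back to 0. A minimal sketch in plain Python of that schedule (the total of 9,640 steps is inferred from 482 steps per epoch × 20 epochs in the results table; the function name is illustrative, not part of the Transformers API):

```python
def linear_schedule_lr(step, total_steps, peak_lr=2e-05, warmup_ratio=0.3):
    """Learning rate at a given optimizer step under linear warmup + linear decay."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Warmup phase: ramp linearly from 0 up to peak_lr.
        return peak_lr * step / warmup_steps
    # Decay phase: ramp linearly from peak_lr down to 0 at the final step.
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

# Peak is reached at step 2892 (= 0.3 * 9640), i.e. roughly epoch 6 of 20.
print(linear_schedule_lr(2892, 9640))
```

Note that with a 0.3 warmup ratio, roughly the first six epochs run at a below-peak learning rate, which may explain the slow early movement in the macro metrics below.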
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision Macro | Recall Macro | F1 Macro |
|:-------------:|:-----:|:----:|:---------------:|:---------------:|:------------:|:--------:|
| No log        | 1.0   | 482  | 0.7002          | 0.4518          | 0.0762       | 0.0609   |
| 1.7287        | 2.0   | 964  | 0.5923          | 0.3659          | 0.0905       | 0.0758   |
| 0.675         | 3.0   | 1446 | 0.5810          | 0.4061          | 0.1008       | 0.0997   |
| 0.5937        | 4.0   | 1928 | 0.5608          | 0.2037          | 0.1308       | 0.1079   |
| 0.5752        | 5.0   | 2410 | 0.5705          | 0.2182          | 0.1056       | 0.1033   |
| 0.5634        | 6.0   | 2892 | 0.5577          | 0.3136          | 0.1226       | 0.1075   |
| 0.5246        | 7.0   | 3374 | 0.5808          | 0.2421          | 0.1419       | 0.1270   |
| 0.5009        | 8.0   | 3856 | 0.5916          | 0.1602          | 0.1470       | 0.1207   |
| 0.4456        | 9.0   | 4338 | 0.6056          | 0.1354          | 0.1349       | 0.1257   |
| 0.4267        | 10.0  | 4820 | 0.6200          | 0.1474          | 0.1354       | 0.1294   |
| 0.3787        | 11.0  | 5302 | 0.6502          | 0.1334          | 0.1458       | 0.1349   |
| 0.3429        | 12.0  | 5784 | 0.6702          | 0.1445          | 0.1609       | 0.1470   |
| 0.3301        | 13.0  | 6266 | 0.7075          | 0.1420          | 0.1596       | 0.1446   |
| 0.295         | 14.0  | 6748 | 0.7270          | 0.1522          | 0.1584       | 0.1490   |
| 0.2693        | 15.0  | 7230 | 0.7330          | 0.1335          | 0.1525       | 0.1401   |
| 0.25          | 16.0  | 7712 | 0.7656          | 0.1387          | 0.1524       | 0.1417   |
| 0.2362        | 17.0  | 8194 | 0.7828          | 0.1411          | 0.1573       | 0.1468   |
| 0.218         | 18.0  | 8676 | 0.7967          | 0.1292          | 0.1500       | 0.1365   |
| 0.2124        | 19.0  | 9158 | 0.8119          | 0.1337          | 0.1573       | 0.1429   |
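The macro-averaged metrics above are unweighted means of the per-class precision, recall, and F1 scores, so every class counts equally regardless of support. A minimal sketch in plain Python (the function name and toy labels are illustrative; in practice these values would come from `sklearn.metrics` or the `evaluate` library):

```python
def macro_prf(y_true, y_pred, labels):
    """Macro-averaged precision, recall, and F1 over the given label set."""
    precisions, recalls, f1s = [], [], []
    for c in labels:
        # Per-class counts: true positives, false positives, false negatives.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    # Macro average: unweighted mean across classes.
    n = len(labels)
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n

p, r, f = macro_prf([0, 0, 1, 1], [0, 1, 1, 1], labels=[0, 1])
```

Because rare classes weigh as much as common ones, a low macro F1 like the 0.1429 here can coexist with a much higher accuracy when the label distribution is skewed.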
### Framework versions
- Transformers 4.34.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.6.dev0
- Tokenizers 0.14.0