# bert_base_ner_model_mimic_balnced
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.9734
- Precision Macro: 0.1141
- Recall Macro: 0.1454
- F1 Macro: 0.1250
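
The card does not state how the macro-averaged metrics were computed. Below is a minimal token-level sketch, assuming scikit-learn and the usual convention of masking special tokens and padded positions with `-100`; it is an illustration, not the training script's actual code.

```python
import numpy as np
from sklearn.metrics import precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Macro-averaged token-level metrics, ignoring positions labeled -100."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Flatten and drop masked positions (special tokens / sub-word continuations).
    true_labels = [l for row in labels for l in row if l != -100]
    true_preds = [p for row, prow in zip(labels, predictions)
                  for l, p in zip(row, prow) if l != -100]

    precision, recall, f1, _ = precision_recall_fscore_support(
        true_labels, true_preds, average="macro", zero_division=0
    )
    return {"precision_macro": precision, "recall_macro": recall, "f1_macro": f1}
```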
## Model description
More information needed
## Intended uses & limitations
More information needed
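
For illustration, a hedged inference sketch using the `transformers` token-classification pipeline. The model id below is a placeholder for wherever this checkpoint is hosted, and the clinical example sentence is an assumption based on the MIMIC reference in the model name.

```python
from transformers import pipeline

# Placeholder model id; replace with the actual hub repo or local path.
ner = pipeline(
    "token-classification",
    model="bert_base_ner_model_mimic_balnced",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)

# Hypothetical clinical input (the model name references MIMIC).
print(ner("Patient was started on metformin 500 mg twice daily."))
```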
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.3
- num_epochs: 20
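
A minimal sketch of these settings as `TrainingArguments`; `output_dir` and the per-epoch evaluation strategy are assumptions, and the listed Adam betas/epsilon match the Trainer's default optimizer, so they need no explicit argument here.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert_base_ner_model_mimic_balnced",  # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.3,
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumption: the results table logs once per epoch
)
```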
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision Macro | Recall Macro | F1 Macro |
|:-------------:|:-----:|:----:|:---------------:|:---------------:|:------------:|:--------:|
| No log        | 1.0   | 277  | 0.9175          | 0.4444          | 0.0          | 0.0      |
| 1.4043        | 2.0   | 554  | 0.7546          | 0.4891          | 0.0509       | 0.0466   |
| 1.4043        | 3.0   | 831  | 0.7103          | 0.2896          | 0.0965       | 0.0763   |
| 0.7411        | 4.0   | 1108 | 0.6992          | 0.3030          | 0.0822       | 0.0769   |
| 0.7411        | 5.0   | 1385 | 0.6921          | 0.1873          | 0.1126       | 0.0856   |
| 0.6928        | 6.0   | 1662 | 0.6909          | 0.3139          | 0.1080       | 0.0942   |
| 0.6928        | 7.0   | 1939 | 0.7000          | 0.2229          | 0.1069       | 0.1006   |
| 0.632         | 8.0   | 2216 | 0.7146          | 0.2037          | 0.1228       | 0.0975   |
| 0.632         | 9.0   | 2493 | 0.7579          | 0.2091          | 0.1353       | 0.1083   |
| 0.5608        | 10.0  | 2770 | 0.7847          | 0.2152          | 0.1393       | 0.1085   |
| 0.489         | 11.0  | 3047 | 0.7924          | 0.2093          | 0.1277       | 0.1098   |
| 0.489         | 12.0  | 3324 | 0.8229          | 0.1091          | 0.1383       | 0.1189   |
| 0.4278        | 13.0  | 3601 | 0.8665          | 0.1240          | 0.1427       | 0.1201   |
| 0.4278        | 14.0  | 3878 | 0.9144          | 0.1160          | 0.1421       | 0.1226   |
| 0.3716        | 15.0  | 4155 | 0.9400          | 0.1206          | 0.1515       | 0.1286   |
| 0.3716        | 16.0  | 4432 | 0.9327          | 0.1140          | 0.1506       | 0.1245   |
| 0.3297        | 17.0  | 4709 | 0.9688          | 0.1152          | 0.1462       | 0.1266   |
| 0.3297        | 18.0  | 4986 | 0.9734          | 0.1141          | 0.1454       | 0.1250   |
### Framework versions
- Transformers 4.34.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.6.dev0
- Tokenizers 0.14.0