# biobert-finetuned-pico
This model is a fine-tuned version of [dmis-lab/biobert-base-cased-v1.2](https://huggingface.co/dmis-lab/biobert-base-cased-v1.2) on the pico-breast-cancer dataset. It achieves the following results on the evaluation set:
- Loss: 0.4646
- Precision: 0.6466
- Recall: 0.7182
- F1: 0.6805
- Accuracy: 0.9329
- Total-participants: 0.8914
- Intervention-participants: 0.8140
- Control-participants: 0.8424
- Age: 0.5357
- Eligibility: 0.4859
- Ethnicity: 0.5116
- Condition: 0.6049
- Location: 0.4932
- Intervention: 0.6078
- Control: 0.5150
- Outcome: 0.6102
- Outcome-measure: 0.6638
- Iv-bin-abs: 0.7895
- Cv-bin-abs: 0.8343
- Iv-bin-percent: 0.7873
- Cv-bin-percent: 0.7810
- Iv-cont-mean: 0.7125
- Cv-cont-mean: 0.7383
- Iv-cont-median: 0.7126
- Cv-cont-median: 0.7838
- Iv-cont-sd: 0.7333
- Cv-cont-sd: 0.8197
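As a quick sanity check (not part of the original card), the overall F1 reported above is the harmonic mean of the precision and recall:

```python
# Overall evaluation metrics reported above.
precision = 0.6466
recall = 0.7182

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

print(round(f1, 4))  # 0.6805, matching the reported F1
```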
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 20
- eval_batch_size: 32
- seed: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
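These hyperparameters are consistent with the step counts in the results table below; a back-of-the-envelope check (the training-set size is an inference from the card, not stated in it):

```python
total_steps = 7185      # final step count in the results table
num_epochs = 15
train_batch_size = 20

steps_per_epoch = total_steps // num_epochs
print(steps_per_epoch)  # 479, matching the per-epoch step increments in the table

# 479 optimizer steps per epoch at batch size 20 implies a training split of
# roughly 479 * 20 = 9,580 examples (an upper bound; the last batch may be partial).
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 9580
```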
### Training results
Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Total-participants | Intervention-participants | Control-participants | Age | Eligibility | Ethnicity | Condition | Location | Intervention | Control | Outcome | Outcome-measure | Iv-bin-abs | Cv-bin-abs | Iv-bin-percent | Cv-bin-percent | Iv-cont-mean | Cv-cont-mean | Iv-cont-median | Cv-cont-median | Iv-cont-sd | Cv-cont-sd |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 479 | 0.2514 | 0.4795 | 0.6284 | 0.5439 | 0.9130 | 0.8151 | 0.5783 | 0.6369 | 0.1926 | 0.3725 | 0.0 | 0.5443 | 0.4444 | 0.5976 | 0.4429 | 0.5423 | 0.6059 | 0.6033 | 0.5987 | 0.6287 | 0.5293 | 0.4231 | 0.3378 | 0.5476 | 0.6087 | 0.2857 | 0.0755 |
0.4045 | 2.0 | 958 | 0.2136 | 0.5636 | 0.6598 | 0.6079 | 0.9281 | 0.8406 | 0.6455 | 0.7101 | 0.4727 | 0.4483 | 0.3571 | 0.5359 | 0.4828 | 0.6085 | 0.5204 | 0.5662 | 0.6434 | 0.6479 | 0.6667 | 0.6784 | 0.6052 | 0.6329 | 0.6027 | 0.5400 | 0.5974 | 0.5574 | 0.6286 |
0.1933 | 3.0 | 1437 | 0.2236 | 0.5753 | 0.6632 | 0.6161 | 0.9259 | 0.8889 | 0.7453 | 0.7888 | 0.4068 | 0.4578 | 0.4286 | 0.6154 | 0.4918 | 0.5642 | 0.5035 | 0.5618 | 0.6491 | 0.6471 | 0.6569 | 0.6844 | 0.7284 | 0.5328 | 0.5906 | 0.5051 | 0.6000 | 0.6667 | 0.6129 |
0.1374 | 4.0 | 1916 | 0.2507 | 0.6199 | 0.6859 | 0.6512 | 0.9308 | 0.8899 | 0.7754 | 0.8141 | 0.4094 | 0.4444 | 0.5714 | 0.6341 | 0.5455 | 0.6177 | 0.5191 | 0.5986 | 0.6738 | 0.7319 | 0.7439 | 0.7080 | 0.7193 | 0.6620 | 0.6338 | 0.6047 | 0.6923 | 0.6000 | 0.6429 |
0.0964 | 5.0 | 2395 | 0.2615 | 0.6143 | 0.7106 | 0.6590 | 0.9283 | 0.9009 | 0.7916 | 0.8125 | 0.4950 | 0.4250 | 0.5641 | 0.6623 | 0.6301 | 0.6025 | 0.5109 | 0.5905 | 0.6740 | 0.7586 | 0.7514 | 0.7788 | 0.7798 | 0.6627 | 0.6803 | 0.6316 | 0.6207 | 0.6667 | 0.7302 |
0.0694 | 6.0 | 2874 | 0.2939 | 0.6146 | 0.7123 | 0.6598 | 0.9287 | 0.9051 | 0.8170 | 0.8519 | 0.5246 | 0.4679 | 0.4889 | 0.5714 | 0.5205 | 0.5704 | 0.5047 | 0.5824 | 0.6826 | 0.7748 | 0.7821 | 0.7599 | 0.7692 | 0.6835 | 0.7403 | 0.5859 | 0.6835 | 0.7077 | 0.7869 |
0.0503 | 7.0 | 3353 | 0.3304 | 0.6323 | 0.6854 | 0.6578 | 0.9323 | 0.8952 | 0.7839 | 0.8312 | 0.5263 | 0.3831 | 0.4681 | 0.5780 | 0.5067 | 0.6122 | 0.5442 | 0.5850 | 0.6904 | 0.7364 | 0.7515 | 0.7379 | 0.7389 | 0.6901 | 0.6986 | 0.7021 | 0.7297 | 0.7541 | 0.7667 |
0.0336 | 8.0 | 3832 | 0.3486 | 0.6251 | 0.7089 | 0.6644 | 0.9293 | 0.9002 | 0.7948 | 0.8415 | 0.5217 | 0.4167 | 0.5238 | 0.6038 | 0.5641 | 0.5816 | 0.4969 | 0.5965 | 0.6798 | 0.7598 | 0.7861 | 0.7740 | 0.7719 | 0.7250 | 0.7308 | 0.6744 | 0.6829 | 0.7333 | 0.8065 |
0.0250 | 9.0 | 4311 | 0.3752 | 0.6387 | 0.7227 | 0.6781 | 0.9314 | 0.8875 | 0.8164 | 0.8378 | 0.5333 | 0.4936 | 0.6190 | 0.5952 | 0.6000 | 0.6080 | 0.5504 | 0.5952 | 0.6696 | 0.7387 | 0.7650 | 0.7840 | 0.7984 | 0.6829 | 0.7550 | 0.7126 | 0.7632 | 0.7812 | 0.7813 |
0.0176 | 10.0 | 4790 | 0.3921 | 0.6376 | 0.7126 | 0.6730 | 0.9317 | 0.8865 | 0.8229 | 0.8562 | 0.5437 | 0.4653 | 0.5641 | 0.6024 | 0.4225 | 0.6157 | 0.5011 | 0.5990 | 0.6681 | 0.7534 | 0.8156 | 0.7746 | 0.7800 | 0.6748 | 0.7195 | 0.7059 | 0.7397 | 0.7719 | 0.8358 |
0.0125 | 11.0 | 5269 | 0.4271 | 0.6283 | 0.7154 | 0.6690 | 0.9299 | 0.8962 | 0.8097 | 0.8318 | 0.5254 | 0.4573 | 0.5238 | 0.5963 | 0.4722 | 0.6189 | 0.5219 | 0.6046 | 0.6553 | 0.7911 | 0.8046 | 0.7603 | 0.7547 | 0.6627 | 0.6933 | 0.6809 | 0.7467 | 0.7333 | 0.8000 |
0.0090 | 12.0 | 5748 | 0.4350 | 0.6399 | 0.7142 | 0.6751 | 0.9317 | 0.8914 | 0.8076 | 0.8519 | 0.5577 | 0.4726 | 0.5217 | 0.6173 | 0.4722 | 0.5890 | 0.5088 | 0.6139 | 0.6509 | 0.7725 | 0.7976 | 0.7708 | 0.7722 | 0.7081 | 0.7297 | 0.7333 | 0.7945 | 0.7541 | 0.8136 |
0.0066 | 13.0 | 6227 | 0.4629 | 0.6537 | 0.7126 | 0.6819 | 0.9333 | 0.8923 | 0.8120 | 0.8466 | 0.5714 | 0.4718 | 0.5238 | 0.6154 | 0.5263 | 0.6062 | 0.5265 | 0.6096 | 0.6755 | 0.7841 | 0.8202 | 0.7847 | 0.7750 | 0.7170 | 0.7248 | 0.7500 | 0.7838 | 0.7213 | 0.8197 |
0.0049 | 14.0 | 6706 | 0.4581 | 0.6520 | 0.7140 | 0.6816 | 0.9333 | 0.8923 | 0.8120 | 0.8450 | 0.5439 | 0.4948 | 0.5641 | 0.6087 | 0.4507 | 0.6037 | 0.5079 | 0.6164 | 0.6581 | 0.7822 | 0.8202 | 0.7698 | 0.7778 | 0.7215 | 0.7383 | 0.7273 | 0.8333 | 0.7333 | 0.8197 |
0.0041 | 15.0 | 7185 | 0.4646 | 0.6466 | 0.7182 | 0.6805 | 0.9329 | 0.8914 | 0.8140 | 0.8424 | 0.5357 | 0.4859 | 0.5116 | 0.6049 | 0.4932 | 0.6078 | 0.5150 | 0.6102 | 0.6638 | 0.7895 | 0.8343 | 0.7873 | 0.7810 | 0.7125 | 0.7383 | 0.7126 | 0.7838 | 0.7333 | 0.8197 |
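The per-entity scores above are span-level metrics over PICO entity types such as Intervention and Outcome. A minimal sketch of how token-level predictions are typically grouped into entity spans, assuming an IOB2 tag scheme (`B-`/`I-` prefixes; the exact label strings used by this model are an assumption for illustration):

```python
def decode_bio(tokens, tags):
    """Group IOB2 tags into (entity_type, token_span) tuples."""
    spans = []
    start, label = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-") or (tag.startswith("I-") and label != tag[2:]):
            # A B- tag, or an I- tag that doesn't continue the open span, starts a new span.
            if label is not None:
                spans.append((label, tokens[start:i]))
            start, label = i, tag[2:]
        elif tag == "O":
            # O closes any open span.
            if label is not None:
                spans.append((label, tokens[start:i]))
            start, label = None, None
    if label is not None:
        spans.append((label, tokens[start:]))
    return spans

# Hypothetical example sentence with labels matching the table's entity types.
tokens = ["120", "patients", "received", "tamoxifen", "daily"]
tags = ["B-Total-participants", "O", "O", "B-Intervention", "O"]
print(decode_bio(tokens, tags))
# [('Total-participants', ['120']), ('Intervention', ['tamoxifen'])]
```

Span-level precision/recall/F1 (as computed by libraries such as seqeval) then counts a predicted span as correct only if both its type and its boundaries match the gold span exactly.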
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.2