---
tags:
- generated_from_trainer
---

# chope-fine-dishing-distilbert-base-uncased-finetuned-ner-v0.1

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset for named-entity recognition (NER). The final evaluation-set results were not recorded; see the [Training results](#training-results) table below for the metrics logged during training.
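
The card does not include a usage example, so here is a minimal inference sketch using the 🤗 Transformers `pipeline`. The model id is taken from the card title (you may need to prefix it with the owner's Hub namespace), and the example sentence is illustrative only; the entity types returned depend on the unknown training dataset.

```python
# Minimal inference sketch (assumption: the model is published under this id).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="chope-fine-dishing-distilbert-base-uncased-finetuned-ner-v0.1",
    aggregation_strategy="simple",  # merge word-piece predictions into whole entities
)

# Illustrative input; the actual entity types depend on the training data.
print(ner("Truffle mac and cheese with a side of garlic bread."))
```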

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The exact hyperparameters used during training were not preserved in this card; more information needed. From the results table below, training ran for roughly 25 epochs, with an evaluation every 50 steps and the training loss logged every 500 steps.
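
For illustration, the following is a hypothetical reconstruction of the fine-tuning setup with the `Trainer` API. Only the evaluation cadence, logging cadence, and approximate run length are inferable from the results table; the learning rate, batch size, label set, and toy dataset are assumptions, not the values actually used.

```python
# Hypothetical reconstruction of the fine-tuning setup. Only eval_steps,
# logging_steps, and num_train_epochs are inferable from the results table;
# every other value is an assumption for illustration.
from datasets import Dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

label_list = ["O", "B-DISH", "I-DISH"]  # hypothetical tag set

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=len(label_list)
)

# Tiny stand-in dataset so the sketch runs end to end.
raw = Dataset.from_dict({
    "tokens": [["I", "love", "truffle", "fries"], ["Great", "garlic", "bread"]],
    "ner_tags": [[0, 0, 1, 2], [0, 1, 2]],
})

def tokenize_and_align(batch):
    # Align word-level tags with word-piece tokens; -100 masks special tokens.
    enc = tokenizer(batch["tokens"], truncation=True, is_split_into_words=True)
    enc["labels"] = [
        [-100 if w is None else tags[w] for w in enc.word_ids(batch_index=i)]
        for i, tags in enumerate(batch["ner_tags"])
    ]
    return enc

dataset = raw.map(tokenize_and_align, batched=True, remove_columns=raw.column_names)

args = TrainingArguments(
    output_dir="chope-fine-dishing-distilbert-base-uncased-finetuned-ner-v0.1",
    evaluation_strategy="steps",
    eval_steps=50,                   # matches the table: one eval row per 50 steps
    logging_steps=500,               # matches the table: loss logged every 500 steps
    learning_rate=2e-5,              # assumption: a common DistilBERT default
    per_device_train_batch_size=16,  # assumption
    num_train_epochs=25,             # the table ends near epoch 24.77
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    eval_dataset=dataset,  # placeholder; a real run would use a held-out split
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```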

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 0.46  | 50   | 0.8879          | 0.3658    | 0.2978 | 0.3283 | 0.5883   |
| No log        | 0.92  | 100  | 0.7451          | 0.5553    | 0.6995 | 0.6191 | 0.7048   |
| No log        | 1.38  | 150  | 0.7378          | 0.5351    | 0.6448 | 0.5849 | 0.7171   |
| No log        | 1.83  | 200  | 0.8367          | 0.6037    | 0.6202 | 0.6119 | 0.7012   |
| No log        | 2.29  | 250  | 0.7746          | 0.6328    | 0.6639 | 0.6480 | 0.7373   |
| No log        | 2.75  | 300  | 0.8077          | 0.5       | 0.5956 | 0.5436 | 0.6939   |
| No log        | 3.21  | 350  | 0.8416          | 0.5284    | 0.4836 | 0.5050 | 0.7012   |
| No log        | 3.67  | 400  | 0.9220          | 0.5601    | 0.7131 | 0.6274 | 0.7323   |
| No log        | 4.13  | 450  | 0.9337          | 0.5419    | 0.5301 | 0.5359 | 0.7113   |
| 0.2476        | 4.59  | 500  | 0.9225          | 0.6387    | 0.6667 | 0.6524 | 0.7323   |
| 0.2476        | 5.05  | 550  | 1.0376          | 0.5296    | 0.5383 | 0.5339 | 0.7033   |
| 0.2476        | 5.5   | 600  | 1.0138          | 0.5820    | 0.7760 | 0.6651 | 0.7496   |
| 0.2476        | 5.96  | 650  | 1.1675          | 0.6184    | 0.6421 | 0.6300 | 0.7366   |
| 0.2476        | 6.42  | 700  | 1.2386          | 0.5563    | 0.7022 | 0.6208 | 0.7272   |
| 0.2476        | 6.88  | 750  | 1.2480          | 0.6233    | 0.7322 | 0.6734 | 0.7330   |
| 0.2476        | 7.34  | 800  | 1.2026          | 0.6077    | 0.6858 | 0.6444 | 0.7287   |
| 0.2476        | 7.8   | 850  | 1.1666          | 0.6176    | 0.7678 | 0.6845 | 0.7482   |
| 0.2476        | 8.26  | 900  | 1.1741          | 0.6119    | 0.7842 | 0.6874 | 0.7518   |
| 0.2476        | 8.72  | 950  | 1.3172          | 0.5584    | 0.6667 | 0.6077 | 0.7214   |
| 0.0227        | 9.17  | 1000 | 1.3335          | 0.5868    | 0.7295 | 0.6504 | 0.7185   |
| 0.0227        | 9.63  | 1050 | 1.2987          | 0.6247    | 0.7459 | 0.6800 | 0.7352   |
| 0.0227        | 10.09 | 1100 | 1.4033          | 0.5391    | 0.5464 | 0.5427 | 0.7041   |
| 0.0227        | 10.55 | 1150 | 1.5544          | 0.5427    | 0.6940 | 0.6091 | 0.7113   |
| 0.0227        | 11.01 | 1200 | 1.5020          | 0.5771    | 0.5519 | 0.5642 | 0.7221   |
| 0.0227        | 11.47 | 1250 | 1.3234          | 0.5983    | 0.7486 | 0.6650 | 0.7381   |
| 0.0227        | 11.93 | 1300 | 1.4603          | 0.6197    | 0.7213 | 0.6667 | 0.7359   |
| 0.0227        | 12.39 | 1350 | 1.5133          | 0.5301    | 0.5301 | 0.5301 | 0.6975   |
| 0.0227        | 12.84 | 1400 | 1.4874          | 0.5671    | 0.7623 | 0.6503 | 0.7366   |
| 0.0227        | 13.3  | 1450 | 1.5313          | 0.5603    | 0.7240 | 0.6317 | 0.7279   |
| 0.0075        | 13.76 | 1500 | 1.4268          | 0.5895    | 0.6749 | 0.6293 | 0.7229   |
| 0.0075        | 14.22 | 1550 | 1.6733          | 0.5190    | 0.5219 | 0.5204 | 0.6939   |
| 0.0075        | 14.68 | 1600 | 1.5003          | 0.5749    | 0.7650 | 0.6565 | 0.7366   |
| 0.0075        | 15.14 | 1650 | 1.5747          | 0.6353    | 0.5902 | 0.6119 | 0.7294   |
| 0.0075        | 15.6  | 1700 | 1.4836          | 0.5484    | 0.5574 | 0.5528 | 0.7048   |
| 0.0075        | 16.06 | 1750 | 1.7085          | 0.5066    | 0.5273 | 0.5167 | 0.6932   |
| 0.0075        | 16.51 | 1800 | 1.6691          | 0.5669    | 0.5328 | 0.5493 | 0.7048   |
| 0.0075        | 16.97 | 1850 | 1.5524          | 0.534     | 0.7295 | 0.6166 | 0.7236   |
| 0.0075        | 17.43 | 1900 | 1.5616          | 0.5484    | 0.6038 | 0.5748 | 0.7156   |
| 0.0075        | 17.89 | 1950 | 1.5597          | 0.5622    | 0.6667 | 0.61   | 0.7192   |
| 0.0044        | 18.35 | 2000 | 1.4448          | 0.6106    | 0.7842 | 0.6866 | 0.7525   |
| 0.0044        | 18.81 | 2050 | 1.5741          | 0.5802    | 0.5137 | 0.5449 | 0.7055   |
| 0.0044        | 19.27 | 2100 | 1.6085          | 0.5842    | 0.6448 | 0.6130 | 0.7192   |
| 0.0044        | 19.72 | 2150 | 1.5787          | 0.6016    | 0.8087 | 0.6900 | 0.7547   |
| 0.0044        | 20.18 | 2200 | 1.6210          | 0.6004    | 0.8169 | 0.6921 | 0.7547   |
| 0.0044        | 20.64 | 2250 | 1.6739          | 0.5246    | 0.5246 | 0.5246 | 0.7026   |
| 0.0044        | 21.1  | 2300 | 1.7852          | 0.5618    | 0.5710 | 0.5664 | 0.6990   |
| 0.0044        | 21.56 | 2350 | 1.6344          | 0.5576    | 0.6612 | 0.605  | 0.7142   |
| 0.0044        | 22.02 | 2400 | 1.8115          | 0.5363    | 0.5847 | 0.5595 | 0.7033   |
| 0.0044        | 22.48 | 2450 | 1.8336          | 0.5294    | 0.6148 | 0.5689 | 0.6968   |
| 0.0034        | 22.94 | 2500 | 1.7901          | 0.5878    | 0.6038 | 0.5957 | 0.7048   |
| 0.0034        | 23.39 | 2550 | 1.7766          | 0.5615    | 0.6858 | 0.6175 | 0.7113   |
| 0.0034        | 23.85 | 2600 | 1.8159          | 0.5531    | 0.6831 | 0.6112 | 0.7084   |
| 0.0034        | 24.31 | 2650 | 1.8307          | 0.6075    | 0.6175 | 0.6125 | 0.7142   |
| 0.0034        | 24.77 | 2700 | 1.8326          | 0.5410    | 0.6667 | 0.5973 | 0.7055   |
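
The precision, recall, and F1 columns above are presumably entity-level scores of the kind computed by the `seqeval` package (the card does not say which metric code the run actually used). A small illustrative sketch with hypothetical tag names:

```python
# Illustrative only: entity-level precision/recall/F1 plus token accuracy
# as computed by seqeval; the tag names are hypothetical.
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

y_true = [["O", "B-DISH", "I-DISH", "O"], ["B-DISH", "O"]]
y_pred = [["O", "B-DISH", "I-DISH", "O"], ["O", "O"]]

print("precision:", precision_score(y_true, y_pred))  # 1.0 (1 of 1 predicted entity correct)
print("recall:   ", recall_score(y_true, y_pred))     # 0.5 (1 of 2 true entities found)
print("f1:       ", f1_score(y_true, y_pred))         # ~0.667
print("accuracy: ", accuracy_score(y_true, y_pred))   # 5 of 6 tokens tagged correctly
```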

### Framework versions