

# scenario-kd-from-post-finetune-gold-silver-div-4-8000-data-smsa-model-haryoaw-sc

This model is a fine-tuned version of [haryoaw/scenario-normal-finetune-clf-data-smsa-model-xlm-roberta-base](https://huggingface.co/haryoaw/scenario-normal-finetune-clf-data-smsa-model-xlm-roberta-base) on the smsa dataset. Its evaluation-set results over the course of training are reported in the training results table below.
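
SmSA is an Indonesian sentence-level sentiment classification dataset, so the checkpoint can be loaded as a standard text-classification pipeline. The sketch below is a minimal usage example; the Hub repository id (the `haryoaw` namespace plus this card's title) and the `LABEL_*` output names are assumptions, since neither is stated explicitly in this card.

```python
from transformers import pipeline

# Assumed Hub id: owner namespace taken from the base model, name from this card's title.
model_id = "haryoaw/scenario-kd-from-post-finetune-gold-silver-div-4-8000-data-smsa-model-haryoaw-sc"

# Sentiment classification pipeline built on the fine-tuned XLM-RoBERTa checkpoint.
classifier = pipeline("text-classification", model=model_id)

# SmSA contains Indonesian sentences; outputs may appear as LABEL_0/1/2 if id2label was not customized.
print(classifier("Pelayanan restoran ini sangat memuaskan!"))
```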

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
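
The actual list is not reproduced here, so the sketch below only illustrates how a comparable `Trainer` run could be configured. Every numeric value is a placeholder rather than a setting actually used, except that the 100-step evaluation cadence and 500-step logging cadence are read off the results table below; the `kd` in the model name also suggests a knowledge-distillation objective, which would require a custom loss on top of this generic setup.

```python
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

# Placeholder configuration only; the real hyperparameters are not listed in this card.
args = TrainingArguments(
    output_dir="./results",
    evaluation_strategy="steps",      # evaluated every 100 steps, as in the results table
    eval_steps=100,
    logging_steps=500,                # training loss is reported every 500 steps in the table
    learning_rate=5e-5,               # placeholder
    per_device_train_batch_size=32,   # placeholder
    num_train_epochs=30,              # placeholder; the table covers at least 25.6 epochs
)

# Initialised from the already fine-tuned checkpoint named above, per the card's description.
model = AutoModelForSequenceClassification.from_pretrained(
    "haryoaw/scenario-normal-finetune-clf-data-smsa-model-xlm-roberta-base"
)

# Dataset preparation, tokenization, and any distillation loss are omitted from this sketch.
trainer = Trainer(model=model, args=args)  # plus train_dataset=..., eval_dataset=..., compute_metrics=...
```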

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log        | 0.4   | 100  | 2.9781          | 0.8365   | 0.7769 |
| No log        | 0.8   | 200  | 2.2965          | 0.8587   | 0.8097 |
| No log        | 1.2   | 300  | 2.7320          | 0.8444   | 0.8162 |
| No log        | 1.6   | 400  | 2.2548          | 0.8627   | 0.8242 |
| 2.6579        | 2.0   | 500  | 1.9153          | 0.8865   | 0.8439 |
| 2.6579        | 2.4   | 600  | 1.8958          | 0.8905   | 0.8481 |
| 2.6579        | 2.8   | 700  | 1.8591          | 0.8921   | 0.8455 |
| 2.6579        | 3.2   | 800  | 1.7154          | 0.8944   | 0.8533 |
| 2.6579        | 3.6   | 900  | 1.6033          | 0.8937   | 0.8557 |
| 1.2089        | 4.0   | 1000 | 1.5786          | 0.8960   | 0.8487 |
| 1.2089        | 4.4   | 1100 | 1.5674          | 0.9016   | 0.8591 |
| 1.2089        | 4.8   | 1200 | 1.4204          | 0.9      | 0.8515 |
| 1.2089        | 5.2   | 1300 | 1.4645          | 0.9040   | 0.8650 |
| 1.2089        | 5.6   | 1400 | 1.5980          | 0.8960   | 0.8560 |
| 0.7739        | 6.0   | 1500 | 1.6732          | 0.8929   | 0.8519 |
| 0.7739        | 6.4   | 1600 | 1.3226          | 0.9063   | 0.8625 |
| 0.7739        | 6.8   | 1700 | 1.2441          | 0.9040   | 0.8600 |
| 0.7739        | 7.2   | 1800 | 1.3860          | 0.8944   | 0.8552 |
| 0.7739        | 7.6   | 1900 | 1.3320          | 0.8968   | 0.8485 |
| 0.6026        | 8.0   | 2000 | 1.3842          | 0.9056   | 0.8615 |
| 0.6026        | 8.4   | 2100 | 1.3240          | 0.9016   | 0.8621 |
| 0.6026        | 8.8   | 2200 | 1.2880          | 0.9063   | 0.8646 |
| 0.6026        | 9.2   | 2300 | 1.3428          | 0.9063   | 0.8639 |
| 0.6026        | 9.6   | 2400 | 1.3725          | 0.9016   | 0.8620 |
| 0.5354        | 10.0  | 2500 | 1.3696          | 0.9024   | 0.8598 |
| 0.5354        | 10.4  | 2600 | 1.3644          | 0.9032   | 0.8571 |
| 0.5354        | 10.8  | 2700 | 1.3189          | 0.9056   | 0.8705 |
| 0.5354        | 11.2  | 2800 | 1.4258          | 0.8984   | 0.8537 |
| 0.5354        | 11.6  | 2900 | 1.3107          | 0.9032   | 0.8614 |
| 0.4787        | 12.0  | 3000 | 1.4494          | 0.9024   | 0.8533 |
| 0.4787        | 12.4  | 3100 | 1.3701          | 0.9071   | 0.8561 |
| 0.4787        | 12.8  | 3200 | 1.1686          | 0.9111   | 0.8711 |
| 0.4787        | 13.2  | 3300 | 1.2147          | 0.9095   | 0.8711 |
| 0.4787        | 13.6  | 3400 | 1.2130          | 0.9056   | 0.8589 |
| 0.4245        | 14.0  | 3500 | 1.2426          | 0.9063   | 0.8623 |
| 0.4245        | 14.4  | 3600 | 1.1548          | 0.9095   | 0.8737 |
| 0.4245        | 14.8  | 3700 | 1.3100          | 0.8984   | 0.8591 |
| 0.4245        | 15.2  | 3800 | 1.2439          | 0.9      | 0.8572 |
| 0.4245        | 15.6  | 3900 | 1.2271          | 0.9048   | 0.8640 |
| 0.3945        | 16.0  | 4000 | 1.2431          | 0.9048   | 0.8603 |
| 0.3945        | 16.4  | 4100 | 1.2223          | 0.9127   | 0.8688 |
| 0.3945        | 16.8  | 4200 | 1.2058          | 0.9056   | 0.8694 |
| 0.3945        | 17.2  | 4300 | 1.1796          | 0.9024   | 0.8609 |
| 0.3945        | 17.6  | 4400 | 1.2383          | 0.9071   | 0.8672 |
| 0.3725        | 18.0  | 4500 | 1.2171          | 0.9040   | 0.8662 |
| 0.3725        | 18.4  | 4600 | 1.2637          | 0.9095   | 0.8721 |
| 0.3725        | 18.8  | 4700 | 1.1956          | 0.9040   | 0.8591 |
| 0.3725        | 19.2  | 4800 | 1.1177          | 0.9095   | 0.8674 |
| 0.3725        | 19.6  | 4900 | 1.0863          | 0.9175   | 0.8819 |
| 0.356         | 20.0  | 5000 | 1.0510          | 0.9143   | 0.8806 |
| 0.356         | 20.4  | 5100 | 1.1132          | 0.9008   | 0.8545 |
| 0.356         | 20.8  | 5200 | 1.2226          | 0.9063   | 0.8639 |
| 0.356         | 21.2  | 5300 | 1.1765          | 0.9048   | 0.8612 |
| 0.356         | 21.6  | 5400 | 1.1246          | 0.9063   | 0.8588 |
| 0.3333        | 22.0  | 5500 | 1.0851          | 0.9127   | 0.8790 |
| 0.3333        | 22.4  | 5600 | 1.0802          | 0.9111   | 0.8692 |
| 0.3333        | 22.8  | 5700 | 1.0567          | 0.9175   | 0.8793 |
| 0.3333        | 23.2  | 5800 | 1.0637          | 0.9119   | 0.8721 |
| 0.3333        | 23.6  | 5900 | 1.2241          | 0.9063   | 0.8612 |
| 0.315         | 24.0  | 6000 | 1.0900          | 0.9095   | 0.8689 |
| 0.315         | 24.4  | 6100 | 1.0571          | 0.9151   | 0.8771 |
| 0.315         | 24.8  | 6200 | 1.1068          | 0.9056   | 0.8611 |
| 0.315         | 25.2  | 6300 | 1.2289          | 0.9056   | 0.8663 |
| 0.315         | 25.6  | 6400 | 1.1325          | 0.9127   | 0.8724 |
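
The Accuracy and F1 columns above can be reproduced with a short evaluation loop along the lines of the sketch below. This is a sketch under assumptions: the Hub repository id is the same guess as in the usage example earlier, the validation sentences and gold labels are left as placeholders (the exact smsa split used for evaluation is not stated above), and macro-averaged F1 is assumed, since the averaging mode is not given either.

```python
from sklearn.metrics import accuracy_score, f1_score
from transformers import pipeline

# Assumed Hub id, as in the usage example above.
model_id = "haryoaw/scenario-kd-from-post-finetune-gold-silver-div-4-8000-data-smsa-model-haryoaw-sc"
classifier = pipeline("text-classification", model=model_id)

def evaluate(texts, gold_labels):
    """Score the classifier on a list of sentences and their integer gold labels."""
    preds = classifier(texts)
    # Map "LABEL_0"-style names back to integer ids for comparison with the gold labels.
    pred_ids = [int(p["label"].split("_")[-1]) for p in preds]
    return {
        "accuracy": accuracy_score(gold_labels, pred_ids),
        "f1": f1_score(gold_labels, pred_ids, average="macro"),  # macro averaging is an assumption
    }

# evaluate(validation_texts, validation_labels) -> {"accuracy": ..., "f1": ...}
```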

### Framework versions