# distilbert_sa_GLUE_Experiment_logit_kd_data_aug_mrpc_192
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the GLUE MRPC dataset. It achieves the following results on the evaluation set:
- Loss: 0.3961
- Accuracy: 0.8848
- F1: 0.9080
- Combined Score: 0.8964
## Model description
More information needed
## Intended uses & limitations
More information needed
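
No usage guidance was provided. As a minimal inference sketch (the repository id below is assumed from the model name; adjust it to the actual Hub path):

```python
# Minimal sketch: paraphrase classification with this checkpoint.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "distilbert_sa_GLUE_Experiment_logit_kd_data_aug_mrpc_192"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# MRPC pairs two sentences and asks whether they are paraphrases.
inputs = tokenizer(
    "The company said profits rose 10% last quarter.",
    "Profits at the company increased by 10% in the last quarter.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```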
## Training and evaluation data
More information needed
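
The evaluation numbers above appear to come from the GLUE MRPC validation split; the model name suggests the training set was augmented, but the augmentation procedure is not documented. For reference, the base MRPC splits can be loaded with `datasets`:

```python
from datasets import load_dataset

# Standard GLUE MRPC splits; the augmented training set used for this run
# is not documented in the card.
mrpc = load_dataset("glue", "mrpc")
print(mrpc)  # train/validation/test with sentence1, sentence2, label columns
```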
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
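
As a rough sketch, these settings map onto `transformers.TrainingArguments` as shown below (the Adam betas and epsilon listed above are the library defaults; the logit-distillation loss implied by the model name would live in a custom `Trainer` subclass and is not shown):

```python
from transformers import TrainingArguments

# Sketch only: mirrors the listed hyperparameters; everything else is left at defaults.
training_args = TrainingArguments(
    output_dir="distilbert_sa_GLUE_Experiment_logit_kd_data_aug_mrpc_192",
    learning_rate=5e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=10,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed precision
)
```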
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.4524 | 1.0 | 980 | 0.4024 | 0.9779 | 0.9837 | 0.9808 |
| 0.4195 | 2.0 | 1960 | 0.3999 | 0.9779 | 0.9837 | 0.9808 |
| 0.416 | 3.0 | 2940 | 0.3984 | 0.9706 | 0.9781 | 0.9743 |
| 0.4145 | 4.0 | 3920 | 0.3981 | 0.9853 | 0.9892 | 0.9872 |
| 0.4133 | 5.0 | 4900 | 0.3983 | 0.9926 | 0.9946 | 0.9936 |
| 0.4128 | 6.0 | 5880 | 0.3982 | 0.9951 | 0.9964 | 0.9958 |
| 0.4124 | 7.0 | 6860 | 0.3968 | 0.9951 | 0.9964 | 0.9958 |
| 0.4121 | 8.0 | 7840 | 0.3968 | 0.9951 | 0.9964 | 0.9958 |
| 0.4118 | 9.0 | 8820 | 0.3969 | 0.9926 | 0.9946 | 0.9936 |
| 0.4115 | 10.0 | 9800 | 0.3967 | 0.9926 | 0.9946 | 0.9936 |
| 0.4114 | 11.0 | 10780 | 0.3967 | 0.9902 | 0.9928 | 0.9915 |
| 0.4113 | 12.0 | 11760 | 0.3970 | 0.9608 | 0.9705 | 0.9656 |
| 0.4112 | 13.0 | 12740 | 0.3967 | 0.9902 | 0.9928 | 0.9915 |
| 0.4112 | 14.0 | 13720 | 0.3967 | 0.9926 | 0.9946 | 0.9936 |
| 0.4111 | 15.0 | 14700 | 0.3966 | 0.9877 | 0.9910 | 0.9894 |
| 0.411 | 16.0 | 15680 | 0.3966 | 0.9779 | 0.9836 | 0.9808 |
| 0.4109 | 17.0 | 16660 | 0.3965 | 0.9681 | 0.9761 | 0.9721 |
| 0.4109 | 18.0 | 17640 | 0.3969 | 0.9608 | 0.9705 | 0.9656 |
| 0.4108 | 19.0 | 18620 | 0.3964 | 0.9804 | 0.9855 | 0.9829 |
| 0.4107 | 20.0 | 19600 | 0.3966 | 0.9681 | 0.9761 | 0.9721 |
| 0.4106 | 21.0 | 20580 | 0.3962 | 0.9926 | 0.9946 | 0.9936 |
| 0.4107 | 22.0 | 21560 | 0.3965 | 0.8627 | 0.8884 | 0.8756 |
| 0.4105 | 23.0 | 22540 | 0.3962 | 0.9755 | 0.9818 | 0.9786 |
| 0.4105 | 24.0 | 23520 | 0.3964 | 0.9118 | 0.9310 | 0.9214 |
| 0.4105 | 25.0 | 24500 | 0.3963 | 0.9167 | 0.9351 | 0.9259 |
| 0.4104 | 26.0 | 25480 | 0.3962 | 0.9142 | 0.9331 | 0.9236 |
| 0.4104 | 27.0 | 26460 | 0.3962 | 0.9069 | 0.9269 | 0.9169 |
| 0.4104 | 28.0 | 27440 | 0.3962 | 0.8701 | 0.8950 | 0.8826 |
| 0.4104 | 29.0 | 28420 | 0.3962 | 0.875 | 0.8994 | 0.8872 |
| 0.4104 | 30.0 | 29400 | 0.3961 | 0.8848 | 0.9080 | 0.8964 |
| 0.4103 | 31.0 | 30380 | 0.3961 | 0.8922 | 0.9144 | 0.9033 |
| 0.4102 | 32.0 | 31360 | 0.3961 | 0.8897 | 0.9123 | 0.9010 |
| 0.4102 | 33.0 | 32340 | 0.3961 | 0.8971 | 0.9186 | 0.9078 |
| 0.4102 | 34.0 | 33320 | 0.3961 | 0.8505 | 0.8773 | 0.8639 |
| 0.4103 | 35.0 | 34300 | 0.3962 | 0.8333 | 0.8612 | 0.8473 |
### Framework versions
- Transformers 4.26.0
- PyTorch 1.14.0a0+410ce96
- Datasets 2.9.0
- Tokenizers 0.13.2