# 1_9e-3_5_0.9

This model is a fine-tuned version of [bert-large-uncased](https://huggingface.co/bert-large-uncased) on the super_glue dataset. It achieves the following results on the evaluation set:
- Loss: 0.8703
- Accuracy: 0.7554

## Model description

More information needed

## Intended uses & limitations

More information needed
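
Pending that, here is a minimal inference sketch, assuming the checkpoint exposes a standard sequence-classification head. The repo id and the sentence-pair input below are placeholders, not values recorded in this card:

```python
# Minimal inference sketch. The repo id is a placeholder assumption; replace it
# with wherever this checkpoint is actually hosted.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "your-username/1_9e-3_5_0.9"  # placeholder, not a verified repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Most super_glue tasks take a sentence pair; the texts below are dummies.
inputs = tokenizer("First input text.", "Second input text.", return_tensors="pt")
with torch.no_grad():
    predicted_class = model(**inputs).logits.argmax(dim=-1).item()
print(predicted_class)
```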

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.009
- train_batch_size: 16
- eval_batch_size: 8
- seed: 11
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100.0
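
For reproducibility, a sketch of a `Trainer` setup matching these hyperparameters follows. The super_glue task is not recorded in this card; `"boolq"` below is an assumption (the 590 optimizer steps per epoch in the results table are consistent with BoolQ's 9,427 training examples at batch size 16), as are the column names. The Adam betas and epsilon above match the Transformers defaults, so they need no explicit arguments:

```python
# Sketch of a Trainer setup matching the hyperparameters above.
# Assumptions (not recorded in this card): the super_glue task is "boolq",
# with its standard question/passage columns.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-large-uncased")

raw = load_dataset("super_glue", "boolq")  # task name is an assumption

def tokenize(batch):
    # BoolQ examples are (question, passage) pairs; truncate to BERT's 512 tokens.
    return tokenizer(batch["question"], batch["passage"], truncation=True)

encoded = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="1_9e-3_5_0.9",
    learning_rate=9e-3,             # unusually high for BERT fine-tuning
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=11,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    evaluation_strategy="epoch",    # the table logs one evaluation per epoch
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Transformers defaults.
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,
)
trainer.train()
```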

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 4.0904 | 1.0 | 590 | 10.5019 | 0.6208 |
| 3.6556 | 2.0 | 1180 | 2.4871 | 0.6404 |
| 3.2159 | 3.0 | 1770 | 2.3558 | 0.6373 |
| 3.2501 | 4.0 | 2360 | 3.3686 | 0.4920 |
| 2.7738 | 5.0 | 2950 | 2.2391 | 0.6049 |
| 2.4147 | 6.0 | 3540 | 1.9342 | 0.6235 |
| 2.511 | 7.0 | 4130 | 1.7522 | 0.6526 |
| 2.1388 | 8.0 | 4720 | 1.5222 | 0.6865 |
| 2.0193 | 9.0 | 5310 | 1.5989 | 0.6706 |
| 2.1073 | 10.0 | 5900 | 2.5579 | 0.6838 |
| 2.2275 | 11.0 | 6490 | 1.6611 | 0.6982 |
| 1.8339 | 12.0 | 7080 | 1.3136 | 0.7092 |
| 2.0789 | 13.0 | 7670 | 1.3508 | 0.7162 |
| 1.8975 | 14.0 | 8260 | 1.2852 | 0.7159 |
| 1.7874 | 15.0 | 8850 | 2.5927 | 0.6703 |
| 1.7305 | 16.0 | 9440 | 1.1822 | 0.7159 |
| 1.6206 | 17.0 | 10030 | 1.2639 | 0.7183 |
| 1.524 | 18.0 | 10620 | 1.0813 | 0.7144 |
| 1.5502 | 19.0 | 11210 | 2.1527 | 0.6917 |
| 1.5332 | 20.0 | 11800 | 1.1110 | 0.7180 |
| 1.4154 | 21.0 | 12390 | 1.0680 | 0.7171 |
| 1.4948 | 22.0 | 12980 | 1.0978 | 0.7214 |
| 1.3143 | 23.0 | 13570 | 1.4256 | 0.6847 |
| 1.4259 | 24.0 | 14160 | 1.5151 | 0.6636 |
| 1.2758 | 25.0 | 14750 | 1.5291 | 0.6703 |
| 1.2543 | 26.0 | 15340 | 1.0773 | 0.7364 |
| 1.193 | 27.0 | 15930 | 1.2562 | 0.7061 |
| 1.245 | 28.0 | 16520 | 1.4397 | 0.7324 |
| 1.1221 | 29.0 | 17110 | 1.0291 | 0.7248 |
| 1.0916 | 30.0 | 17700 | 1.9141 | 0.7135 |
| 1.1892 | 31.0 | 18290 | 1.5064 | 0.7217 |
| 1.0759 | 32.0 | 18880 | 1.0584 | 0.7183 |
| 1.1571 | 33.0 | 19470 | 1.0418 | 0.7275 |
| 1.0692 | 34.0 | 20060 | 1.4428 | 0.6896 |
| 1.0653 | 35.0 | 20650 | 1.0209 | 0.7407 |
| 0.9819 | 36.0 | 21240 | 2.1233 | 0.7226 |
| 0.9464 | 37.0 | 21830 | 1.1958 | 0.7425 |
| 0.9531 | 38.0 | 22420 | 0.9840 | 0.7297 |
| 0.9324 | 39.0 | 23010 | 1.1269 | 0.7480 |
| 0.9409 | 40.0 | 23600 | 1.4269 | 0.7428 |
| 0.9007 | 41.0 | 24190 | 1.4096 | 0.7474 |
| 0.8971 | 42.0 | 24780 | 0.9441 | 0.7361 |
| 0.8515 | 43.0 | 25370 | 0.9103 | 0.7419 |
| 0.8691 | 44.0 | 25960 | 0.9694 | 0.7489 |
| 0.8477 | 45.0 | 26550 | 1.1691 | 0.7202 |
| 0.7806 | 46.0 | 27140 | 1.2032 | 0.7009 |
| 0.8126 | 47.0 | 27730 | 1.6263 | 0.7385 |
| 0.8065 | 48.0 | 28320 | 0.9302 | 0.7474 |
| 0.824 | 49.0 | 28910 | 1.0026 | 0.7260 |
| 0.8038 | 50.0 | 29500 | 0.9303 | 0.7352 |
| 0.7994 | 51.0 | 30090 | 1.0096 | 0.7382 |
| 0.7564 | 52.0 | 30680 | 0.9414 | 0.7508 |
| 0.7723 | 53.0 | 31270 | 0.9393 | 0.7520 |
| 0.7567 | 54.0 | 31860 | 1.0204 | 0.7609 |
| 0.7498 | 55.0 | 32450 | 0.9284 | 0.7373 |
| 0.7632 | 56.0 | 33040 | 0.9513 | 0.7523 |
| 0.706 | 57.0 | 33630 | 1.0413 | 0.7468 |
| 0.7099 | 58.0 | 34220 | 1.2300 | 0.7529 |
| 0.7333 | 59.0 | 34810 | 0.9594 | 0.7563 |
| 0.7144 | 60.0 | 35400 | 0.9062 | 0.7413 |
| 0.6771 | 61.0 | 35990 | 1.0458 | 0.7508 |
| 0.7351 | 62.0 | 36580 | 1.3680 | 0.7505 |
| 0.6831 | 63.0 | 37170 | 0.9765 | 0.7333 |
| 0.6727 | 64.0 | 37760 | 0.9768 | 0.7498 |
| 0.684 | 65.0 | 38350 | 0.9329 | 0.7526 |
| 0.6733 | 66.0 | 38940 | 0.9962 | 0.7498 |
| 0.6491 | 67.0 | 39530 | 0.9764 | 0.7346 |
| 0.6827 | 68.0 | 40120 | 0.9691 | 0.7327 |
| 0.6617 | 69.0 | 40710 | 1.2971 | 0.7563 |
| 0.6646 | 70.0 | 41300 | 1.0306 | 0.7578 |
| 0.6536 | 71.0 | 41890 | 0.8994 | 0.7413 |
| 0.6508 | 72.0 | 42480 | 0.9920 | 0.7550 |
| 0.6131 | 73.0 | 43070 | 0.9060 | 0.7394 |
| 0.6193 | 74.0 | 43660 | 1.1591 | 0.7602 |
| 0.6199 | 75.0 | 44250 | 0.9786 | 0.7538 |
| 0.6486 | 76.0 | 44840 | 0.8789 | 0.7459 |
| 0.6233 | 77.0 | 45430 | 0.8793 | 0.7505 |
| 0.613 | 78.0 | 46020 | 1.0248 | 0.7544 |
| 0.607 | 79.0 | 46610 | 0.9023 | 0.7468 |
| 0.5907 | 80.0 | 47200 | 1.0124 | 0.7547 |
| 0.5903 | 81.0 | 47790 | 1.0238 | 0.7587 |
| 0.5902 | 82.0 | 48380 | 0.8739 | 0.7486 |
| 0.5658 | 83.0 | 48970 | 0.8797 | 0.7508 |
| 0.573 | 84.0 | 49560 | 1.0304 | 0.7587 |
| 0.5922 | 85.0 | 50150 | 0.9199 | 0.7367 |
| 0.5683 | 86.0 | 50740 | 0.9492 | 0.7569 |
| 0.5719 | 87.0 | 51330 | 0.9078 | 0.7474 |
| 0.562 | 88.0 | 51920 | 0.8964 | 0.7529 |
| 0.5567 | 89.0 | 52510 | 0.9042 | 0.7532 |
| 0.5504 | 90.0 | 53100 | 0.8721 | 0.7526 |
| 0.5581 | 91.0 | 53690 | 0.8808 | 0.7502 |
| 0.5602 | 92.0 | 54280 | 0.8708 | 0.7462 |
| 0.5441 | 93.0 | 54870 | 0.8814 | 0.7547 |
| 0.5592 | 94.0 | 55460 | 0.8771 | 0.7498 |
| 0.5445 | 95.0 | 56050 | 0.9019 | 0.7535 |
| 0.5497 | 96.0 | 56640 | 0.8962 | 0.7532 |
| 0.5356 | 97.0 | 57230 | 0.8621 | 0.7523 |
| 0.521 | 98.0 | 57820 | 0.8742 | 0.7532 |
| 0.5241 | 99.0 | 58410 | 0.8712 | 0.7547 |
| 0.5292 | 100.0 | 59000 | 0.8703 | 0.7554 |
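
The Accuracy column is the Trainer's per-epoch evaluation metric. A minimal sketch of the usual `compute_metrics` hook that produces such a column, assuming the evaluate library (the exact callback used for this run is not recorded):

```python
# Sketch of the compute_metrics hook presumably behind the Accuracy column;
# the actual callback used for this run is not recorded in the card.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```

Passed as `compute_metrics=compute_metrics` to the `Trainer`, this runs once per evaluation pass.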

### Framework versions

- Transformers 4.30.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3