# model_v1_complete_training_wt_init_48_mini_emb_comp
This model is a fine-tuned version of an unspecified base model on the None dataset. It achieves the following results on the evaluation set:

- Loss: 5.7896
- Accuracy: 0.1573
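For context, the final validation loss corresponds to a perplexity of roughly exp(5.7896) ≈ 327. A minimal sketch of that conversion, assuming the reported loss is mean per-token cross-entropy in nats (an assumption; the card does not state the loss definition):

```python
import math

# Assumption: the eval loss is mean per-token cross-entropy (nats),
# as is standard for Trainer-based language-model evaluation.
eval_loss = 5.7896
perplexity = math.exp(eval_loss)
print(f"perplexity ≈ {perplexity:.1f}")  # ≈ 326.9
```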
## Model description

More information needed

## Intended uses & limitations

More information needed
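As a stopgap, here is a hypothetical loading sketch; the repository id and the causal-LM head are assumptions, since the base architecture and task are not documented on this card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id (assumed to match the model name above) and an
# assumed causal-LM architecture; neither is confirmed by this card.
repo_id = "model_v1_complete_training_wt_init_48_mini_emb_comp"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)
```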
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the reproduction sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 10
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- num_epochs: 25
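A reproduction sketch of these settings with `transformers.TrainingArguments`; `output_dir` is a placeholder, and the multi-GPU setup comes from the launcher (e.g. `torchrun` or `accelerate launch`), not from the arguments themselves:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="model_v1_complete_training_wt_init_48_mini_emb_comp",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    seed=10,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10_000,
    num_train_epochs=25,
)
```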
### Training results
| Training Loss | Epoch | Step    | Validation Loss | Accuracy |
|:-------------:|:-----:|:-------:|:---------------:|:--------:|
| 6.6169        | 0.25  | 30000   | 6.6062          | 0.1130   |
| 6.3975        | 0.49  | 60000   | 6.3907          | 0.1321   |
| 6.2851        | 0.74  | 90000   | 6.2847          | 0.1387   |
| 6.2302        | 0.98  | 120000  | 6.2247          | 0.1429   |
| 6.1826        | 1.23  | 150000  | 6.1825          | 0.1449   |
| 6.1585        | 1.47  | 180000  | 6.1520          | 0.1470   |
| 6.1385        | 1.72  | 210000  | 6.1300          | 0.1476   |
| 6.1173        | 1.97  | 240000  | 6.1106          | 0.1483   |
| 6.0959        | 2.21  | 270000  | 6.0963          | 0.1487   |
| 6.0795        | 2.46  | 300000  | 6.0829          | 0.1497   |
| 6.0717        | 2.7   | 330000  | 6.0719          | 0.1498   |
| 6.0618        | 2.95  | 360000  | 6.0616          | 0.1494   |
| 6.0503        | 3.2   | 390000  | 6.0503          | 0.1505   |
| 6.0411        | 3.44  | 420000  | 6.0402          | 0.1507   |
| 6.0355        | 3.69  | 450000  | 6.0292          | 0.1510   |
| 6.021         | 3.93  | 480000  | 6.0159          | 0.1511   |
| 6.0021        | 4.18  | 510000  | 5.9952          | 0.1517   |
| 5.9782        | 4.42  | 540000  | 5.9764          | 0.1522   |
| 5.9729        | 4.67  | 570000  | 5.9616          | 0.1524   |
| 5.9542        | 4.92  | 600000  | 5.9461          | 0.1527   |
| 5.9348        | 5.16  | 630000  | 5.9301          | 0.1531   |
| 5.9259        | 5.41  | 660000  | 5.9173          | 0.1537   |
| 5.9184        | 5.65  | 690000  | 5.9074          | 0.1537   |
| 5.9093        | 5.9   | 720000  | 5.8970          | 0.1542   |
| 5.9003        | 6.14  | 750000  | 5.8903          | 0.1544   |
| 5.8983        | 6.39  | 780000  | 5.8825          | 0.1547   |
| 5.8847        | 6.64  | 810000  | 5.8758          | 0.1546   |
| 5.8749        | 6.88  | 840000  | 5.8717          | 0.1546   |
| 5.8789        | 7.13  | 870000  | 5.8664          | 0.1549   |
| 5.8698        | 7.37  | 900000  | 5.8607          | 0.1551   |
| 5.871         | 7.62  | 930000  | 5.8570          | 0.1553   |
| 5.8634        | 7.87  | 960000  | 5.8477          | 0.1556   |
| 5.8479        | 8.11  | 990000  | 5.8457          | 0.1551   |
| 5.8544        | 8.36  | 1020000 | 5.8387          | 0.1558   |
| 5.8531        | 8.6   | 1050000 | 5.8334          | 0.1559   |
| 5.846         | 8.85  | 1080000 | 5.8299          | 0.1563   |
| 5.8344        | 9.09  | 1110000 | 5.8249          | 0.1562   |
| 5.8382        | 9.34  | 1140000 | 5.8208          | 0.1564   |
| 5.8309        | 9.59  | 1170000 | 5.8154          | 0.1564   |
| 5.8207        | 9.83  | 1200000 | 5.8109          | 0.1566   |
| 5.8239        | 10.08 | 1230000 | 5.8069          | 0.1570   |
| 5.8194        | 10.32 | 1260000 | 5.8023          | 0.1571   |
| 5.8044        | 10.57 | 1290000 | 5.7987          | 0.1572   |
| 5.8129        | 10.81 | 1320000 | 5.7941          | 0.1570   |
| 5.8032        | 11.06 | 1350000 | 5.7896          | 0.1573   |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.14.0a0+410ce96
- Datasets 2.13.1
- Tokenizers 0.13.3