# model_v1_complete_training_wt_init_48_mini_emb_comp_frz
This model is a fine-tuned version of an unspecified base model on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 5.7952
- Accuracy: 0.1570
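
If the reported loss is a mean token-level cross-entropy (the card does not say which loss was used), it can be converted to a perplexity as exp(loss). The snippet below only illustrates that conversion; it is not an official metric of this model.

```python
import math

# Illustrative only: perplexity = exp(cross-entropy loss),
# assuming the reported loss is a mean token-level cross-entropy.
eval_loss = 5.7952
perplexity = math.exp(eval_loss)
print(f"perplexity ~ {perplexity:.1f}")  # ~ 328.7
```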
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a minimal `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 1e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 10
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- num_epochs: 25
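
As a rough guide, the settings above map onto the standard `transformers.TrainingArguments` fields as sketched below. This is a minimal sketch assuming the usual Hugging Face `Trainer` API; the actual training script, model, and dataset are not included in this card, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
# Multi-GPU (distributed) training would be handled by the launcher
# (e.g. torchrun or accelerate) rather than by these arguments.
training_args = TrainingArguments(
    output_dir="model_v1_complete_training_wt_init_48_mini_emb_comp_frz",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    seed=10,
    adam_beta1=0.9,      # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10_000,
    num_train_epochs=25,
)
```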
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:---:|:---:|:---:|:---:|:---:|
6.2393 | 0.25 | 30000 | 6.2308 | 0.1422 |
6.1905 | 0.49 | 60000 | 6.1865 | 0.1446 |
6.1603 | 0.74 | 90000 | 6.1535 | 0.1467 |
6.1282 | 0.98 | 120000 | 6.1308 | 0.1473 |
6.1155 | 1.23 | 150000 | 6.1108 | 0.1485 |
6.1032 | 1.47 | 180000 | 6.0956 | 0.1491 |
6.0866 | 1.72 | 210000 | 6.0824 | 0.1495 |
6.074 | 1.97 | 240000 | 6.0709 | 0.1497 |
6.0586 | 2.21 | 270000 | 6.0606 | 0.1500 |
6.0451 | 2.46 | 300000 | 6.0479 | 0.1506 |
6.0401 | 2.7 | 330000 | 6.0385 | 0.1507 |
6.027 | 2.95 | 360000 | 6.0274 | 0.1512 |
6.0198 | 3.2 | 390000 | 6.0148 | 0.1512 |
6.0023 | 3.44 | 420000 | 5.9970 | 0.1514 |
5.9882 | 3.69 | 450000 | 5.9782 | 0.1522 |
5.9756 | 3.93 | 480000 | 5.9632 | 0.1521 |
5.9587 | 4.18 | 510000 | 5.9471 | 0.1525 |
5.9449 | 4.42 | 540000 | 5.9315 | 0.1527 |
5.9212 | 4.67 | 570000 | 5.9157 | 0.1535 |
5.9201 | 4.92 | 600000 | 5.9062 | 0.1536 |
5.9125 | 5.16 | 630000 | 5.8994 | 0.1539 |
5.8982 | 5.41 | 660000 | 5.8930 | 0.1541 |
5.8933 | 5.65 | 690000 | 5.8847 | 0.1543 |
5.8844 | 5.9 | 720000 | 5.8792 | 0.1542 |
5.8848 | 6.14 | 750000 | 5.8728 | 0.1543 |
5.8787 | 6.39 | 780000 | 5.8678 | 0.1547 |
5.8748 | 6.64 | 810000 | 5.8629 | 0.1546 |
5.8665 | 6.88 | 840000 | 5.8576 | 0.1549 |
5.8637 | 7.13 | 870000 | 5.8513 | 0.1552 |
5.8553 | 7.37 | 900000 | 5.8465 | 0.1555 |
5.8539 | 7.62 | 930000 | 5.8423 | 0.1554 |
5.8479 | 7.87 | 960000 | 5.8378 | 0.1556 |
5.8446 | 8.11 | 990000 | 5.8329 | 0.1557 |
5.8411 | 8.36 | 1020000 | 5.8283 | 0.1559 |
5.8316 | 8.6 | 1050000 | 5.8240 | 0.1561 |
5.8254 | 8.85 | 1080000 | 5.8219 | 0.1559 |
5.8268 | 9.09 | 1110000 | 5.8158 | 0.1560 |
5.8257 | 9.34 | 1140000 | 5.8125 | 0.1562 |
5.8205 | 9.59 | 1170000 | 5.8076 | 0.1565 |
5.811 | 9.83 | 1200000 | 5.8025 | 0.1566 |
5.8123 | 10.08 | 1230000 | 5.7997 | 0.1567 |
5.8125 | 10.32 | 1260000 | 5.7952 | 0.1570 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.14.0a0+410ce96
- Datasets 2.13.1
- Tokenizers 0.13.3