# EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-09-26_ent_gates_exitloss
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base), apparently on a 100-examples-per-class subset of RVL-CDIP (inferred from the model name; the Trainer reported the dataset as unknown). It achieves the following results on the evaluation set (a hedged usage sketch follows the metrics):
- Loss: 1.1687
- Accuracy: 0.695
- Exit 0 Accuracy: 0.11
- Exit 1 Accuracy: 0.11
- Exit 2 Accuracy: 0.3625
- Exit 3 Accuracy: 0.6375
- Exit 4 Accuracy: 0.69
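The checkpoint should load with the stock LayoutLMv3 classes from Transformers, though the early-exit heads are presumably defined in custom model code and would be dropped by the standard architecture. The repo id and image path below are illustrative assumptions, not documented usage; a minimal inference sketch:

```python
import torch
from PIL import Image
from transformers import LayoutLMv3Processor, LayoutLMv3ForSequenceClassification

# Hypothetical hub path; substitute the actual repo id of this checkpoint.
repo_id = "EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-09-26_ent_gates_exitloss"

# The base processor runs Tesseract OCR on the page image by default
# (requires pytesseract and the tesseract binary).
processor = LayoutLMv3Processor.from_pretrained("microsoft/layoutlmv3-base")
model = LayoutLMv3ForSequenceClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("page.png").convert("RGB")  # hypothetical document scan
inputs = processor(image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```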
## Model description
Per the model name, this is an early-exit ("EE") variant of LayoutLMv3 for document image classification: intermediate classification heads (the exits whose accuracies are reported above) are attached at several encoder depths, with entropy-based gates ("ent_gates") and an auxiliary exit loss ("exitloss"). Beyond what the name and metrics imply, more information is needed.
## Intended uses & limitations
More information needed
## Training and evaluation data
Per the model name, the model was fine-tuned on a subset of RVL-CDIP with 100 examples per class of the 16 document classes. The exact training and evaluation splits are not documented.
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 12
- total_train_batch_size: 192
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
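For reference, a minimal sketch of the corresponding `TrainingArguments`; the `output_dir` is a placeholder, and the exit-loss term itself would live in the custom model rather than in these arguments:

```python
from transformers import TrainingArguments

# Adam with betas=(0.9, 0.999) and eps=1e-8 is the Trainer default,
# so the optimizer needs no explicit configuration here.
training_args = TrainingArguments(
    output_dir="EElayoutlmv3_rvl_cdip_ent_gates_exitloss",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=12,  # 16 * 12 = 192 total train batch size
    lr_scheduler_type="linear",
    num_train_epochs=60,
    evaluation_strategy="epoch",  # matches the per-epoch rows in the results table
)
```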
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|
| No log | 0.96 | 4 | 2.7393 | 0.1175 | 0.06 | 0.0925 | 0.085 | 0.0625 | 0.0625 |
| No log | 1.96 | 8 | 2.6761 | 0.1625 | 0.07 | 0.0925 | 0.085 | 0.0625 | 0.0625 |
| No log | 2.96 | 12 | 2.6336 | 0.1825 | 0.0775 | 0.1 | 0.09 | 0.0625 | 0.0625 |
| No log | 3.96 | 16 | 2.6046 | 0.21 | 0.065 | 0.105 | 0.0775 | 0.0625 | 0.0625 |
| No log | 4.96 | 20 | 2.6355 | 0.17 | 0.0675 | 0.105 | 0.06 | 0.115 | 0.0875 |
| No log | 5.96 | 24 | 2.5487 | 0.1925 | 0.0725 | 0.1075 | 0.09 | 0.13 | 0.1075 |
| No log | 6.96 | 28 | 2.4605 | 0.275 | 0.0875 | 0.1075 | 0.0925 | 0.1675 | 0.13 |
| No log | 7.96 | 32 | 2.3986 | 0.2475 | 0.09 | 0.1125 | 0.0925 | 0.22 | 0.215 |
| No log | 8.96 | 36 | 2.3182 | 0.3 | 0.1 | 0.1125 | 0.1075 | 0.2475 | 0.29 |
| No log | 9.96 | 40 | 2.2072 | 0.35 | 0.1025 | 0.1125 | 0.1175 | 0.295 | 0.36 |
| No log | 10.96 | 44 | 2.1187 | 0.425 | 0.1025 | 0.1125 | 0.1275 | 0.3175 | 0.4025 |
| No log | 11.96 | 48 | 2.0086 | 0.455 | 0.0975 | 0.11 | 0.16 | 0.3675 | 0.4475 |
| No log | 12.96 | 52 | 1.9037 | 0.4775 | 0.095 | 0.11 | 0.1725 | 0.4025 | 0.465 |
| No log | 13.96 | 56 | 1.8088 | 0.515 | 0.0925 | 0.11 | 0.1875 | 0.4425 | 0.49 |
| No log | 14.96 | 60 | 1.7198 | 0.5475 | 0.095 | 0.1075 | 0.2125 | 0.475 | 0.525 |
| No log | 15.96 | 64 | 1.6502 | 0.5825 | 0.095 | 0.105 | 0.225 | 0.4825 | 0.5425 |
| No log | 16.96 | 68 | 1.5650 | 0.58 | 0.0975 | 0.12 | 0.235 | 0.5175 | 0.5625 |
| No log | 17.96 | 72 | 1.4998 | 0.6025 | 0.1025 | 0.1125 | 0.2475 | 0.5375 | 0.565 |
| No log | 18.96 | 76 | 1.4608 | 0.6025 | 0.1075 | 0.11 | 0.275 | 0.5375 | 0.6025 |
| No log | 19.96 | 80 | 1.3988 | 0.62 | 0.1075 | 0.11 | 0.285 | 0.545 | 0.6025 |
| No log | 20.96 | 84 | 1.3833 | 0.6275 | 0.1075 | 0.11 | 0.2825 | 0.555 | 0.61 |
| No log | 21.96 | 88 | 1.3400 | 0.6475 | 0.11 | 0.11 | 0.2875 | 0.57 | 0.62 |
| No log | 22.96 | 92 | 1.3355 | 0.6425 | 0.1125 | 0.11 | 0.29 | 0.575 | 0.6375 |
| No log | 23.96 | 96 | 1.2812 | 0.6525 | 0.11 | 0.11 | 0.295 | 0.585 | 0.635 |
| No log | 24.96 | 100 | 1.2769 | 0.6425 | 0.11 | 0.11 | 0.31 | 0.585 | 0.6275 |
| No log | 25.96 | 104 | 1.2410 | 0.665 | 0.11 | 0.1075 | 0.315 | 0.59 | 0.6375 |
| No log | 26.96 | 108 | 1.2272 | 0.6725 | 0.1075 | 0.1075 | 0.32 | 0.595 | 0.64 |
| No log | 27.96 | 112 | 1.2168 | 0.67 | 0.11 | 0.1075 | 0.3225 | 0.595 | 0.645 |
| No log | 28.96 | 116 | 1.1919 | 0.675 | 0.11 | 0.1075 | 0.3325 | 0.595 | 0.64 |
| No log | 29.96 | 120 | 1.1948 | 0.6825 | 0.11 | 0.1075 | 0.3375 | 0.6 | 0.655 |
| No log | 30.96 | 124 | 1.1802 | 0.6875 | 0.1075 | 0.1075 | 0.3325 | 0.605 | 0.665 |
| No log | 31.96 | 128 | 1.1939 | 0.68 | 0.11 | 0.1075 | 0.345 | 0.615 | 0.65 |
| No log | 32.96 | 132 | 1.1690 | 0.6925 | 0.1075 | 0.1075 | 0.34 | 0.615 | 0.665 |
| No log | 33.96 | 136 | 1.1763 | 0.68 | 0.105 | 0.1075 | 0.3475 | 0.6175 | 0.6525 |
| No log | 34.96 | 140 | 1.1851 | 0.6875 | 0.105 | 0.1075 | 0.3525 | 0.615 | 0.6675 |
| No log | 35.96 | 144 | 1.1574 | 0.6925 | 0.11 | 0.1075 | 0.355 | 0.62 | 0.6675 |
| No log | 36.96 | 148 | 1.1618 | 0.68 | 0.1075 | 0.1075 | 0.36 | 0.62 | 0.665 |
| No log | 37.96 | 152 | 1.1731 | 0.6825 | 0.105 | 0.1075 | 0.35 | 0.615 | 0.6575 |
| No log | 38.96 | 156 | 1.1550 | 0.68 | 0.1075 | 0.1075 | 0.3425 | 0.6225 | 0.665 |
| No log | 39.96 | 160 | 1.1553 | 0.7 | 0.11 | 0.1075 | 0.3475 | 0.625 | 0.675 |
| No log | 40.96 | 164 | 1.1708 | 0.6875 | 0.1125 | 0.1075 | 0.355 | 0.6275 | 0.665 |
| No log | 41.96 | 168 | 1.1366 | 0.7 | 0.115 | 0.1075 | 0.3525 | 0.63 | 0.68 |
| No log | 42.96 | 172 | 1.1699 | 0.69 | 0.115 | 0.1075 | 0.3575 | 0.63 | 0.6825 |
| No log | 43.96 | 176 | 1.1548 | 0.7025 | 0.1125 | 0.1075 | 0.3525 | 0.6325 | 0.6725 |
| No log | 44.96 | 180 | 1.1628 | 0.6925 | 0.11 | 0.1075 | 0.3575 | 0.635 | 0.675 |
| No log | 45.96 | 184 | 1.1620 | 0.695 | 0.11 | 0.1075 | 0.355 | 0.6325 | 0.6875 |
| No log | 46.96 | 188 | 1.1668 | 0.695 | 0.1125 | 0.1075 | 0.3525 | 0.645 | 0.68 |
| No log | 47.96 | 192 | 1.1595 | 0.6975 | 0.11 | 0.1075 | 0.3475 | 0.635 | 0.6875 |
| No log | 48.96 | 196 | 1.1622 | 0.7025 | 0.11 | 0.1075 | 0.355 | 0.63 | 0.68 |
| No log | 49.96 | 200 | 1.1779 | 0.695 | 0.1075 | 0.1075 | 0.3575 | 0.635 | 0.685 |
| No log | 50.96 | 204 | 1.1656 | 0.695 | 0.11 | 0.1075 | 0.3525 | 0.635 | 0.685 |
| No log | 51.96 | 208 | 1.1536 | 0.705 | 0.1075 | 0.1075 | 0.355 | 0.635 | 0.69 |
| No log | 52.96 | 212 | 1.1675 | 0.7025 | 0.1075 | 0.11 | 0.355 | 0.635 | 0.6975 |
| No log | 53.96 | 216 | 1.1775 | 0.6925 | 0.1075 | 0.11 | 0.3575 | 0.6325 | 0.6925 |
| No log | 54.96 | 220 | 1.1690 | 0.7 | 0.1075 | 0.11 | 0.36 | 0.6375 | 0.685 |
| No log | 55.96 | 224 | 1.1700 | 0.7 | 0.11 | 0.11 | 0.3625 | 0.64 | 0.69 |
| No log | 56.96 | 228 | 1.1637 | 0.7025 | 0.11 | 0.11 | 0.3625 | 0.64 | 0.6875 |
| No log | 57.96 | 232 | 1.1640 | 0.695 | 0.11 | 0.11 | 0.3625 | 0.6375 | 0.6875 |
| No log | 58.96 | 236 | 1.1663 | 0.6975 | 0.11 | 0.11 | 0.3625 | 0.6375 | 0.6875 |
| No log | 59.96 | 240 | 1.1687 | 0.695 | 0.11 | 0.11 | 0.3625 | 0.6375 | 0.69 |
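The "ent_gates" in the model name suggests entropy-gated early exiting at inference: each intermediate head's prediction entropy is compared against a threshold, and the first sufficiently confident exit is taken (note that Exit 4, at 0.69, nearly matches the final classifier's 0.695). The gating logic is not documented here; the sketch below only illustrates the general mechanism, with `exit_logits` and `threshold` as assumed inputs:

```python
from typing import List

import torch

def prediction_entropy(logits: torch.Tensor) -> torch.Tensor:
    # Shannon entropy of the softmax distribution; lower means more confident.
    log_probs = torch.log_softmax(logits, dim=-1)
    return -(log_probs.exp() * log_probs).sum(dim=-1)

def early_exit_predict(exit_logits: List[torch.Tensor], threshold: float = 0.5) -> int:
    # Take the first exit whose entropy falls below the threshold,
    # falling back to the deepest exit. `exit_logits` holds one logits
    # vector per exit head for a single example (shallow to deep).
    for logits in exit_logits:
        if prediction_entropy(logits).item() < threshold:
            return int(logits.argmax(dim=-1))
    return int(exit_logits[-1].argmax(dim=-1))
```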
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2