# LayoutLMv3_maveriq_tobacco3482_2023-07-04_longer
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on what its name indicates is the Tobacco3482 document-classification dataset ([maveriq/tobacco3482](https://huggingface.co/datasets/maveriq/tobacco3482)). It achieves the following results on the evaluation set:
- Loss: 0.4933
- Accuracy: 0.915
## Model description
More information needed
## Intended uses & limitations
More information needed
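Pending more detail from the authors, inference presumably follows the standard LayoutLMv3 document-classification pattern. Below is a minimal sketch under that assumption; the repo ID and image path are placeholders, and the processor runs Tesseract OCR by default, so `pytesseract` must be installed.

```python
import torch
from PIL import Image
from transformers import AutoProcessor, LayoutLMv3ForSequenceClassification

# Placeholder repo ID; substitute the actual Hub ID of this checkpoint.
model_id = "LayoutLMv3_maveriq_tobacco3482_2023-07-04_longer"

# The LayoutLMv3 processor applies Tesseract OCR by default (requires pytesseract).
processor = AutoProcessor.from_pretrained(model_id)
model = LayoutLMv3ForSequenceClassification.from_pretrained(model_id)

image = Image.open("document.png").convert("RGB")  # placeholder input scan
inputs = processor(image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index back to its label name.
print(model.config.id2label[logits.argmax(-1).item()])
```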
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
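As a sketch, these values map onto `TrainingArguments` as shown below. The `output_dir` is a placeholder, and the listed Adam betas and epsilon are the optimizer defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv3-tobacco3482",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=32,   # 8 * 32 = 256 effective train batch size
    num_train_epochs=200,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",      # the table below logs one evaluation per epoch
)
```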
### Training results

Validation accuracy peaks at 0.93 between roughly epochs 27 and 33 (lowest validation loss, 0.3075, at epoch ~33), oscillates between 0.915 and 0.93 until about epoch 64, and then holds at 0.915 for the rest of the run, so the model converges long before the 200th epoch; the figures reported at the top of this card are the final-epoch values.
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:---:|:---:|:---:|:---:|:---:|
No log | 0.96 | 3 | 2.1414 | 0.285 |
No log | 1.96 | 6 | 2.0216 | 0.265 |
No log | 2.96 | 9 | 1.9444 | 0.265 |
No log | 3.96 | 12 | 1.8877 | 0.335 |
No log | 4.96 | 15 | 1.8160 | 0.315 |
No log | 5.96 | 18 | 1.7139 | 0.33 |
No log | 6.96 | 21 | 1.6301 | 0.36 |
No log | 7.96 | 24 | 1.5155 | 0.47 |
No log | 8.96 | 27 | 1.4009 | 0.555 |
No log | 9.96 | 30 | 1.3059 | 0.56 |
No log | 10.96 | 33 | 1.1493 | 0.67 |
No log | 11.96 | 36 | 1.0559 | 0.725 |
No log | 12.96 | 39 | 0.9505 | 0.75 |
No log | 13.96 | 42 | 0.8301 | 0.78 |
No log | 14.96 | 45 | 0.7531 | 0.775 |
No log | 15.96 | 48 | 0.7030 | 0.79 |
No log | 16.96 | 51 | 0.6294 | 0.82 |
No log | 17.96 | 54 | 0.5819 | 0.845 |
No log | 18.96 | 57 | 0.5381 | 0.87 |
No log | 19.96 | 60 | 0.4852 | 0.87 |
No log | 20.96 | 63 | 0.4581 | 0.91 |
No log | 21.96 | 66 | 0.4429 | 0.895 |
No log | 22.96 | 69 | 0.4065 | 0.915 |
No log | 23.96 | 72 | 0.4065 | 0.895 |
No log | 24.96 | 75 | 0.3598 | 0.915 |
No log | 25.96 | 78 | 0.3476 | 0.925 |
No log | 26.96 | 81 | 0.3413 | 0.93 |
No log | 27.96 | 84 | 0.3544 | 0.9 |
No log | 28.96 | 87 | 0.3239 | 0.93 |
No log | 29.96 | 90 | 0.3187 | 0.92 |
No log | 30.96 | 93 | 0.3090 | 0.92 |
No log | 31.96 | 96 | 0.3495 | 0.915 |
No log | 32.96 | 99 | 0.3075 | 0.93 |
No log | 33.96 | 102 | 0.3509 | 0.92 |
No log | 34.96 | 105 | 0.3499 | 0.925 |
No log | 35.96 | 108 | 0.3176 | 0.925 |
No log | 36.96 | 111 | 0.3260 | 0.915 |
No log | 37.96 | 114 | 0.3245 | 0.925 |
No log | 38.96 | 117 | 0.3139 | 0.92 |
No log | 39.96 | 120 | 0.3667 | 0.915 |
No log | 40.96 | 123 | 0.3410 | 0.925 |
No log | 41.96 | 126 | 0.3278 | 0.925 |
No log | 42.96 | 129 | 0.3518 | 0.925 |
No log | 43.96 | 132 | 0.3617 | 0.92 |
No log | 44.96 | 135 | 0.3642 | 0.93 |
No log | 45.96 | 138 | 0.3686 | 0.925 |
No log | 46.96 | 141 | 0.3784 | 0.92 |
No log | 47.96 | 144 | 0.3826 | 0.92 |
No log | 48.96 | 147 | 0.3734 | 0.925 |
No log | 49.96 | 150 | 0.3763 | 0.925 |
No log | 50.96 | 153 | 0.3931 | 0.92 |
No log | 51.96 | 156 | 0.3982 | 0.92 |
No log | 52.96 | 159 | 0.3960 | 0.92 |
No log | 53.96 | 162 | 0.3896 | 0.925 |
No log | 54.96 | 165 | 0.3917 | 0.925 |
No log | 55.96 | 168 | 0.4016 | 0.92 |
No log | 56.96 | 171 | 0.4098 | 0.92 |
No log | 57.96 | 174 | 0.4124 | 0.92 |
No log | 58.96 | 177 | 0.4127 | 0.92 |
No log | 59.96 | 180 | 0.4115 | 0.92 |
No log | 60.96 | 183 | 0.4134 | 0.92 |
No log | 61.96 | 186 | 0.4173 | 0.92 |
No log | 62.96 | 189 | 0.4209 | 0.92 |
No log | 63.96 | 192 | 0.4230 | 0.915 |
No log | 64.96 | 195 | 0.4259 | 0.915 |
No log | 65.96 | 198 | 0.4289 | 0.915 |
No log | 66.96 | 201 | 0.4318 | 0.915 |
No log | 67.96 | 204 | 0.4333 | 0.915 |
No log | 68.96 | 207 | 0.4325 | 0.915 |
No log | 69.96 | 210 | 0.4317 | 0.915 |
No log | 70.96 | 213 | 0.4336 | 0.915 |
No log | 71.96 | 216 | 0.4356 | 0.915 |
No log | 72.96 | 219 | 0.4372 | 0.915 |
No log | 73.96 | 222 | 0.4375 | 0.915 |
No log | 74.96 | 225 | 0.4381 | 0.915 |
No log | 75.96 | 228 | 0.4393 | 0.915 |
No log | 76.96 | 231 | 0.4418 | 0.915 |
No log | 77.96 | 234 | 0.4444 | 0.915 |
No log | 78.96 | 237 | 0.4470 | 0.915 |
No log | 79.96 | 240 | 0.4491 | 0.915 |
No log | 80.96 | 243 | 0.4492 | 0.915 |
No log | 81.96 | 246 | 0.4474 | 0.915 |
No log | 82.96 | 249 | 0.4443 | 0.915 |
No log | 83.96 | 252 | 0.4445 | 0.915 |
No log | 84.96 | 255 | 0.4477 | 0.915 |
No log | 85.96 | 258 | 0.4492 | 0.915 |
No log | 86.96 | 261 | 0.4501 | 0.915 |
No log | 87.96 | 264 | 0.4510 | 0.915 |
No log | 88.96 | 267 | 0.4520 | 0.915 |
No log | 89.96 | 270 | 0.4525 | 0.915 |
No log | 90.96 | 273 | 0.4531 | 0.915 |
No log | 91.96 | 276 | 0.4530 | 0.915 |
No log | 92.96 | 279 | 0.4518 | 0.915 |
No log | 93.96 | 282 | 0.4499 | 0.915 |
No log | 94.96 | 285 | 0.4485 | 0.915 |
No log | 95.96 | 288 | 0.4496 | 0.915 |
No log | 96.96 | 291 | 0.4525 | 0.915 |
No log | 97.96 | 294 | 0.4562 | 0.915 |
No log | 98.96 | 297 | 0.4596 | 0.915 |
No log | 99.96 | 300 | 0.4629 | 0.915 |
No log | 100.96 | 303 | 0.4639 | 0.915 |
No log | 101.96 | 306 | 0.4641 | 0.915 |
No log | 102.96 | 309 | 0.4630 | 0.915 |
No log | 103.96 | 312 | 0.4619 | 0.915 |
No log | 104.96 | 315 | 0.4624 | 0.915 |
No log | 105.96 | 318 | 0.4628 | 0.915 |
No log | 106.96 | 321 | 0.4635 | 0.915 |
No log | 107.96 | 324 | 0.4641 | 0.915 |
No log | 108.96 | 327 | 0.4650 | 0.915 |
No log | 109.96 | 330 | 0.4652 | 0.915 |
No log | 110.96 | 333 | 0.4664 | 0.915 |
No log | 111.96 | 336 | 0.4686 | 0.915 |
No log | 112.96 | 339 | 0.4718 | 0.915 |
No log | 113.96 | 342 | 0.4730 | 0.915 |
No log | 114.96 | 345 | 0.4719 | 0.915 |
No log | 115.96 | 348 | 0.4697 | 0.915 |
No log | 116.96 | 351 | 0.4676 | 0.915 |
No log | 117.96 | 354 | 0.4658 | 0.915 |
No log | 118.96 | 357 | 0.4655 | 0.915 |
No log | 119.96 | 360 | 0.4670 | 0.915 |
No log | 120.96 | 363 | 0.4695 | 0.915 |
No log | 121.96 | 366 | 0.4728 | 0.915 |
No log | 122.96 | 369 | 0.4757 | 0.915 |
No log | 123.96 | 372 | 0.4776 | 0.915 |
No log | 124.96 | 375 | 0.4782 | 0.915 |
No log | 125.96 | 378 | 0.4782 | 0.915 |
No log | 126.96 | 381 | 0.4770 | 0.915 |
No log | 127.96 | 384 | 0.4760 | 0.915 |
No log | 128.96 | 387 | 0.4754 | 0.915 |
No log | 129.96 | 390 | 0.4746 | 0.915 |
No log | 130.96 | 393 | 0.4745 | 0.915 |
No log | 131.96 | 396 | 0.4750 | 0.915 |
No log | 132.96 | 399 | 0.4756 | 0.915 |
No log | 133.96 | 402 | 0.4766 | 0.915 |
No log | 134.96 | 405 | 0.4777 | 0.915 |
No log | 135.96 | 408 | 0.4788 | 0.915 |
No log | 136.96 | 411 | 0.4799 | 0.915 |
No log | 137.96 | 414 | 0.4806 | 0.915 |
No log | 138.96 | 417 | 0.4806 | 0.915 |
No log | 139.96 | 420 | 0.4805 | 0.915 |
No log | 140.96 | 423 | 0.4796 | 0.915 |
No log | 141.96 | 426 | 0.4789 | 0.915 |
No log | 142.96 | 429 | 0.4785 | 0.915 |
No log | 143.96 | 432 | 0.4793 | 0.915 |
No log | 144.96 | 435 | 0.4805 | 0.915 |
No log | 145.96 | 438 | 0.4814 | 0.915 |
No log | 146.96 | 441 | 0.4822 | 0.915 |
No log | 147.96 | 444 | 0.4831 | 0.915 |
No log | 148.96 | 447 | 0.4840 | 0.915 |
No log | 149.96 | 450 | 0.4839 | 0.915 |
No log | 150.96 | 453 | 0.4839 | 0.915 |
No log | 151.96 | 456 | 0.4842 | 0.915 |
No log | 152.96 | 459 | 0.4843 | 0.915 |
No log | 153.96 | 462 | 0.4841 | 0.915 |
No log | 154.96 | 465 | 0.4838 | 0.915 |
No log | 155.96 | 468 | 0.4843 | 0.915 |
No log | 156.96 | 471 | 0.4848 | 0.915 |
No log | 157.96 | 474 | 0.4851 | 0.915 |
No log | 158.96 | 477 | 0.4853 | 0.915 |
No log | 159.96 | 480 | 0.4854 | 0.915 |
No log | 160.96 | 483 | 0.4857 | 0.915 |
No log | 161.96 | 486 | 0.4861 | 0.915 |
No log | 162.96 | 489 | 0.4867 | 0.915 |
No log | 163.96 | 492 | 0.4873 | 0.915 |
No log | 164.96 | 495 | 0.4884 | 0.915 |
No log | 165.96 | 498 | 0.4895 | 0.915 |
0.1894 | 166.96 | 501 | 0.4906 | 0.915 |
0.1894 | 167.96 | 504 | 0.4912 | 0.915 |
0.1894 | 168.96 | 507 | 0.4916 | 0.915 |
0.1894 | 169.96 | 510 | 0.4915 | 0.915 |
0.1894 | 170.96 | 513 | 0.4913 | 0.915 |
0.1894 | 171.96 | 516 | 0.4912 | 0.915 |
0.1894 | 172.96 | 519 | 0.4912 | 0.915 |
0.1894 | 173.96 | 522 | 0.4913 | 0.915 |
0.1894 | 174.96 | 525 | 0.4911 | 0.915 |
0.1894 | 175.96 | 528 | 0.4909 | 0.915 |
0.1894 | 176.96 | 531 | 0.4910 | 0.915 |
0.1894 | 177.96 | 534 | 0.4910 | 0.915 |
0.1894 | 178.96 | 537 | 0.4910 | 0.915 |
0.1894 | 179.96 | 540 | 0.4909 | 0.915 |
0.1894 | 180.96 | 543 | 0.4910 | 0.915 |
0.1894 | 181.96 | 546 | 0.4914 | 0.915 |
0.1894 | 182.96 | 549 | 0.4920 | 0.915 |
0.1894 | 183.96 | 552 | 0.4926 | 0.915 |
0.1894 | 184.96 | 555 | 0.4930 | 0.915 |
0.1894 | 185.96 | 558 | 0.4933 | 0.915 |
0.1894 | 186.96 | 561 | 0.4936 | 0.915 |
0.1894 | 187.96 | 564 | 0.4939 | 0.915 |
0.1894 | 188.96 | 567 | 0.4939 | 0.915 |
0.1894 | 189.96 | 570 | 0.4938 | 0.915 |
0.1894 | 190.96 | 573 | 0.4938 | 0.915 |
0.1894 | 191.96 | 576 | 0.4936 | 0.915 |
0.1894 | 192.96 | 579 | 0.4935 | 0.915 |
0.1894 | 193.96 | 582 | 0.4934 | 0.915 |
0.1894 | 194.96 | 585 | 0.4934 | 0.915 |
0.1894 | 195.96 | 588 | 0.4934 | 0.915 |
0.1894 | 196.96 | 591 | 0.4934 | 0.915 |
0.1894 | 197.96 | 594 | 0.4933 | 0.915 |
0.1894 | 198.96 | 597 | 0.4933 | 0.915 |
0.1894 | 199.96 | 600 | 0.4933 | 0.915 |
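The Validation Loss and Accuracy columns above come from the per-epoch evaluation runs. A minimal sketch of a `compute_metrics` callback that would produce the accuracy column, assuming the usual argmax-over-logits setup (the `evaluate` library here stands in for however the metric was actually computed):

```python
import numpy as np
import evaluate

accuracy_metric = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # The Trainer passes a (logits, labels) pair for each evaluation run.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy_metric.compute(predictions=predictions, references=labels)
```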
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2