# perioli_manifesti_v5.8.7
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the sroie dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.0258
- Precision: 0.9632
- Recall: 0.9727
- F1: 0.9680
- Accuracy: 0.9956
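
The checkpoint can be loaded for token classification on document images through the standard Transformers API. Below is a minimal usage sketch; the local checkpoint path `perioli_manifesti_v5.8.7`, the input image file name, and the reliance on Tesseract OCR (via `apply_ocr=True`) are assumptions for illustration, not details taken from this training setup.

```python
# Minimal inference sketch. Assumes the fine-tuned checkpoint is available under
# "perioli_manifesti_v5.8.7" (hypothetical path) and that pytesseract/Tesseract is
# installed so the processor can run OCR on the document image.
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base", apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained("perioli_manifesti_v5.8.7")  # hypothetical path

image = Image.open("receipt.png").convert("RGB")  # any scanned document image
encoding = processor(image, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**encoding)

# Map each token to its predicted label id, then to the label name stored in the config.
predictions = outputs.logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
print([(tok, model.config.id2label[p]) for tok, p in zip(tokens, predictions)])
```

If words and bounding boxes come from a different OCR engine, pass them to the processor explicitly and load it with `apply_ocr=False` instead.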
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 6000
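
For reference, here is a hedged sketch of how these values map onto `TrainingArguments` in Transformers 4.28. The output directory and the 100-step evaluation cadence (inferred from the results table below) are assumptions; the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="perioli_manifesti_v5.8.7",  # hypothetical output directory
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    max_steps=6000,                # "training_steps" above
    lr_scheduler_type="linear",
    evaluation_strategy="steps",   # evaluation every 100 steps, per the results table
    eval_steps=100,
)
```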
### Training results
Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
---|---|---|---|---|---|---|---|
No log | 0.4 | 100 | 0.1526 | 0.7156 | 0.7662 | 0.7400 | 0.9540 |
No log | 0.81 | 200 | 0.0693 | 0.8493 | 0.9013 | 0.8745 | 0.9796 |
No log | 1.21 | 300 | 0.0490 | 0.8860 | 0.9297 | 0.9073 | 0.9864 |
No log | 1.62 | 400 | 0.0510 | 0.8810 | 0.9356 | 0.9075 | 0.9863 |
0.1511 | 2.02 | 500 | 0.0377 | 0.9174 | 0.9479 | 0.9324 | 0.9900 |
0.1511 | 2.43 | 600 | 0.0396 | 0.8996 | 0.9447 | 0.9216 | 0.9887 |
0.1511 | 2.83 | 700 | 0.0363 | 0.9142 | 0.9463 | 0.9299 | 0.9897 |
0.1511 | 3.24 | 800 | 0.0261 | 0.9503 | 0.9506 | 0.9504 | 0.9930 |
0.1511 | 3.64 | 900 | 0.0272 | 0.9489 | 0.9605 | 0.9547 | 0.9936 |
0.0289 | 4.05 | 1000 | 0.0283 | 0.9436 | 0.9641 | 0.9537 | 0.9935 |
0.0289 | 4.45 | 1100 | 0.0261 | 0.9500 | 0.9684 | 0.9591 | 0.9943 |
0.0289 | 4.86 | 1200 | 0.0250 | 0.9615 | 0.9656 | 0.9635 | 0.9949 |
0.0289 | 5.26 | 1300 | 0.0296 | 0.9437 | 0.9664 | 0.9549 | 0.9938 |
0.0289 | 5.67 | 1400 | 0.0268 | 0.9539 | 0.9648 | 0.9594 | 0.9942 |
0.0173 | 6.07 | 1500 | 0.0253 | 0.9526 | 0.9684 | 0.9604 | 0.9945 |
0.0173 | 6.48 | 1600 | 0.0257 | 0.9537 | 0.9688 | 0.9612 | 0.9948 |
0.0173 | 6.88 | 1700 | 0.0228 | 0.9566 | 0.9672 | 0.9619 | 0.9947 |
0.0173 | 7.29 | 1800 | 0.0273 | 0.9537 | 0.9688 | 0.9612 | 0.9946 |
0.0173 | 7.69 | 1900 | 0.0257 | 0.9548 | 0.9668 | 0.9608 | 0.9947 |
0.0117 | 8.1 | 2000 | 0.0256 | 0.9590 | 0.9688 | 0.9639 | 0.9949 |
0.0117 | 8.5 | 2100 | 0.0244 | 0.9591 | 0.9724 | 0.9657 | 0.9953 |
0.0117 | 8.91 | 2200 | 0.0257 | 0.9592 | 0.9743 | 0.9667 | 0.9954 |
0.0117 | 9.31 | 2300 | 0.0244 | 0.9613 | 0.9716 | 0.9664 | 0.9954 |
0.0117 | 9.72 | 2400 | 0.0240 | 0.9550 | 0.9641 | 0.9595 | 0.9945 |
0.0088 | 10.12 | 2500 | 0.0230 | 0.9628 | 0.9712 | 0.9670 | 0.9955 |
0.0088 | 10.53 | 2600 | 0.0231 | 0.9607 | 0.9739 | 0.9672 | 0.9955 |
0.0088 | 10.93 | 2700 | 0.0230 | 0.9625 | 0.9735 | 0.9680 | 0.9957 |
0.0088 | 11.34 | 2800 | 0.0253 | 0.9602 | 0.9731 | 0.9667 | 0.9954 |
0.0088 | 11.74 | 2900 | 0.0258 | 0.9618 | 0.9739 | 0.9678 | 0.9956 |
0.006 | 12.15 | 3000 | 0.0215 | 0.9656 | 0.9743 | 0.9699 | 0.9959 |
0.006 | 12.55 | 3100 | 0.0264 | 0.9509 | 0.9704 | 0.9605 | 0.9946 |
0.006 | 12.96 | 3200 | 0.0234 | 0.9564 | 0.9704 | 0.9633 | 0.9950 |
0.006 | 13.36 | 3300 | 0.0248 | 0.9569 | 0.9724 | 0.9645 | 0.9951 |
0.006 | 13.77 | 3400 | 0.0218 | 0.9637 | 0.9739 | 0.9688 | 0.9957 |
0.0052 | 14.17 | 3500 | 0.0204 | 0.9659 | 0.9743 | 0.9701 | 0.9959 |
0.0052 | 14.57 | 3600 | 0.0271 | 0.9565 | 0.9727 | 0.9646 | 0.9952 |
0.0052 | 14.98 | 3700 | 0.0274 | 0.9551 | 0.9735 | 0.9642 | 0.9953 |
0.0052 | 15.38 | 3800 | 0.0239 | 0.9603 | 0.9747 | 0.9675 | 0.9958 |
0.0052 | 15.79 | 3900 | 0.0251 | 0.9622 | 0.9759 | 0.9690 | 0.9959 |
0.0036 | 16.19 | 4000 | 0.0246 | 0.9565 | 0.9727 | 0.9646 | 0.9953 |
0.0036 | 16.6 | 4100 | 0.0263 | 0.9606 | 0.9735 | 0.9670 | 0.9956 |
0.0036 | 17.0 | 4200 | 0.0260 | 0.9607 | 0.9759 | 0.9683 | 0.9958 |
0.0036 | 17.41 | 4300 | 0.0258 | 0.9618 | 0.9751 | 0.9684 | 0.9958 |
0.0036 | 17.81 | 4400 | 0.0257 | 0.9606 | 0.9727 | 0.9666 | 0.9955 |
0.0031 | 18.22 | 4500 | 0.0252 | 0.9652 | 0.9743 | 0.9697 | 0.9959 |
0.0031 | 18.62 | 4600 | 0.0232 | 0.9679 | 0.9751 | 0.9715 | 0.9961 |
0.0031 | 19.03 | 4700 | 0.0268 | 0.9569 | 0.9731 | 0.9650 | 0.9952 |
0.0031 | 19.43 | 4800 | 0.0255 | 0.9606 | 0.9727 | 0.9666 | 0.9955 |
0.0031 | 19.84 | 4900 | 0.0263 | 0.9590 | 0.9708 | 0.9649 | 0.9954 |
0.0023 | 20.24 | 5000 | 0.0274 | 0.9591 | 0.9720 | 0.9655 | 0.9954 |
0.0023 | 20.65 | 5100 | 0.0251 | 0.9659 | 0.9735 | 0.9697 | 0.9959 |
0.0023 | 21.05 | 5200 | 0.0255 | 0.9670 | 0.9731 | 0.9701 | 0.9959 |
0.0023 | 21.46 | 5300 | 0.0251 | 0.9625 | 0.9727 | 0.9676 | 0.9956 |
0.0023 | 21.86 | 5400 | 0.0270 | 0.9610 | 0.9720 | 0.9664 | 0.9954 |
0.0018 | 22.27 | 5500 | 0.0270 | 0.9598 | 0.9724 | 0.9661 | 0.9954 |
0.0018 | 22.67 | 5600 | 0.0249 | 0.9651 | 0.9731 | 0.9691 | 0.9959 |
0.0018 | 23.08 | 5700 | 0.0260 | 0.9633 | 0.9731 | 0.9682 | 0.9956 |
0.0018 | 23.48 | 5800 | 0.0261 | 0.9610 | 0.9731 | 0.9670 | 0.9955 |
0.0018 | 23.89 | 5900 | 0.0259 | 0.9621 | 0.9724 | 0.9672 | 0.9955 |
0.0015 | 24.29 | 6000 | 0.0258 | 0.9632 | 0.9727 | 0.9680 | 0.9956 |
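
The precision, recall, F1, and accuracy columns are entity-level metrics of the kind typically produced with seqeval for token-classification fine-tuning. A minimal `compute_metrics` sketch under that assumption is shown below, using the `evaluate` library's seqeval metric; the `label_list` of SROIE-style IOB tags is illustrative, not taken from this training run.

```python
import numpy as np
import evaluate

metric = evaluate.load("seqeval")

# Hypothetical SROIE-style IOB label names; the real list comes from the dataset's features.
label_list = ["O", "B-COMPANY", "I-COMPANY", "B-DATE", "I-DATE",
              "B-ADDRESS", "I-ADDRESS", "B-TOTAL", "I-TOTAL"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Keep only positions whose reference label is not -100 (the ignore index used
    # for special tokens and non-first subword pieces).
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = metric.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```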
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.2.2
- Tokenizers 0.13.3