# perioli_manifesti_v5.8.2
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the sroie dataset. It achieves the following results on the evaluation set (a short note on how such metrics are typically computed follows the list):
- Loss: 0.0300
- Precision: 0.9647
- Recall: 0.9743
- F1: 0.9695
- Accuracy: 0.9956
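
The precision, recall, F1, and accuracy above appear to be the entity-level scores produced by a seqeval-style evaluation, as is customary for token-classification fine-tuning scripts; the toy sketch below illustrates how such numbers are computed. The label names are placeholders, not this model's actual tag set, and the use of the `evaluate`/`seqeval` packages is an assumption rather than a detail taken from the original training run.

```python
# Toy illustration of seqeval-style entity-level metrics
# (assumes the `evaluate` and `seqeval` packages are installed).
import evaluate

seqeval = evaluate.load("seqeval")

# Placeholder IOB2 label sequences, not the model's real tag set.
predictions = [["O", "B-COMPANY", "I-COMPANY", "O", "B-TOTAL"]]
references  = [["O", "B-COMPANY", "I-COMPANY", "O", "B-DATE"]]

scores = seqeval.compute(predictions=predictions, references=references)
print(scores["overall_precision"], scores["overall_recall"],
      scores["overall_f1"], scores["overall_accuracy"])
```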
## Model description
More information needed
## Intended uses & limitations
More information needed
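
No usage guidance was provided with this card. As a starting point, the sketch below shows one plausible way to run inference with this checkpoint for token classification on a document image; the checkpoint identifier, the reliance on the processor's built-in OCR (which requires `pytesseract` and the Tesseract binary), and the input file are assumptions rather than details confirmed by this card.

```python
# Hedged inference sketch, assuming this checkpoint is a LayoutLMv3
# token-classification model. Identifiers and file names are illustrative.
from PIL import Image
from transformers import AutoModelForTokenClassification, AutoProcessor

model_id = "perioli_manifesti_v5.8.2"  # hypothetical local path or Hub id

# With apply_ocr=True the processor extracts words and boxes itself
# (requires pytesseract and the Tesseract binary to be installed).
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base", apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained(model_id)

image = Image.open("document.png").convert("RGB")  # placeholder input image
encoding = processor(image, truncation=True, return_tensors="pt")

logits = model(**encoding).logits  # shape: (1, sequence_length, num_labels)
pred_ids = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[i] for i in pred_ids])
```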
## Training and evaluation data
The model was fine-tuned and evaluated on the sroie dataset noted above; further details about data preparation and the train/evaluation split are not available in this card.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 6000
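
For reference, these settings map onto `transformers.TrainingArguments` roughly as sketched below. The output directory and the evaluation cadence are assumptions (the 100-step evaluation interval is inferred from the results table), not values reported above.

```python
# Hedged reconstruction of the reported hyperparameters as TrainingArguments
# (Transformers 4.28). output_dir and eval_steps are assumptions; eval_steps=100
# is inferred from the evaluation rows in the results table below.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="perioli_manifesti_v5.8.2",  # hypothetical
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=6000,
    evaluation_strategy="steps",  # assumption: matches the per-100-step eval rows
    eval_steps=100,
)
```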
### Training results
Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
---|---|---|---|---|---|---|---|
No log | 0.4 | 100 | 0.1197 | 0.8324 | 0.7696 | 0.7998 | 0.9748 |
No log | 0.81 | 200 | 0.0613 | 0.8586 | 0.9123 | 0.8846 | 0.9858 |
No log | 1.21 | 300 | 0.0387 | 0.9068 | 0.9450 | 0.9255 | 0.9907 |
No log | 1.62 | 400 | 0.0238 | 0.9459 | 0.9602 | 0.9530 | 0.9942 |
0.1421 | 2.02 | 500 | 0.0224 | 0.9622 | 0.9673 | 0.9647 | 0.9953 |
0.1421 | 2.43 | 600 | 0.0235 | 0.9617 | 0.9684 | 0.9650 | 0.9952 |
0.1421 | 2.83 | 700 | 0.0222 | 0.9644 | 0.9673 | 0.9658 | 0.9950 |
0.1421 | 3.24 | 800 | 0.0271 | 0.9506 | 0.9673 | 0.9588 | 0.9945 |
0.1421 | 3.64 | 900 | 0.0273 | 0.9420 | 0.9591 | 0.9504 | 0.9936 |
0.0199 | 4.05 | 1000 | 0.0264 | 0.9511 | 0.9667 | 0.9588 | 0.9945 |
0.0199 | 4.45 | 1100 | 0.0219 | 0.9617 | 0.9684 | 0.9650 | 0.9953 |
0.0199 | 4.86 | 1200 | 0.0238 | 0.9572 | 0.9684 | 0.9628 | 0.9951 |
0.0199 | 5.26 | 1300 | 0.0259 | 0.9632 | 0.9655 | 0.9644 | 0.9953 |
0.0199 | 5.67 | 1400 | 0.0226 | 0.9554 | 0.9649 | 0.9601 | 0.9948 |
0.0116 | 6.07 | 1500 | 0.0205 | 0.9632 | 0.9655 | 0.9644 | 0.9952 |
0.0116 | 6.48 | 1600 | 0.0239 | 0.9674 | 0.9731 | 0.9703 | 0.9957 |
0.0116 | 6.88 | 1700 | 0.0250 | 0.9600 | 0.9684 | 0.9642 | 0.9953 |
0.0116 | 7.29 | 1800 | 0.0236 | 0.9651 | 0.9713 | 0.9682 | 0.9957 |
0.0116 | 7.69 | 1900 | 0.0264 | 0.9589 | 0.9696 | 0.9642 | 0.9954 |
0.0089 | 8.1 | 2000 | 0.0274 | 0.9623 | 0.9702 | 0.9662 | 0.9950 |
0.0089 | 8.5 | 2100 | 0.0255 | 0.9685 | 0.9702 | 0.9693 | 0.9957 |
0.0089 | 8.91 | 2200 | 0.0295 | 0.9667 | 0.9690 | 0.9679 | 0.9953 |
0.0089 | 9.31 | 2300 | 0.0235 | 0.9703 | 0.9754 | 0.9729 | 0.9963 |
0.0089 | 9.72 | 2400 | 0.0255 | 0.9640 | 0.9719 | 0.9680 | 0.9955 |
0.0062 | 10.12 | 2500 | 0.0250 | 0.9630 | 0.9731 | 0.9680 | 0.9956 |
0.0062 | 10.53 | 2600 | 0.0276 | 0.9601 | 0.9708 | 0.9654 | 0.9955 |
0.0062 | 10.93 | 2700 | 0.0272 | 0.9680 | 0.9743 | 0.9711 | 0.9960 |
0.0062 | 11.34 | 2800 | 0.0291 | 0.9674 | 0.9713 | 0.9694 | 0.9955 |
0.0062 | 11.74 | 2900 | 0.0259 | 0.9697 | 0.9743 | 0.9720 | 0.9960 |
0.0041 | 12.15 | 3000 | 0.0248 | 0.9680 | 0.9725 | 0.9702 | 0.9960 |
0.0041 | 12.55 | 3100 | 0.0287 | 0.9640 | 0.9696 | 0.9668 | 0.9951 |
0.0041 | 12.96 | 3200 | 0.0300 | 0.9703 | 0.9731 | 0.9717 | 0.9957 |
0.0041 | 13.36 | 3300 | 0.0262 | 0.9679 | 0.9708 | 0.9693 | 0.9955 |
0.0041 | 13.77 | 3400 | 0.0265 | 0.9692 | 0.9749 | 0.9720 | 0.9957 |
0.0033 | 14.17 | 3500 | 0.0295 | 0.9662 | 0.9696 | 0.9679 | 0.9955 |
0.0033 | 14.57 | 3600 | 0.0301 | 0.9680 | 0.9743 | 0.9711 | 0.9958 |
0.0033 | 14.98 | 3700 | 0.0280 | 0.9686 | 0.9743 | 0.9714 | 0.9957 |
0.0033 | 15.38 | 3800 | 0.0300 | 0.9669 | 0.9749 | 0.9709 | 0.9958 |
0.0033 | 15.79 | 3900 | 0.0319 | 0.9617 | 0.9696 | 0.9656 | 0.9954 |
0.0018 | 16.19 | 4000 | 0.0263 | 0.9658 | 0.9737 | 0.9697 | 0.9958 |
0.0018 | 16.6 | 4100 | 0.0273 | 0.9692 | 0.9754 | 0.9723 | 0.9958 |
0.0018 | 17.0 | 4200 | 0.0273 | 0.9675 | 0.9749 | 0.9712 | 0.9962 |
0.0018 | 17.41 | 4300 | 0.0298 | 0.9669 | 0.9749 | 0.9709 | 0.9956 |
0.0018 | 17.81 | 4400 | 0.0318 | 0.9646 | 0.9725 | 0.9685 | 0.9956 |
0.0015 | 18.22 | 4500 | 0.0301 | 0.9669 | 0.9737 | 0.9703 | 0.9956 |
0.0015 | 18.62 | 4600 | 0.0302 | 0.9680 | 0.9737 | 0.9708 | 0.9956 |
0.0015 | 19.03 | 4700 | 0.0298 | 0.9664 | 0.9743 | 0.9703 | 0.9956 |
0.0015 | 19.43 | 4800 | 0.0286 | 0.9664 | 0.9749 | 0.9706 | 0.9958 |
0.0015 | 19.84 | 4900 | 0.0297 | 0.9658 | 0.9754 | 0.9706 | 0.9958 |
0.0009 | 20.24 | 5000 | 0.0261 | 0.9681 | 0.9754 | 0.9717 | 0.9962 |
0.0009 | 20.65 | 5100 | 0.0283 | 0.9653 | 0.9766 | 0.9709 | 0.9958 |
0.0009 | 21.05 | 5200 | 0.0302 | 0.9653 | 0.9766 | 0.9709 | 0.9958 |
0.0009 | 21.46 | 5300 | 0.0316 | 0.9642 | 0.9754 | 0.9698 | 0.9957 |
0.0009 | 21.86 | 5400 | 0.0297 | 0.9652 | 0.9737 | 0.9694 | 0.9957 |
0.0007 | 22.27 | 5500 | 0.0298 | 0.9652 | 0.9737 | 0.9694 | 0.9957 |
0.0007 | 22.67 | 5600 | 0.0297 | 0.9652 | 0.9737 | 0.9694 | 0.9955 |
0.0007 | 23.08 | 5700 | 0.0299 | 0.9647 | 0.9743 | 0.9695 | 0.9956 |
0.0007 | 23.48 | 5800 | 0.0300 | 0.9647 | 0.9749 | 0.9697 | 0.9957 |
0.0007 | 23.89 | 5900 | 0.0300 | 0.9647 | 0.9749 | 0.9697 | 0.9957 |
0.0006 | 24.29 | 6000 | 0.0300 | 0.9647 | 0.9743 | 0.9695 | 0.9956 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.2.2
- Tokenizers 0.13.3