# perioli_manifesti_v5.7

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the sroie dataset. It achieves the following results on the evaluation set:
- F1: 0.9973
- Loss: 0.0153
- Overall Precision: 0.9793
- Overall Recall: 0.9793
- Overall F1: 0.9793
- Overall Accuracy: 0.9973

Per-label results on the evaluation set:

| Label | Precision | Recall | F1 | Support |
|:--|--:|--:|--:|--:|
| Container id | 0.9941 | 0.9713 | 0.9826 | 174 |
| Seal number | 0.9944 | 1.0000 | 0.9972 | 178 |
| Container quantity | 1.0000 | 1.0000 | 1.0000 | 72 |
| Container type | 0.9942 | 0.9942 | 0.9942 | 172 |
| Tare | 0.9712 | 0.9574 | 0.9643 | 141 |
| Package quantity | 0.9651 | 0.9881 | 0.9765 | 168 |
| Weight | 0.9781 | 0.9571 | 0.9675 | 140 |
| Others | 0.9717 | 0.9770 | 0.9744 | 739 |
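The per-label precision, recall, and F1 values above are entity-level metrics in the style of seqeval: a predicted entity counts as correct only when its label and exact token span both match a gold entity. A minimal, dependency-free sketch of that computation (the label names in the usage example are illustrative, not the model's actual tag set):

```python
from collections import defaultdict

def extract_entities(tags):
    """Collect (label, start, end) spans from a BIO tag sequence."""
    entities = []
    start = label = None
    for i, tag in enumerate(tags + ["O"]):  # trailing "O" flushes the last span
        # Close the open span when the current tag does not continue it.
        if label is not None and (tag == "O" or tag.startswith("B-") or tag[2:] != label):
            entities.append((label, start, i))
            start = label = None
        if tag.startswith("B-"):
            start, label = i, tag[2:]
    return entities

def entity_scores(true_tags, pred_tags):
    """Per-label precision/recall/F1, counting exact span matches only."""
    gold = extract_entities(true_tags)
    pred = extract_entities(pred_tags)
    gold_set = set(gold)
    stats = defaultdict(lambda: {"tp": 0, "gold": 0, "pred": 0})
    for label, _, _ in gold:
        stats[label]["gold"] += 1
    for ent in pred:
        stats[ent[0]]["pred"] += 1
        if ent in gold_set:
            stats[ent[0]]["tp"] += 1
    out = {}
    for label, s in stats.items():
        p = s["tp"] / s["pred"] if s["pred"] else 0.0
        r = s["tp"] / s["gold"] if s["gold"] else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        out[label] = {"precision": p, "recall": r, "f1": f1, "number": s["gold"]}
    return out

# Illustrative usage with made-up labels:
true = ["B-SEAL", "I-SEAL", "O", "B-TARE"]
pred = ["B-SEAL", "O", "O", "B-TARE"]
scores = entity_scores(true, pred)  # SEAL span is truncated, so it scores 0
```

Note this sketch uses strict BIO matching; the seqeval library's default mode is slightly more lenient (it also accepts spans that open with `I-`), so scores can differ at the margins.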
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 4000
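With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 1e-05 at the start of training to 0 at step 4000. A sketch of that schedule (the zero-warmup default is an assumption, since the card does not report `warmup_steps`):

```python
LEARNING_RATE = 1e-05
TRAINING_STEPS = 4000

def linear_lr(step, base_lr=LEARNING_RATE, total_steps=TRAINING_STEPS, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, halfway through training (step 2000) the learning rate has fallen to 5e-06, which matches the behavior of the `linear` schedule in `transformers.get_linear_schedule_with_warmup`.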
### Training results
Per-label cells show precision / recall / F1. Support per label: Container id 174, Seal number 178, Container quantity 72, Container type 172, Tare 141, Package quantity 168, Weight 140, Others 739.

| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Container id | Seal number | Container quantity | Container type | Tare | Package quantity | Weight | Others |
|--:|--:|--:|--:|--:|--:|--:|--:|:--|:--|:--|:--|:--|:--|:--|:--|
| 0.1396 | 2.44 | 500 | 0.0216 | 0.9652 | 0.9652 | 0.9652 | 0.9945 | 0.9939 / 0.9368 / 0.9645 | 0.9833 / 0.9944 / 0.9888 | 0.9351 / 1.0000 / 0.9664 | 0.9711 / 0.9767 / 0.9739 | 0.9645 / 0.9645 / 0.9645 | 0.9636 / 0.9464 / 0.9550 | 0.9635 / 0.9429 / 0.9531 | 0.9572 / 0.9675 / 0.9623 |
| 0.0166 | 4.88 | 1000 | 0.0138 | 0.9737 | 0.9753 | 0.9745 | 0.9964 | 0.9884 / 0.9828 / 0.9856 | 0.9944 / 0.9944 / 0.9944 | 1.0000 / 1.0000 / 1.0000 | 0.9826 / 0.9826 / 0.9826 | 0.9710 / 0.9504 / 0.9606 | 0.9649 / 0.9821 / 0.9735 | 0.9496 / 0.9429 / 0.9462 | 0.9677 / 0.9743 / 0.9710 |
| 0.0094 | 7.32 | 1500 | 0.0136 | 0.9787 | 0.9765 | 0.9776 | 0.9970 | 0.9940 / 0.9598 / 0.9766 | 0.9944 / 1.0000 / 0.9972 | 1.0000 / 1.0000 / 1.0000 | 0.9942 / 0.9884 / 0.9913 | 0.9710 / 0.9504 / 0.9606 | 0.9706 / 0.9821 / 0.9763 | 0.9706 / 0.9429 / 0.9565 | 0.9705 / 0.9797 / 0.9751 |
| 0.0061 | 9.76 | 2000 | 0.0120 | 0.9809 | 0.9781 | 0.9795 | 0.9973 | 0.9942 / 0.9828 / 0.9884 | 0.9944 / 1.0000 / 0.9972 | 1.0000 / 1.0000 / 1.0000 | 0.9827 / 0.9884 / 0.9855 | 0.9640 / 0.9504 / 0.9571 | 0.9762 / 0.9762 / 0.9762 | 0.9778 / 0.9429 / 0.9600 | 0.9771 / 0.9797 / 0.9784 |
| 0.0041 | 12.2 | 2500 | 0.0119 | 0.9831 | 0.9798 | 0.9815 | 0.9976 | 0.9942 / 0.9770 / 0.9855 | 0.9944 / 1.0000 / 0.9972 | 1.0000 / 0.9861 / 0.9930 | 0.9884 / 0.9884 / 0.9884 | 0.9783 / 0.9574 / 0.9677 | 0.9822 / 0.9881 / 0.9852 | 0.9781 / 0.9571 / 0.9675 | 0.9771 / 0.9797 / 0.9784 |
| 0.0031 | 14.63 | 3000 | 0.0179 | 0.9732 | 0.9787 | 0.9760 | 0.9966 | 0.9942 / 0.9770 / 0.9855 | 0.9944 / 1.0000 / 0.9972 | 1.0000 / 1.0000 / 1.0000 | 0.9770 / 0.9884 / 0.9827 | 0.9783 / 0.9574 / 0.9677 | 0.9540 / 0.9881 / 0.9708 | 0.9710 / 0.9571 / 0.9640 | 0.9639 / 0.9756 / 0.9697 |
| 0.0022 | 17.07 | 3500 | 0.0162 | 0.9759 | 0.9770 | 0.9765 | 0.9970 | 0.9941 / 0.9713 / 0.9826 | 0.9944 / 1.0000 / 0.9972 | 1.0000 / 1.0000 / 1.0000 | 0.9827 / 0.9884 / 0.9855 | 0.9712 / 0.9574 / 0.9643 | 0.9595 / 0.9881 / 0.9736 | 0.9779 / 0.9500 / 0.9638 | 0.9677 / 0.9743 / 0.9710 |
| 0.0014 | 19.51 | 4000 | 0.0153 | 0.9793 | 0.9793 | 0.9793 | 0.9973 | 0.9941 / 0.9713 / 0.9826 | 0.9944 / 1.0000 / 0.9972 | 1.0000 / 1.0000 / 1.0000 | 0.9942 / 0.9942 / 0.9942 | 0.9712 / 0.9574 / 0.9643 | 0.9651 / 0.9881 / 0.9765 | 0.9781 / 0.9571 / 0.9675 | 0.9717 / 0.9770 / 0.9744 |
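Validation loss bottoms out around steps 2000–2500 and drifts slightly upward afterwards, so the final checkpoint is not necessarily the best one. A small helper for picking the lowest-loss checkpoint (the step/loss/F1 triples are copied from the training-results table; the variable names are illustrative):

```python
# (step, validation loss, overall F1), copied from the training-results table
history = [
    (500, 0.0216, 0.9652),
    (1000, 0.0138, 0.9745),
    (1500, 0.0136, 0.9776),
    (2000, 0.0120, 0.9795),
    (2500, 0.0119, 0.9815),
    (3000, 0.0179, 0.9760),
    (3500, 0.0162, 0.9765),
    (4000, 0.0153, 0.9793),
]

# Select the checkpoint with the lowest validation loss.
best_step, best_loss, best_f1 = min(history, key=lambda row: row[1])
```

On these numbers the step-2500 checkpoint wins on both validation loss (0.0119) and overall F1 (0.9815); with `load_best_model_at_end` and an appropriate `metric_for_best_model`, the `Trainer` can do this selection automatically.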
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.2.2
- Tokenizers 0.13.3