# perioli_manifesti_v5.6_detailed
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the sroie dataset. It achieves the following results on the evaluation set (a usage sketch follows the per-entity table below):
- F1: 0.9930
- Loss: 0.0252
- Overall Precision: 0.9354
- Overall Recall: 0.9497
- Overall F1: 0.9425
- Overall Accuracy: 0.9930
Per-entity metrics (rounded to four decimals):

Entity | Precision | Recall | F1 | Number |
---|---|---|---|---|
Container id | 0.9908 | 0.9818 | 0.9863 | 110 |
Seal number | 0.9381 | 0.9725 | 0.9550 | 109 |
Container quantity | 1.0000 | 1.0000 | 1.0000 | 53 |
Container type | 0.9895 | 0.9792 | 0.9843 | 96 |
Tare | 0.7816 | 0.7816 | 0.7816 | 87 |
Package quantity | 0.9048 | 0.9406 | 0.9223 | 101 |
Weight | 1.0000 | 0.9579 | 0.9785 | 95 |
Others | 0.9266 | 0.9567 | 0.9414 | 462 |
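The card does not include a usage example, so the following is a minimal inference sketch for a LayoutLMv3 token-classification checkpoint such as this one; it is not an excerpt from the original training code. The checkpoint path, the input image filename, and the use of the processor's built-in OCR (`apply_ocr=True`, which requires `pytesseract`) are assumptions for illustration.

```python
# Minimal inference sketch (illustrative only; paths and OCR settings are assumptions).
import torch
from PIL import Image
from transformers import AutoModelForTokenClassification, AutoProcessor

model_id = "perioli_manifesti_v5.6_detailed"  # placeholder: replace with the actual Hub id or local path
processor = AutoProcessor.from_pretrained(model_id, apply_ocr=True)  # built-in OCR needs pytesseract
model = AutoModelForTokenClassification.from_pretrained(model_id)

image = Image.open("manifest_page.png").convert("RGB")  # hypothetical scanned document
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**encoding).logits  # shape: (1, sequence_length, num_labels)

predicted_ids = logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id])
```

The sketch stops at token-level labels for brevity; grouping predictions back into words would additionally use the word ids returned by the processor.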
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows this list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 3000
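
As a rough illustration, the settings above map onto `transformers.TrainingArguments` (Transformers 4.28) as sketched below; this is not the original training script. The `train_dataset`, `eval_dataset`, `processor`, `compute_metrics`, and label list are placeholders, and `eval_steps=250` is inferred from the evaluation interval in the results table. The listed Adam betas and epsilon are the Trainer defaults, so they need no explicit arguments.

```python
# Sketch of the training setup implied by the hyperparameters above (placeholders flagged inline).
from transformers import AutoModelForTokenClassification, Trainer, TrainingArguments

label_list = ["O", "B-CONTAINER_ID", "I-CONTAINER_ID"]  # illustrative only; the card does not list the tags

model = AutoModelForTokenClassification.from_pretrained(
    "microsoft/layoutlmv3-base",
    num_labels=len(label_list),
)

args = TrainingArguments(
    output_dir="perioli_manifesti_v5.6_detailed",
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    max_steps=3000,                  # "training_steps" above
    lr_scheduler_type="linear",
    evaluation_strategy="steps",
    eval_steps=250,                  # inferred from the 250-step evaluation interval below
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults, so nothing to set here.
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,      # placeholder: encoded training split
    eval_dataset=eval_dataset,        # placeholder: encoded evaluation split
    tokenizer=processor,              # placeholder: the LayoutLMv3 processor
    compute_metrics=compute_metrics,  # placeholder: seqeval-based metrics (see the sketch after the results table)
)
trainer.train()
```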
### Training results
Training Loss | Epoch | Step | F1 | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Container id | Seal number | Container quantity | Container type | Tare | Package quantity | Weight | Others |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.17 | 250 | 0.9856 | 0.0573 | 0.8748 | 0.9039 | 0.8891 | 0.9856 | {'precision': 0.981651376146789, 'recall': 0.9727272727272728, 'f1': 0.9771689497716896, 'number': 110} | {'precision': 0.990909090909091, 'recall': 1.0, 'f1': 0.995433789954338, 'number': 109} | {'precision': 0.8412698412698413, 'recall': 1.0, 'f1': 0.9137931034482758, 'number': 53} | {'precision': 0.9175257731958762, 'recall': 0.9270833333333334, 'f1': 0.9222797927461139, 'number': 96} | {'precision': 0.7, 'recall': 0.6436781609195402, 'f1': 0.6706586826347305, 'number': 87} | {'precision': 0.8189655172413793, 'recall': 0.9405940594059405, 'f1': 0.8755760368663594, 'number': 101} | {'precision': 0.875, 'recall': 0.7368421052631579, 'f1': 0.7999999999999999, 'number': 95} | {'precision': 0.8626262626262626, 'recall': 0.9242424242424242, 'f1': 0.8923719958202716, 'number': 462} |
0.1682 | 2.34 | 500 | 0.9914 | 0.0346 | 0.9111 | 0.9479 | 0.9291 | 0.9914 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9732142857142857, 'recall': 1.0, 'f1': 0.9864253393665159, 'number': 109} | {'precision': 0.9464285714285714, 'recall': 1.0, 'f1': 0.9724770642201834, 'number': 53} | {'precision': 0.9387755102040817, 'recall': 0.9583333333333334, 'f1': 0.9484536082474228, 'number': 96} | {'precision': 0.8, 'recall': 0.7816091954022989, 'f1': 0.7906976744186047, 'number': 87} | {'precision': 0.8761061946902655, 'recall': 0.9801980198019802, 'f1': 0.9252336448598131, 'number': 101} | {'precision': 0.9680851063829787, 'recall': 0.9578947368421052, 'f1': 0.962962962962963, 'number': 95} | {'precision': 0.8859470468431772, 'recall': 0.9415584415584416, 'f1': 0.912906610703043, 'number': 462} |
0.1682 | 3.5 | 750 | 0.9923 | 0.0324 | 0.9202 | 0.9533 | 0.9365 | 0.9923 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9819819819819819, 'recall': 1.0, 'f1': 0.9909090909090909, 'number': 109} | {'precision': 0.9464285714285714, 'recall': 1.0, 'f1': 0.9724770642201834, 'number': 53} | {'precision': 0.9690721649484536, 'recall': 0.9791666666666666, 'f1': 0.9740932642487047, 'number': 96} | {'precision': 0.8023255813953488, 'recall': 0.7931034482758621, 'f1': 0.7976878612716762, 'number': 87} | {'precision': 0.8672566371681416, 'recall': 0.9702970297029703, 'f1': 0.9158878504672897, 'number': 101} | {'precision': 0.989010989010989, 'recall': 0.9473684210526315, 'f1': 0.967741935483871, 'number': 95} | {'precision': 0.8979591836734694, 'recall': 0.9523809523809523, 'f1': 0.9243697478991597, 'number': 462} |
0.0266 | 4.67 | 1000 | 0.9911 | 0.0361 | 0.9004 | 0.9506 | 0.9248 | 0.9911 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9732142857142857, 'recall': 1.0, 'f1': 0.9864253393665159, 'number': 109} | {'precision': 0.8412698412698413, 'recall': 1.0, 'f1': 0.9137931034482758, 'number': 53} | {'precision': 0.92, 'recall': 0.9583333333333334, 'f1': 0.9387755102040817, 'number': 96} | {'precision': 0.8, 'recall': 0.8275862068965517, 'f1': 0.8135593220338982, 'number': 87} | {'precision': 0.8584070796460177, 'recall': 0.9603960396039604, 'f1': 0.9065420560747663, 'number': 101} | {'precision': 0.989247311827957, 'recall': 0.968421052631579, 'f1': 0.9787234042553192, 'number': 95} | {'precision': 0.8787878787878788, 'recall': 0.9415584415584416, 'f1': 0.9090909090909091, 'number': 462} |
0.0266 | 5.84 | 1250 | 0.9916 | 0.0336 | 0.9131 | 0.9443 | 0.9284 | 0.9916 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9732142857142857, 'recall': 1.0, 'f1': 0.9864253393665159, 'number': 109} | {'precision': 0.9814814814814815, 'recall': 1.0, 'f1': 0.9906542056074767, 'number': 53} | {'precision': 0.9489795918367347, 'recall': 0.96875, 'f1': 0.9587628865979382, 'number': 96} | {'precision': 0.797752808988764, 'recall': 0.8160919540229885, 'f1': 0.8068181818181818, 'number': 87} | {'precision': 0.8771929824561403, 'recall': 0.9900990099009901, 'f1': 0.9302325581395348, 'number': 101} | {'precision': 0.9215686274509803, 'recall': 0.9894736842105263, 'f1': 0.9543147208121827, 'number': 95} | {'precision': 0.8942917547568711, 'recall': 0.9155844155844156, 'f1': 0.9048128342245989, 'number': 462} |
0.0161 | 7.01 | 1500 | 0.9944 | 0.0240 | 0.9339 | 0.9650 | 0.9492 | 0.9944 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9819819819819819, 'recall': 1.0, 'f1': 0.9909090909090909, 'number': 109} | {'precision': 0.9814814814814815, 'recall': 1.0, 'f1': 0.9906542056074767, 'number': 53} | {'precision': 0.9690721649484536, 'recall': 0.9791666666666666, 'f1': 0.9740932642487047, 'number': 96} | {'precision': 0.8089887640449438, 'recall': 0.8275862068965517, 'f1': 0.8181818181818181, 'number': 87} | {'precision': 0.9174311926605505, 'recall': 0.9900990099009901, 'f1': 0.9523809523809524, 'number': 101} | {'precision': 0.9893617021276596, 'recall': 0.9789473684210527, 'f1': 0.9841269841269842, 'number': 95} | {'precision': 0.9137577002053389, 'recall': 0.9632034632034632, 'f1': 0.9378292939936775, 'number': 462} |
0.0161 | 8.18 | 1750 | 0.9940 | 0.0255 | 0.9314 | 0.9641 | 0.9475 | 0.9940 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9819819819819819, 'recall': 1.0, 'f1': 0.9909090909090909, 'number': 109} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 53} | {'precision': 0.9791666666666666, 'recall': 0.9791666666666666, 'f1': 0.9791666666666666, 'number': 96} | {'precision': 0.8089887640449438, 'recall': 0.8275862068965517, 'f1': 0.8181818181818181, 'number': 87} | {'precision': 0.8849557522123894, 'recall': 0.9900990099009901, 'f1': 0.9345794392523366, 'number': 101} | {'precision': 1.0, 'recall': 0.9789473684210527, 'f1': 0.9893617021276596, 'number': 95} | {'precision': 0.9098360655737705, 'recall': 0.961038961038961, 'f1': 0.9347368421052632, 'number': 462} |
0.0107 | 9.35 | 2000 | 0.9930 | 0.0263 | 0.9254 | 0.9470 | 0.9361 | 0.9930 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9385964912280702, 'recall': 0.981651376146789, 'f1': 0.9596412556053812, 'number': 109} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 53} | {'precision': 0.968421052631579, 'recall': 0.9583333333333334, 'f1': 0.9633507853403142, 'number': 96} | {'precision': 0.7954545454545454, 'recall': 0.8045977011494253, 'f1': 0.7999999999999999, 'number': 87} | {'precision': 0.8785046728971962, 'recall': 0.9306930693069307, 'f1': 0.9038461538461539, 'number': 101} | {'precision': 0.989010989010989, 'recall': 0.9473684210526315, 'f1': 0.967741935483871, 'number': 95} | {'precision': 0.9128630705394191, 'recall': 0.9523809523809523, 'f1': 0.9322033898305083, 'number': 462} |
0.0107 | 10.51 | 2250 | 0.9934 | 0.0212 | 0.9370 | 0.9488 | 0.9429 | 0.9934 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9380530973451328, 'recall': 0.9724770642201835, 'f1': 0.954954954954955, 'number': 109} | {'precision': 1.0, 'recall': 0.9811320754716981, 'f1': 0.9904761904761905, 'number': 53} | {'precision': 0.9789473684210527, 'recall': 0.96875, 'f1': 0.9738219895287958, 'number': 96} | {'precision': 0.8045977011494253, 'recall': 0.8045977011494253, 'f1': 0.8045977011494253, 'number': 87} | {'precision': 0.9215686274509803, 'recall': 0.9306930693069307, 'f1': 0.9261083743842364, 'number': 101} | {'precision': 1.0, 'recall': 0.9578947368421052, 'f1': 0.978494623655914, 'number': 95} | {'precision': 0.9246861924686193, 'recall': 0.9567099567099567, 'f1': 0.9404255319148936, 'number': 462} |
0.0064 | 11.68 | 2500 | 0.9928 | 0.0293 | 0.9300 | 0.9551 | 0.9424 | 0.9928 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9380530973451328, 'recall': 0.9724770642201835, 'f1': 0.954954954954955, 'number': 109} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 53} | {'precision': 0.9894736842105263, 'recall': 0.9791666666666666, 'f1': 0.9842931937172775, 'number': 96} | {'precision': 0.8089887640449438, 'recall': 0.8275862068965517, 'f1': 0.8181818181818181, 'number': 87} | {'precision': 0.8962264150943396, 'recall': 0.9405940594059405, 'f1': 0.9178743961352657, 'number': 101} | {'precision': 0.9893617021276596, 'recall': 0.9789473684210527, 'f1': 0.9841269841269842, 'number': 95} | {'precision': 0.9132231404958677, 'recall': 0.9567099567099567, 'f1': 0.9344608879492601, 'number': 462} |
0.0064 | 12.85 | 2750 | 0.9930 | 0.0252 | 0.9362 | 0.9497 | 0.9429 | 0.9930 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9380530973451328, 'recall': 0.9724770642201835, 'f1': 0.954954954954955, 'number': 109} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 53} | {'precision': 0.9894736842105263, 'recall': 0.9791666666666666, 'f1': 0.9842931937172775, 'number': 96} | {'precision': 0.7727272727272727, 'recall': 0.7816091954022989, 'f1': 0.777142857142857, 'number': 87} | {'precision': 0.9134615384615384, 'recall': 0.9405940594059405, 'f1': 0.926829268292683, 'number': 101} | {'precision': 0.989010989010989, 'recall': 0.9473684210526315, 'f1': 0.967741935483871, 'number': 95} | {'precision': 0.930672268907563, 'recall': 0.9588744588744589, 'f1': 0.9445628997867804, 'number': 462} |
0.0057 | 14.02 | 3000 | 0.9930 | 0.0252 | 0.9354 | 0.9497 | 0.9425 | 0.9930 | {'precision': 0.9908256880733946, 'recall': 0.9818181818181818, 'f1': 0.9863013698630138, 'number': 110} | {'precision': 0.9380530973451328, 'recall': 0.9724770642201835, 'f1': 0.954954954954955, 'number': 109} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 53} | {'precision': 0.9894736842105263, 'recall': 0.9791666666666666, 'f1': 0.9842931937172775, 'number': 96} | {'precision': 0.7816091954022989, 'recall': 0.7816091954022989, 'f1': 0.781609195402299, 'number': 87} | {'precision': 0.9047619047619048, 'recall': 0.9405940594059405, 'f1': 0.9223300970873787, 'number': 101} | {'precision': 1.0, 'recall': 0.9578947368421052, 'f1': 0.978494623655914, 'number': 95} | {'precision': 0.9266247379454927, 'recall': 0.9567099567099567, 'f1': 0.9414270500532481, 'number': 462} |
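
The per-entity columns in this table (and in the evaluation summary at the top) have the shape produced by the `seqeval` metric: one dict of precision, recall, F1, and support (`number`) per entity type, plus the `overall_*` aggregates. Below is a minimal sketch of a `compute_metrics` function that yields that structure; the label list is illustrative and the exact wiring of the original run is not known from the card.

```python
# Sketch of a seqeval-based compute_metrics producing per-entity dicts like the ones above.
# Requires the `seqeval` package; the label list here is illustrative only.
import numpy as np
from datasets import load_metric  # available in Datasets 2.2.2 as listed below

metric = load_metric("seqeval")
label_list = ["O", "B-CONTAINER_ID", "I-CONTAINER_ID"]  # placeholder, not the card's real tag set

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=2)

    # Keep only positions with a real label (special tokens are padded with -100).
    true_predictions = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]

    results = metric.compute(predictions=true_predictions, references=true_labels)
    # `results` contains one dict per entity type ({'precision', 'recall', 'f1', 'number'})
    # plus overall_precision / overall_recall / overall_f1 / overall_accuracy.
    return results
```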
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.2.2
- Tokenizers 0.13.3