

# DatasetSinergyRhenus

This model is a fine-tuned version of [microsoft/layoutlmv3-large](https://huggingface.co/microsoft/layoutlmv3-large) on the sroie dataset. It achieves the following results on the evaluation set (final checkpoint, step 5000):

- Loss: 0.2981
- Precision: 0.8851
- Recall: 0.8763
- F1: 0.8807
- Accuracy: 0.9709

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.19  | 100  | 0.3083          | 0.6238    | 0.7709 | 0.6896 | 0.8937   |
| No log        | 2.38  | 200  | 0.1983          | 0.7691    | 0.8244 | 0.7958 | 0.9281   |
| No log        | 3.57  | 300  | 0.2468          | 0.7690    | 0.8462 | 0.8057 | 0.9213   |
| No log        | 4.76  | 400  | 0.1565          | 0.8412    | 0.8595 | 0.8503 | 0.9614   |
| 0.2937        | 5.95  | 500  | 0.1671          | 0.8238    | 0.8445 | 0.8340 | 0.9577   |
| 0.2937        | 7.14  | 600  | 0.1665          | 0.8440    | 0.8595 | 0.8517 | 0.9594   |
| 0.2937        | 8.33  | 700  | 0.1679          | 0.8571    | 0.8528 | 0.8550 | 0.9628   |
| 0.2937        | 9.52  | 800  | 0.1669          | 0.8656    | 0.8512 | 0.8583 | 0.9611   |
| 0.2937        | 10.71 | 900  | 0.1579          | 0.8765    | 0.8662 | 0.8713 | 0.9680   |
| 0.1075        | 11.9  | 1000 | 0.1883          | 0.8656    | 0.8512 | 0.8583 | 0.9633   |
| 0.1075        | 13.1  | 1100 | 0.1873          | 0.8765    | 0.8662 | 0.8713 | 0.9592   |
| 0.1075        | 14.29 | 1200 | 0.1725          | 0.8524    | 0.8595 | 0.8560 | 0.9668   |
| 0.1075        | 15.48 | 1300 | 0.1690          | 0.8679    | 0.8679 | 0.8679 | 0.9650   |
| 0.1075        | 16.67 | 1400 | 0.1959          | 0.8825    | 0.8662 | 0.8743 | 0.9668   |
| 0.0637        | 17.86 | 1500 | 0.1919          | 0.8723    | 0.8679 | 0.8701 | 0.9638   |
| 0.0637        | 19.05 | 1600 | 0.2020          | 0.8780    | 0.8662 | 0.8721 | 0.9663   |
| 0.0637        | 20.24 | 1700 | 0.2093          | 0.8716    | 0.8512 | 0.8613 | 0.9641   |
| 0.0637        | 21.43 | 1800 | 0.2184          | 0.8716    | 0.8629 | 0.8672 | 0.9643   |
| 0.0637        | 22.62 | 1900 | 0.2204          | 0.8576    | 0.8562 | 0.8569 | 0.9631   |
| 0.0452        | 23.81 | 2000 | 0.2478          | 0.8591    | 0.8562 | 0.8576 | 0.9621   |
| 0.0452        | 25.0  | 2100 | 0.2506          | 0.8769    | 0.8579 | 0.8673 | 0.9665   |
| 0.0452        | 26.19 | 2200 | 0.2270          | 0.8862    | 0.8729 | 0.8795 | 0.9690   |
| 0.0452        | 27.38 | 2300 | 0.2544          | 0.8790    | 0.8629 | 0.8709 | 0.9646   |
| 0.0452        | 28.57 | 2400 | 0.2251          | 0.8735    | 0.8662 | 0.8699 | 0.9643   |
| 0.0313        | 29.76 | 2500 | 0.2597          | 0.8668    | 0.8595 | 0.8631 | 0.9633   |
| 0.0313        | 30.95 | 2600 | 0.2635          | 0.8670    | 0.8612 | 0.8641 | 0.9643   |
| 0.0313        | 32.14 | 2700 | 0.2493          | 0.8752    | 0.8679 | 0.8715 | 0.9665   |
| 0.0313        | 33.33 | 2800 | 0.2565          | 0.8797    | 0.8679 | 0.8737 | 0.9660   |
| 0.0313        | 34.52 | 2900 | 0.2626          | 0.8831    | 0.8712 | 0.8771 | 0.9672   |
| 0.0218        | 35.71 | 3000 | 0.2750          | 0.8639    | 0.8595 | 0.8617 | 0.9650   |
| 0.0218        | 36.9  | 3100 | 0.2683          | 0.8682    | 0.8595 | 0.8639 | 0.9660   |
| 0.0218        | 38.1  | 3200 | 0.2751          | 0.8724    | 0.8579 | 0.8651 | 0.9660   |
| 0.0218        | 39.29 | 3300 | 0.2851          | 0.8746    | 0.8629 | 0.8687 | 0.9655   |
| 0.0218        | 40.48 | 3400 | 0.2737          | 0.8805    | 0.8629 | 0.8716 | 0.9692   |
| 0.0111        | 41.67 | 3500 | 0.2638          | 0.8773    | 0.8729 | 0.8751 | 0.9699   |
| 0.0111        | 42.86 | 3600 | 0.2773          | 0.8879    | 0.8746 | 0.8812 | 0.9692   |
| 0.0111        | 44.05 | 3700 | 0.2829          | 0.8759    | 0.8612 | 0.8685 | 0.9653   |
| 0.0111        | 45.24 | 3800 | 0.2730          | 0.8739    | 0.8696 | 0.8718 | 0.9699   |
| 0.0111        | 46.43 | 3900 | 0.2873          | 0.8767    | 0.8679 | 0.8723 | 0.9687   |
| 0.0039        | 47.62 | 4000 | 0.2797          | 0.8788    | 0.8729 | 0.8758 | 0.9690   |
| 0.0039        | 48.81 | 4100 | 0.2769          | 0.8805    | 0.8746 | 0.8775 | 0.9707   |
| 0.0039        | 50.0  | 4200 | 0.2842          | 0.8818    | 0.8612 | 0.8714 | 0.9694   |
| 0.0039        | 51.19 | 4300 | 0.2837          | 0.8822    | 0.8763 | 0.8792 | 0.9712   |
| 0.0039        | 52.38 | 4400 | 0.2895          | 0.8767    | 0.8679 | 0.8723 | 0.9704   |
| 0.0022        | 53.57 | 4500 | 0.2901          | 0.8822    | 0.8763 | 0.8792 | 0.9712   |
| 0.0022        | 54.76 | 4600 | 0.2950          | 0.8851    | 0.8763 | 0.8807 | 0.9709   |
| 0.0022        | 55.95 | 4700 | 0.2977          | 0.8851    | 0.8763 | 0.8807 | 0.9709   |
| 0.0022        | 57.14 | 4800 | 0.2984          | 0.8851    | 0.8763 | 0.8807 | 0.9709   |
| 0.0022        | 58.33 | 4900 | 0.2983          | 0.8851    | 0.8763 | 0.8807 | 0.9709   |
| 0.0013        | 59.52 | 5000 | 0.2981          | 0.8851    | 0.8763 | 0.8807 | 0.9709   |
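As a quick sanity check on the table above: each reported F1 is the harmonic mean of that row's precision and recall, and the best validation F1 (0.8812) occurs at step 3600 rather than at the final checkpoint. A minimal sketch in plain Python (row values are copied from the table; the helper name is illustrative):

```python
# Sanity-check a few rows of the evaluation table above.
def harmonic_f1(precision: float, recall: float) -> float:
    """F1 as the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# (step, precision, recall, reported_f1) copied from the table
rows = [
    (100, 0.6238, 0.7709, 0.6896),
    (3600, 0.8879, 0.8746, 0.8812),
    (5000, 0.8851, 0.8763, 0.8807),
]

# Each reported F1 matches the harmonic mean to 4 decimal places
for step, p, r, f1 in rows:
    assert abs(harmonic_f1(p, r) - f1) < 5e-4, step

# The best validation F1 among these checkpoints is at step 3600,
# not at the final step 5000
best_step = max(rows, key=lambda row: row[3])[0]
print(best_step)  # 3600
```

This also illustrates why loading the step-3600 checkpoint (if it was saved) could be preferable when F1 is the selection metric, since validation loss keeps rising after roughly step 2000 while F1 plateaus.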

### Framework versions