---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset. It achieves the following results on the evaluation set:
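
Once the checkpoint is available locally or on the Hub, it can be loaded like any other `transformers` token-classification model. The sketch below is illustrative only: the `layoutlm-funsd` identifier, the example words, and the bounding boxes are placeholders, not values taken from this repository.

```python
import torch
from transformers import LayoutLMTokenizerFast, LayoutLMForTokenClassification

# "layoutlm-funsd" stands in for the actual checkpoint path of this model
# (local directory or Hub id); it is a placeholder, not a confirmed identifier.
model_id = "layoutlm-funsd"
tokenizer = LayoutLMTokenizerFast.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)

# LayoutLM takes words plus their bounding boxes, normalized to a 0-1000 grid.
words = ["Invoice", "Date:", "2024-01-01"]          # example tokens, not real data
word_boxes = [[48, 84, 156, 98], [160, 84, 230, 98], [234, 84, 330, 98]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# Repeat each word's box for all of its sub-word tokens; special tokens get a
# dummy [0, 0, 0, 0] box here.
bbox = []
for word_id in encoding.word_ids(0):
    bbox.append([0, 0, 0, 0] if word_id is None else word_boxes[word_id])
encoding["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    logits = model(**encoding).logits
pred_ids = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[i] for i in pred_ids])
```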

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
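
The exact dataset id and preprocessing used for this run are not recorded here. As a hedged sketch, FUNSD is commonly loaded from the Hub as shown below; the dataset id and field names (`words`, `bboxes`, `ner_tags`) follow the `nielsr/funsd` loading script and may differ for other mirrors.

```python
from datasets import load_dataset

# One commonly used FUNSD mirror on the Hugging Face Hub; some datasets
# versions may additionally require trust_remote_code=True for script-based sets.
dataset = load_dataset("nielsr/funsd")

print(dataset)                   # train/test splits
example = dataset["train"][0]
print(example["words"][:5])      # OCR'd words
print(example["bboxes"][:5])     # word-level bounding boxes
print(example["ner_tags"][:5])   # integer labels (header / question / answer / other)
```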

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
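
The concrete values were not filled in by the Trainer, so they are unknown. Purely as an illustration of where they would be set, a `TrainingArguments` sketch follows; every numeric value is a placeholder except the epoch count, which matches the 15 epochs logged in the results table below.

```python
from transformers import TrainingArguments

# Placeholder values only; the hyperparameters actually used for this
# checkpoint are not recorded in this card.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=3e-5,                 # placeholder
    per_device_train_batch_size=16,     # placeholder
    num_train_epochs=15,                # the results table below logs 15 epochs
    evaluation_strategy="epoch",        # called eval_strategy in newer transformers releases
    logging_strategy="epoch",
    save_strategy="epoch",
)
```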

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.8092 | 1.0 | 10 | 1.6181 | {'precision': 0.013850415512465374, 'recall': 0.006180469715698393, 'f1': 0.008547008547008546, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.26586102719033233, 'recall': 0.08262910798122065, 'f1': 0.12607449856733524, 'number': 1065} | 0.1344 | 0.0467 | 0.0693 | 0.3144 |
| 1.4937 | 2.0 | 20 | 1.2877 | {'precision': 0.1342925659472422, 'recall': 0.138442521631644, 'f1': 0.1363359707851491, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.34902488231338263, 'recall': 0.48732394366197185, 'f1': 0.4067398119122257, 'number': 1065} | 0.2719 | 0.3166 | 0.2925 | 0.5718 |
| 1.1349 | 3.0 | 30 | 0.9891 | {'precision': 0.45396825396825397, 'recall': 0.5302843016069221, 'f1': 0.48916761687571264, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.5338582677165354, 'recall': 0.6366197183098592, 'f1': 0.5807280513918629, 'number': 1065} | 0.4998 | 0.5554 | 0.5261 | 0.6801 |
| 0.8493 | 4.0 | 40 | 0.8234 | {'precision': 0.580259222333001, 'recall': 0.7194066749072929, 'f1': 0.6423841059602649, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.6639928698752228, 'recall': 0.6995305164319249, 'f1': 0.6812985825331503, 'number': 1065} | 0.6189 | 0.6658 | 0.6415 | 0.7323 |
| 0.6989 | 5.0 | 50 | 0.7466 | {'precision': 0.6263048016701461, 'recall': 0.7416563658838071, 'f1': 0.6791171477079797, 'number': 809} | {'precision': 0.11688311688311688, 'recall': 0.07563025210084033, 'f1': 0.09183673469387756, 'number': 119} | {'precision': 0.7050043898156277, 'recall': 0.7539906103286385, 'f1': 0.7286751361161524, 'number': 1065} | 0.6495 | 0.7085 | 0.6777 | 0.7609 |
| 0.5795 | 6.0 | 60 | 0.6809 | {'precision': 0.6511387163561076, 'recall': 0.7775030902348579, 'f1': 0.7087323943661973, 'number': 809} | {'precision': 0.13095238095238096, 'recall': 0.09243697478991597, 'f1': 0.10837438423645321, 'number': 119} | {'precision': 0.705229793977813, 'recall': 0.8356807511737089, 'f1': 0.7649333906317147, 'number': 1065} | 0.6618 | 0.7677 | 0.7108 | 0.7872 |
| 0.508 | 7.0 | 70 | 0.6786 | {'precision': 0.667375132837407, 'recall': 0.7762669962917181, 'f1': 0.7177142857142856, 'number': 809} | {'precision': 0.1875, 'recall': 0.17647058823529413, 'f1': 0.1818181818181818, 'number': 119} | {'precision': 0.7431972789115646, 'recall': 0.8206572769953052, 'f1': 0.7800089245872378, 'number': 1065} | 0.6833 | 0.7642 | 0.7215 | 0.7871 |
| 0.4558 | 8.0 | 80 | 0.6540 | {'precision': 0.6932314410480349, 'recall': 0.7849196538936959, 'f1': 0.736231884057971, 'number': 809} | {'precision': 0.24, 'recall': 0.20168067226890757, 'f1': 0.2191780821917808, 'number': 119} | {'precision': 0.7464195450716091, 'recall': 0.831924882629108, 'f1': 0.7868561278863234, 'number': 1065} | 0.7013 | 0.7752 | 0.7364 | 0.7997 |
| 0.3978 | 9.0 | 90 | 0.6650 | {'precision': 0.6969026548672567, 'recall': 0.7787391841779975, 'f1': 0.7355516637478109, 'number': 809} | {'precision': 0.2644628099173554, 'recall': 0.2689075630252101, 'f1': 0.2666666666666667, 'number': 119} | {'precision': 0.7474662162162162, 'recall': 0.8309859154929577, 'f1': 0.7870164517563362, 'number': 1065} | 0.7003 | 0.7762 | 0.7363 | 0.7951 |
| 0.3612 | 10.0 | 100 | 0.6679 | {'precision': 0.7168742921857305, 'recall': 0.7824474660074165, 'f1': 0.7482269503546098, 'number': 809} | {'precision': 0.30701754385964913, 'recall': 0.29411764705882354, 'f1': 0.30042918454935624, 'number': 119} | {'precision': 0.7629757785467128, 'recall': 0.828169014084507, 'f1': 0.7942368302566412, 'number': 1065} | 0.7199 | 0.7777 | 0.7477 | 0.8026 |
| 0.3168 | 11.0 | 110 | 0.6831 | {'precision': 0.6914778856526429, 'recall': 0.792336217552534, 'f1': 0.7384792626728109, 'number': 809} | {'precision': 0.3238095238095238, 'recall': 0.2857142857142857, 'f1': 0.30357142857142855, 'number': 119} | {'precision': 0.7770979020979021, 'recall': 0.8347417840375587, 'f1': 0.8048890900860118, 'number': 1065} | 0.7188 | 0.7847 | 0.7503 | 0.7978 |
| 0.3054 | 12.0 | 120 | 0.6884 | {'precision': 0.6985539488320356, 'recall': 0.7762669962917181, 'f1': 0.7353629976580796, 'number': 809} | {'precision': 0.3559322033898305, 'recall': 0.35294117647058826, 'f1': 0.35443037974683544, 'number': 119} | {'precision': 0.7755102040816326, 'recall': 0.8206572769953052, 'f1': 0.7974452554744526, 'number': 1065} | 0.7201 | 0.7747 | 0.7464 | 0.7985 |
| 0.2913 | 13.0 | 130 | 0.6794 | {'precision': 0.7107344632768362, 'recall': 0.7775030902348579, 'f1': 0.7426210153482882, 'number': 809} | {'precision': 0.36134453781512604, 'recall': 0.36134453781512604, 'f1': 0.36134453781512604, 'number': 119} | {'precision': 0.7737478411053541, 'recall': 0.8413145539906103, 'f1': 0.8061178587494376, 'number': 1065} | 0.7253 | 0.7868 | 0.7548 | 0.8039 |
| 0.2775 | 14.0 | 140 | 0.6870 | {'precision': 0.7040358744394619, 'recall': 0.7762669962917181, 'f1': 0.7383891828336272, 'number': 809} | {'precision': 0.3728813559322034, 'recall': 0.3697478991596639, 'f1': 0.37130801687763715, 'number': 119} | {'precision': 0.7757255936675461, 'recall': 0.828169014084507, 'f1': 0.8010899182561309, 'number': 1065} | 0.7238 | 0.7797 | 0.7507 | 0.8014 |
| 0.2706 | 15.0 | 150 | 0.6897 | {'precision': 0.7033707865168539, 'recall': 0.7737948084054388, 'f1': 0.736904061212478, 'number': 809} | {'precision': 0.3697478991596639, 'recall': 0.3697478991596639, 'f1': 0.3697478991596639, 'number': 119} | {'precision': 0.7738825591586328, 'recall': 0.8291079812206573, 'f1': 0.800543970988214, 'number': 1065} | 0.7223 | 0.7792 | 0.7497 | 0.8012 |
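
The Answer, Header, and Question columns above are per-entity metrics in the format produced by `seqeval` (`precision`, `recall`, `f1`, and the `number` of gold entities). A hedged sketch of the kind of `compute_metrics` function that yields numbers in this shape, assuming the conventional FUNSD label set, is shown below; it is not necessarily the exact function used for this run.

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

# Conventional FUNSD label set (matches the Answer/Header/Question columns above).
label_list = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION", "B-ANSWER", "I-ANSWER"]

def compute_metrics(eval_pred):
    """Convert token-classification logits and labels into seqeval-style metrics."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Ignore positions labelled -100 (special tokens / padding).
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    # seqeval also returns a per-entity dict (precision/recall/f1/number) for
    # each label group, which is what appears in the table cells above.
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```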

### Framework versions