

# lilt-en-funsd

This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on the funsd-layoutlmv3 dataset. Its results on the evaluation set are reported step by step in the training results table below.
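
Since the card does not yet include a usage example, here is a minimal inference sketch for a LiLT token-classification checkpoint like this one. The repo id `lilt-en-funsd`, the words, and the bounding boxes are placeholders; the checkpoint is assumed to ship with its tokenizer and label mapping.

```python
import torch
from transformers import AutoTokenizer, LiltForTokenClassification

# "lilt-en-funsd" is a placeholder for this checkpoint's Hub repo id or a local path.
model_id = "lilt-en-funsd"

# add_prefix_space=True is needed because the RoBERTa tokenizer receives pre-split words.
tokenizer = AutoTokenizer.from_pretrained(model_id, add_prefix_space=True)
model = LiltForTokenClassification.from_pretrained(model_id)

# LiLT expects one bounding box per token, normalized to a 0-1000 page coordinate scale.
words = ["Invoice", "Number:", "12345"]                               # illustrative OCR words
boxes = [[50, 40, 180, 60], [190, 40, 300, 60], [310, 40, 400, 60]]  # illustrative word boxes

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# Expand word-level boxes to token level; special tokens get a dummy box.
token_boxes = [
    boxes[idx] if idx is not None else [0, 0, 0, 0]
    for idx in encoding.word_ids(0)
]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits

tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0].tolist())
labels = [model.config.id2label[i] for i in logits.argmax(-1)[0].tolist()]
print(list(zip(tokens, labels)))
```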

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training were not recorded in this card.
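
Purely as an illustration of how a run of this kind is configured with the 🤗 `Trainer`, the sketch below shows the general shape. Every value is a placeholder rather than the setting actually used; only the evaluation cadence (every 200 steps, up to step 2000) mirrors the training results table below.

```python
from transformers import TrainingArguments

# All values below are placeholders for illustration only; they are NOT the settings
# actually used for this checkpoint. Only the evaluation cadence (every 200 steps,
# up to step 2000) mirrors the training results table in this card.
training_args = TrainingArguments(
    output_dir="lilt-en-funsd",
    max_steps=2000,
    evaluation_strategy="steps",
    eval_steps=200,
    save_strategy="steps",
    save_steps=200,
    per_device_train_batch_size=8,   # placeholder
    learning_rate=5e-5,              # placeholder
    load_best_model_at_end=True,     # placeholder
    metric_for_best_model="f1",      # placeholder
)
# These arguments would then be passed to a transformers.Trainer together with the
# model, the tokenized funsd-layoutlmv3 splits, and a compute_metrics function
# (see the seqeval sketch after the results table below).
```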

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.4201 | 10.53 | 200 | 0.8003 | {'precision': 0.8321995464852607, 'recall': 0.8984088127294981, 'f1': 0.8640376692171865, 'number': 817} | {'precision': 0.5714285714285714, 'recall': 0.5714285714285714, 'f1': 0.5714285714285714, 'number': 119} | {'precision': 0.8651079136690647, 'recall': 0.89322191272052, 'f1': 0.8789401553220649, 'number': 1077} | 0.8348 | 0.8763 | 0.8551 | 0.8104 |
| 0.0376 | 21.05 | 400 | 1.3158 | {'precision': 0.8395904436860068, 'recall': 0.9033047735618115, 'f1': 0.8702830188679245, 'number': 817} | {'precision': 0.4785714285714286, 'recall': 0.5630252100840336, 'f1': 0.5173745173745175, 'number': 119} | {'precision': 0.8887814313346228, 'recall': 0.8532961931290622, 'f1': 0.8706774040738986, 'number': 1077} | 0.8397 | 0.8564 | 0.8480 | 0.7934 |
| 0.0119 | 31.58 | 600 | 1.4791 | {'precision': 0.8752941176470588, 'recall': 0.9106487148102815, 'f1': 0.8926214757048591, 'number': 817} | {'precision': 0.5401459854014599, 'recall': 0.6218487394957983, 'f1': 0.578125, 'number': 119} | {'precision': 0.8818681318681318, 'recall': 0.8941504178272981, 'f1': 0.8879668049792531, 'number': 1077} | 0.8567 | 0.8847 | 0.8705 | 0.7961 |
| 0.0061 | 42.11 | 800 | 1.5605 | {'precision': 0.8617886178861789, 'recall': 0.9082007343941249, 'f1': 0.8843861740166865, 'number': 817} | {'precision': 0.5963302752293578, 'recall': 0.5462184873949579, 'f1': 0.5701754385964912, 'number': 119} | {'precision': 0.8747763864042933, 'recall': 0.9080779944289693, 'f1': 0.8911161731207289, 'number': 1077} | 0.8549 | 0.8867 | 0.8705 | 0.7965 |
| 0.0026 | 52.63 | 1000 | 1.5172 | {'precision': 0.8596491228070176, 'recall': 0.8996328029375765, 'f1': 0.8791866028708135, 'number': 817} | {'precision': 0.7176470588235294, 'recall': 0.5126050420168067, 'f1': 0.5980392156862744, 'number': 119} | {'precision': 0.8737864077669902, 'recall': 0.9192200557103064, 'f1': 0.8959276018099548, 'number': 1077} | 0.8616 | 0.8872 | 0.8742 | 0.8014 |
| 0.0019 | 63.16 | 1200 | 1.6132 | {'precision': 0.8735224586288416, 'recall': 0.9045287637698899, 'f1': 0.888755261575466, 'number': 817} | {'precision': 0.6460176991150443, 'recall': 0.6134453781512605, 'f1': 0.6293103448275863, 'number': 119} | {'precision': 0.881508078994614, 'recall': 0.9117920148560817, 'f1': 0.8963943404837974, 'number': 1077} | 0.8654 | 0.8912 | 0.8781 | 0.8040 |
| 0.0012 | 73.68 | 1400 | 1.6459 | {'precision': 0.8831942789034565, 'recall': 0.9069767441860465, 'f1': 0.894927536231884, 'number': 817} | {'precision': 0.6213592233009708, 'recall': 0.5378151260504201, 'f1': 0.5765765765765765, 'number': 119} | {'precision': 0.8998178506375227, 'recall': 0.9173630454967502, 'f1': 0.9085057471264367, 'number': 1077} | 0.8789 | 0.8907 | 0.8848 | 0.8068 |
| 0.0005 | 84.21 | 1600 | 1.5619 | {'precision': 0.8602771362586605, 'recall': 0.9118727050183598, 'f1': 0.8853238265002972, 'number': 817} | {'precision': 0.6631578947368421, 'recall': 0.5294117647058824, 'f1': 0.5887850467289719, 'number': 119} | {'precision': 0.8944494995450409, 'recall': 0.9127205199628597, 'f1': 0.9034926470588234, 'number': 1077} | 0.8694 | 0.8897 | 0.8795 | 0.8155 |
| 0.0003 | 94.74 | 1800 | 1.6571 | {'precision': 0.8649592549476135, 'recall': 0.9094247246022031, 'f1': 0.886634844868735, 'number': 817} | {'precision': 0.6391752577319587, 'recall': 0.5210084033613446, 'f1': 0.5740740740740741, 'number': 119} | {'precision': 0.8971792538671519, 'recall': 0.9155060352831941, 'f1': 0.90625, 'number': 1077} | 0.8715 | 0.8897 | 0.8805 | 0.8098 |
| 0.0003 | 105.26 | 2000 | 1.6731 | {'precision': 0.8672875436554133, 'recall': 0.9118727050183598, 'f1': 0.8890214797136038, 'number': 817} | {'precision': 0.62, 'recall': 0.5210084033613446, 'f1': 0.5662100456621004, 'number': 119} | {'precision': 0.9008264462809917, 'recall': 0.9108635097493036, 'f1': 0.9058171745152355, 'number': 1077} | 0.8730 | 0.8882 | 0.8806 | 0.8071 |
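
The per-entity dictionaries in the table (Answer, Header, Question) follow the output format of seqeval-style entity-level evaluation. As a sketch, assuming the metrics were computed with the `evaluate` library's `seqeval` wrapper (a common choice for FUNSD-style token classification), they can be reproduced like this; the tag sequences shown are illustrative, not real model output.

```python
import evaluate

# seqeval reports entity-level precision/recall/F1 plus support ("number") per label,
# and overall precision/recall/F1/accuracy -- the same fields shown in the table above.
seqeval = evaluate.load("seqeval")

# Illustrative IOB2 tag sequences, not real model output.
predictions = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "I-ANSWER"]]
references = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["QUESTION"])        # {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...}
print(results["overall_f1"], results["overall_accuracy"])
```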

### Framework versions