# layoutlm-funsd
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset (Form Understanding in Noisy Scanned Documents). It achieves the following results on the evaluation set:
- Loss: 0.6961
- Answer: precision 0.6968, recall 0.8010, F1 0.7453 (support: 809)
- Header: precision 0.3761, recall 0.3697, F1 0.3729 (support: 119)
- Question: precision 0.7607, recall 0.8150, F1 0.7869 (support: 1065)
- Overall Precision: 0.7130
- Overall Recall: 0.7827
- Overall F1: 0.7462
- Overall Accuracy: 0.8070
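The overall F1 reported above is the harmonic mean of the overall precision and recall; a minimal sanity check in Python:

```python
# Overall precision and recall reported on the evaluation set
precision = 0.7130
recall = 0.7827

# F1 is the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.7462
```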
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
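With `lr_scheduler_type: linear` and no warmup (the Trainer default of zero warmup steps — an assumption, since warmup is not listed above), the learning rate decays linearly from 3e-05 to zero over the 150 training steps (15 epochs × 10 steps per epoch, per the results table). A minimal sketch of that schedule:

```python
BASE_LR = 3e-05
TOTAL_STEPS = 150  # 15 epochs x 10 steps/epoch, from the results table

def linear_lr(step: int) -> float:
    """Linearly decayed learning rate, assuming zero warmup steps."""
    return BASE_LR * max(0.0, 1.0 - step / TOTAL_STEPS)

print(linear_lr(0))    # 3e-05 at the start of training
print(linear_lr(75))   # 1.5e-05 halfway through
print(linear_lr(150))  # 0.0 at the final step
```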
### Training results
Per-entity columns report precision / recall / F1 on the evaluation set (support: Answer 809, Header 119, Question 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.7906 | 1.0 | 10 | 1.5819 | 0.0138 / 0.0148 / 0.0143 | 0.0 / 0.0 / 0.0 | 0.1802 / 0.1399 / 0.1575 | 0.0950 | 0.0808 | 0.0873 | 0.3655 |
| 1.48 | 2.0 | 20 | 1.2471 | 0.0995 / 0.0927 / 0.0960 | 0.0 / 0.0 / 0.0 | 0.4413 / 0.5474 / 0.4887 | 0.3171 | 0.3302 | 0.3235 | 0.5773 |
| 1.1166 | 3.0 | 30 | 0.9644 | 0.4272 / 0.3918 / 0.4088 | 0.0 / 0.0 / 0.0 | 0.6081 / 0.6310 / 0.6194 | 0.5300 | 0.4962 | 0.5126 | 0.6845 |
| 0.8575 | 4.0 | 40 | 0.8203 | 0.5974 / 0.6712 / 0.6321 | 0.0732 / 0.0252 / 0.0375 | 0.6579 / 0.7080 / 0.6820 | 0.6202 | 0.6523 | 0.6359 | 0.7443 |
| 0.704 | 5.0 | 50 | 0.7463 | 0.6460 / 0.7466 / 0.6927 | 0.1235 / 0.0840 / 0.1000 | 0.6824 / 0.7728 / 0.7248 | 0.6467 | 0.7210 | 0.6819 | 0.7697 |
| 0.5755 | 6.0 | 60 | 0.6990 | 0.6606 / 0.7676 / 0.7101 | 0.2500 / 0.1597 / 0.1949 | 0.6919 / 0.8225 / 0.7516 | 0.6643 | 0.7607 | 0.7092 | 0.7805 |
| 0.5027 | 7.0 | 70 | 0.6981 | 0.6731 / 0.7763 / 0.7210 | 0.2364 / 0.2185 / 0.2271 | 0.7095 / 0.8094 / 0.7561 | 0.6714 | 0.7607 | 0.7132 | 0.7843 |
| 0.4478 | 8.0 | 80 | 0.6776 | 0.6840 / 0.7948 / 0.7353 | 0.2745 / 0.2353 / 0.2534 | 0.7286 / 0.8216 / 0.7723 | 0.6893 | 0.7757 | 0.7299 | 0.7937 |
| 0.3999 | 9.0 | 90 | 0.6768 | 0.6885 / 0.7923 / 0.7368 | 0.3333 / 0.2857 / 0.3077 | 0.7410 / 0.8141 / 0.7758 | 0.7000 | 0.7737 | 0.7350 | 0.7973 |
| 0.3587 | 10.0 | 100 | 0.6864 | 0.6861 / 0.7837 / 0.7317 | 0.2803 / 0.3109 / 0.2948 | 0.7372 / 0.8141 / 0.7738 | 0.6891 | 0.7717 | 0.7280 | 0.7962 |
| 0.3188 | 11.0 | 110 | 0.6910 | 0.6850 / 0.7985 / 0.7374 | 0.3115 / 0.3193 / 0.3154 | 0.7480 / 0.8085 / 0.7771 | 0.6972 | 0.7752 | 0.7341 | 0.7997 |
| 0.3079 | 12.0 | 120 | 0.6895 | 0.6940 / 0.7960 / 0.7415 | 0.3333 / 0.3193 / 0.3262 | 0.7649 / 0.8188 / 0.7909 | 0.7122 | 0.7797 | 0.7444 | 0.8029 |
| 0.2923 | 13.0 | 130 | 0.6960 | 0.7011 / 0.7973 / 0.7461 | 0.3462 / 0.3782 / 0.3614 | 0.7605 / 0.8141 / 0.7864 | 0.7110 | 0.7812 | 0.7444 | 0.8020 |
| 0.2794 | 14.0 | 140 | 0.6956 | 0.6856 / 0.7923 / 0.7351 | 0.3644 / 0.3613 / 0.3629 | 0.7602 / 0.8216 / 0.7897 | 0.7074 | 0.7822 | 0.7429 | 0.8052 |
| 0.2764 | 15.0 | 150 | 0.6961 | 0.6968 / 0.8010 / 0.7453 | 0.3761 / 0.3697 / 0.3729 | 0.7607 / 0.8150 / 0.7869 | 0.7130 | 0.7827 | 0.7462 | 0.8070 |
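Since entity-level metrics here are micro-averaged, the overall recall in the final row is the support-weighted mean of the rounded per-entity recalls (equivalently, total true positives over total gold entities):

```python
# Final-epoch per-entity (recall, support) pairs from the results table
entities = {
    "Answer":   (0.8010, 809),
    "Header":   (0.3697, 119),
    "Question": (0.8150, 1065),
}

# Micro-averaged recall = total true positives / total gold entities,
# i.e. the support-weighted mean of the per-entity recalls.
tp = sum(r * n for r, n in entities.values())
total = sum(n for _, n in entities.values())
print(round(tp / total, 4))  # → 0.7827, matching Overall Recall
```

The same reconstruction does not work for overall precision, which would require the per-entity predicted counts rather than the supports.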
### Framework versions

- Transformers 4.28.1
- PyTorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3