
# plt5-seq-clf-with-entities-updated-50-finetuned

This model was trained from scratch on an unknown dataset. Its per-epoch results on the evaluation set are listed in the training results table below.
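
The card does not show how to load the checkpoint. Assuming it was saved with a sequence-classification head (as the `seq-clf` part of the name suggests), and with the identifier below standing in for the real checkpoint path or Hub repository id, a minimal inference sketch with the `transformers` Auto classes looks like this:

```python
# Minimal inference sketch. The model id below is a placeholder; replace it
# with the actual checkpoint directory or Hub repository id.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "plt5-seq-clf-with-entities-updated-50-finetuned"  # hypothetical path/id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Przykładowe zdanie do klasyfikacji."  # arbitrary example input
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_class, predicted_class))
```

The example input is an arbitrary Polish sentence; since the training data is unknown, it only demonstrates the call pattern, not a meaningful prediction.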

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training are not recorded in this card.
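
Only the shape of the configuration can be illustrated here. The sketch below is a hypothetical `TrainingArguments` setup: the 20-epoch count and per-epoch evaluation are taken from the results table, while every other value is an invented placeholder rather than a record of the actual run.

```python
# Illustrative only: all values except num_train_epochs and the per-epoch
# evaluation cadence are placeholders, not the hyperparameters actually used.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="plt5-seq-clf-with-entities-updated-50-finetuned",  # hypothetical
    num_train_epochs=20,              # matches the 20 epochs in the results table
    learning_rate=2e-5,               # placeholder
    per_device_train_batch_size=8,    # placeholder
    per_device_eval_batch_size=8,     # placeholder
    evaluation_strategy="epoch",      # metrics above are reported once per epoch
    save_strategy="epoch",            # placeholder
    load_best_model_at_end=True,      # placeholder
)
```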

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy           | Recall             | F1                 | Precision          |
|:-------------:|:-----:|:-----:|:---------------:|:------------------:|:------------------:|:------------------:|:------------------:|
| 1.0346        | 1.0   | 718   | 0.9563          | 0.6274193548387097 | 0.6274193548387097 | 0.6269092308889221 | 0.6516260612426348 |
| 0.9931        | 2.0   | 1436  | 0.9478          | 0.6161290322580645 | 0.6161290322580645 | 0.6238406780518579 | 0.6864218347180107 |
| 0.9463        | 3.0   | 2154  | 0.9055          | 0.6435483870967742 | 0.6435483870967742 | 0.6506881166603742 | 0.698976830092863  |
| 0.9207        | 4.0   | 2872  | 0.9370          | 0.6225806451612903 | 0.6225806451612903 | 0.6300451742807381 | 0.7149141775401143 |
| 0.8799        | 5.0   | 3590  | 0.9878          | 0.6161290322580645 | 0.6161290322580645 | 0.6122209742857827 | 0.7137447198652702 |
| 0.8561        | 6.0   | 4308  | 0.8395          | 0.6645161290322581 | 0.6645161290322581 | 0.6682908136199321 | 0.7247633436486038 |
| 0.8277        | 7.0   | 5026  | 0.8478          | 0.6612903225806451 | 0.6612903225806451 | 0.6701602282855885 | 0.7359181091343896 |
| 0.7946        | 8.0   | 5744  | 0.8521          | 0.667741935483871  | 0.667741935483871  | 0.6706689860918258 | 0.7498766646363035 |
| 0.7837        | 9.0   | 6462  | 0.7798          | 0.6838709677419355 | 0.6838709677419355 | 0.690291461468391  | 0.7206099181990787 |
| 0.7594        | 10.0  | 7180  | 0.8374          | 0.6758064516129032 | 0.6758064516129032 | 0.682680721421726  | 0.7493728723436673 |
| 0.7466        | 11.0  | 7898  | 0.8326          | 0.6854838709677419 | 0.6854838709677419 | 0.6894194311810349 | 0.7449032499042901 |
| 0.7206        | 12.0  | 8616  | 0.7420          | 0.7032258064516129 | 0.7032258064516129 | 0.7087107256097728 | 0.7508476827365588 |
| 0.7055        | 13.0  | 9334  | 0.7503          | 0.6967741935483871 | 0.6967741935483871 | 0.7029067581475338 | 0.7372716433452716 |
| 0.6931        | 14.0  | 10052 | 0.7804          | 0.6854838709677419 | 0.6854838709677419 | 0.6918059328051079 | 0.743185622817053  |
| 0.6939        | 15.0  | 10770 | 0.7469          | 0.6887096774193548 | 0.6887096774193548 | 0.6952994535589653 | 0.7335588163636623 |
| 0.6685        | 16.0  | 11488 | 0.7322          | 0.7225806451612903 | 0.7225806451612903 | 0.7260667740259684 | 0.7520970078354011 |
| 0.6798        | 17.0  | 12206 | 0.7457          | 0.7064516129032258 | 0.7064516129032258 | 0.71031473860959   | 0.7517696108635532 |
| 0.6566        | 18.0  | 12924 | 0.7392          | 0.7064516129032258 | 0.7064516129032258 | 0.711536979529592  | 0.7566559266538243 |
| 0.6509        | 19.0  | 13642 | 0.7349          | 0.7032258064516129 | 0.7032258064516129 | 0.7078754315336451 | 0.7432671014815916 |
| 0.6307        | 20.0  | 14360 | 0.7296          | 0.7032258064516129 | 0.7032258064516129 | 0.7081514019750476 | 0.7432360333344489 |
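
In every row of the table, accuracy and recall are identical, which is what support-weighted recall reduces to, so the metrics were most likely computed with a weighted average. The `compute_metrics` function below is a sketch of that assumed setup using scikit-learn; the card does not state which library or averaging mode was actually used.

```python
# Assumed metric computation: weighted-average precision/recall/F1, which is
# consistent with accuracy and recall coinciding in the table above.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support


def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "recall": recall,
        "f1": f1,
        "precision": precision,
    }
```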

### Framework versions