---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# training_df_fullctxt_and_sent_split_filtered_0_15_PubMedBert

This model is a fine-tuned version of [dmis-lab/TinyPubMedBERT-v1.0](https://huggingface.co/dmis-lab/TinyPubMedBERT-v1.0) on an unknown dataset. Its per-epoch results on the evaluation set are reported in the training results table below.
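Since the card does not yet document intended uses, a minimal loading sketch may help. This is an assumption-laden example: it presumes the checkpoint is published on the Hugging Face Hub under this repository name, and since the card does not state the task head, it loads the generic `AutoModel` encoder.

```python
def load_model(name="training_df_fullctxt_and_sent_split_filtered_0_15_PubMedBert"):
    """Hypothetical usage sketch; the exact task head (token classification,
    QA, summarization, ...) is not stated in this card, so the generic
    AutoModel class is used. Adjust once the task is documented."""
    # Lazy import so the sketch can be read without transformers installed.
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    return tokenizer, model
```

The repository name above is taken from the card title; replace it with the full `namespace/model-id` path once the model is uploaded.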

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameters used during training were not preserved in this card; more information needed.

### Training results

(The bracketed Precision, Recall, and F1 columns appear to be BERTScore values; the Hashcode column records the BERTScore configuration used to compute them.)

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Exact Match | Precision | Recall | F1 | Hashcode |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-----------:|:---------:|:------:|:--:|:--------:|
| 0.4001 | 1.0 | 5881 | 0.3415 | 0.6842 | 0.6047 | 0.6120 | 0.6120 | 0.0 | [0.8383916616439819, 0.960318922996521] | [0.7912731170654297, 0.963049054145813] | [0.8141512274742126, 0.9616820812225342] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.3165 | 2.0 | 11762 | 0.3255 | 0.7947 | 0.6870 | 0.6369 | 0.6369 | 0.0 | [0.8562091588973999, 0.9591262340545654] | [0.841107964515686, 0.9619568586349487] | [0.8485913872718811, 0.9605394601821899] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.2971 | 3.0 | 17643 | 0.3178 | 0.8168 | 0.6965 | 0.6365 | 0.6365 | 0.0 | [0.8633116483688354, 0.978273868560791] | [0.8504236936569214, 0.9788444638252258] | [0.856819212436676, 0.9785590767860413] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.2853 | 4.0 | 23524 | 0.2934 | 0.8134 | 0.7020 | 0.6328 | 0.6328 | 0.0 | [0.8643838167190552, 0.9647811651229858] | [0.8536887764930725, 0.9682695865631104] | [0.859002947807312, 0.9665222764015198] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.2744 | 5.0 | 29405 | 0.2968 | 0.8664 | 0.7077 | 0.6357 | 0.6357 | 0.0 | [0.8695193529129028, 0.9638710021972656] | [0.8581283688545227, 0.9666727185249329] | [0.8637862205505371, 0.9652698636054993] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.2669 | 6.0 | 35286 | 0.3027 | 0.8472 | 0.6949 | 0.6378 | 0.6378 | 0.0 | [0.8685003519058228, 0.9665455222129822] | [0.8652210235595703, 0.9689881801605225] | [0.8668575882911682, 0.9677652716636658] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.2595 | 7.0 | 41167 | 0.2996 | 0.8840 | 0.7193 | 0.6447 | 0.6447 | 0.0 | [0.8698508143424988, 0.9638710021972656] | [0.8639194965362549, 0.9666727185249329] | [0.8668749332427979, 0.9652698636054993] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.253 | 8.0 | 47048 | 0.2972 | 0.8518 | 0.6891 | 0.6363 | 0.6363 | 0.0 | [0.8666473031044006, 0.9638710021972656] | [0.863062858581543, 0.9666727185249329] | [0.8648514151573181, 0.9652698636054993] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.2481 | 9.0 | 52929 | 0.2985 | 0.8533 | 0.6843 | 0.6309 | 0.6309 | 0.0 | [0.8691736459732056, 0.9647811651229858] | [0.8661415576934814, 0.9682695865631104] | [0.8676549196243286, 0.9665222764015198] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
| 0.243 | 10.0 | 58810 | 0.3031 | 0.8717 | 0.6989 | 0.6336 | 0.6336 | 0.0 | [0.8712936639785767, 0.9647811651229858] | [0.8689576387405396, 0.9682695865631104] | [0.8701240420341492, 0.9665222764015198] | roberta-large_L17_no-idf_version=0.3.12(hug_trans=4.28.0) |
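The Rouge1, Rouge2, and Rougel columns in the table above are F-measure overlap scores between generated and reference text. The following is an illustrative pure-Python sketch of how these scores are computed, not the `rouge_score` package that the Trainer actually used to produce the numbers in the table:

```python
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n(reference, candidate, n=1):
    """ROUGE-N F1: harmonic mean of n-gram precision and recall."""
    ref, cand = ngrams(reference, n), ngrams(candidate, n)
    if not ref or not cand:
        return 0.0
    overlap = sum((ref & cand).values())  # clipped n-gram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def lcs_len(a, b):
    """Length of the longest common subsequence (the basis of ROUGE-L)."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    return dp[-1][-1]

def rouge_l(reference, candidate):
    """ROUGE-L F1 based on longest-common-subsequence length."""
    lcs = lcs_len(reference, candidate)
    if lcs == 0:
        return 0.0
    precision, recall = lcs / len(candidate), lcs / len(reference)
    return 2 * precision * recall / (precision + recall)

ref = "the cat sat on the mat".split()
cand = "the cat lay on the mat".split()
print(round(rouge_n(ref, cand, 1), 4))  # → 0.8333
print(round(rouge_l(ref, cand), 4))     # → 0.8333
```

Rougelsum differs from Rougel only in that it computes LCS per sentence and aggregates, which is why the two columns coincide for single-sentence outputs, as they do in every row above.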

### Framework versions