
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# rahul77/t5-small-finetuned-thehindu1

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set (values taken from the final epoch, 49, of the results table below):
- Train Loss: 0.4672
- Validation Loss: 0.7612
- Train Rouge1: 29.6559
- Train Rouge2: 24.0992
- Train Rougel: 27.7417
- Train Rougelsum: 28.4408
- Train Gen Len: 19.0
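
The snippet below is a minimal inference sketch, not taken from the card: it assumes the checkpoint is used for summarization (suggested by the model name and by the generation length of 19 tokens in the results table) and loads the TensorFlow classes, since the card was generated from a Keras callback. The `summarize:` prefix follows the usual T5 convention and may need adjusting for the actual task.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "rahul77/t5-small-finetuned-thehindu1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "..."  # replace with the article text to summarize
inputs = tokenizer("summarize: " + article, return_tensors="tf",
                   max_length=512, truncation=True)
# Gen Len in the results table is 19.0, so ~20 new tokens is a reasonable cap.
summary_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```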

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training were not recorded in this card; a generic fine-tuning sketch is shown below for orientation.
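
As a stand-in for the missing list, this sketch shows how a `t5-small` checkpoint is typically fine-tuned with Keras and how cards like this one are generated via the `transformers` Keras callbacks. All hyperparameter values shown are placeholders, not the values used for this model; only the epoch count (50) is taken from the results table below.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM, create_optimizer
from transformers.keras_callbacks import PushToHubCallback

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Placeholder schedule: the real learning rate and step counts are not documented.
optimizer, _ = create_optimizer(init_lr=2e-5, num_warmup_steps=0, num_train_steps=10_000)
model.compile(optimizer=optimizer)  # seq2seq loss is computed internally by the model

# tf_train_set / tf_eval_set would be tf.data.Dataset objects built from the
# (undocumented) training data, e.g. via model.prepare_tf_dataset(...).
# push_cb = PushToHubCallback(output_dir="t5-small-finetuned-thehindu1", tokenizer=tokenizer)
# model.fit(tf_train_set, validation_data=tf_eval_set, epochs=50, callbacks=[push_cb])
```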

### Training results

| Train Loss | Validation Loss | Train Rouge1 | Train Rouge2 | Train Rougel | Train Rougelsum | Train Gen Len | Epoch |
|:----------:|:---------------:|:------------:|:------------:|:------------:|:---------------:|:-------------:|:-----:|
| 1.2252 | 0.9927 | 25.8031 | 17.7261 | 23.4483 | 25.0648 | 19.0 | 0 |
| 1.0509 | 0.9137 | 28.0482 | 20.6823 | 25.5396 | 27.0125 | 19.0 | 1 |
| 0.9961 | 0.8638 | 28.2964 | 22.1783 | 26.4157 | 27.4368 | 19.0 | 2 |
| 0.9266 | 0.8321 | 27.7054 | 21.8853 | 26.0306 | 26.9068 | 19.0 | 3 |
| 0.8851 | 0.8117 | 28.3740 | 22.8198 | 26.8479 | 27.5047 | 19.0 | 4 |
| 0.8505 | 0.7975 | 28.7979 | 23.1437 | 27.0745 | 27.7887 | 19.0 | 5 |
| 0.8247 | 0.7890 | 28.9634 | 23.3567 | 27.3117 | 28.0320 | 19.0 | 6 |
| 0.8154 | 0.7827 | 28.8667 | 23.4468 | 27.1404 | 27.8453 | 19.0 | 7 |
| 0.7889 | 0.7813 | 29.0498 | 23.6403 | 27.5662 | 28.1518 | 19.0 | 8 |
| 0.7676 | 0.7774 | 29.1829 | 23.5778 | 27.7014 | 28.3268 | 19.0 | 9 |
| 0.7832 | 0.7714 | 29.1040 | 23.3700 | 27.6605 | 28.2650 | 19.0 | 10 |
| 0.7398 | 0.7676 | 29.1040 | 23.3700 | 27.6605 | 28.2650 | 19.0 | 11 |
| 0.7473 | 0.7644 | 29.4387 | 24.1983 | 27.9842 | 28.5700 | 19.0 | 12 |
| 0.7270 | 0.7628 | 29.3128 | 24.1484 | 27.8565 | 28.4215 | 19.0 | 13 |
| 0.7174 | 0.7615 | 29.3128 | 24.1484 | 27.8565 | 28.4215 | 19.0 | 14 |
| 0.7231 | 0.7577 | 29.3838 | 23.9483 | 27.6550 | 28.3416 | 19.0 | 15 |
| 0.7099 | 0.7558 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 16 |
| 0.7060 | 0.7548 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 17 |
| 0.6884 | 0.7539 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 18 |
| 0.6778 | 0.7546 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 19 |
| 0.6586 | 0.7519 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 20 |
| 0.6474 | 0.7521 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 21 |
| 0.6392 | 0.7527 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 22 |
| 0.6424 | 0.7537 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 23 |
| 0.6184 | 0.7536 | 29.4866 | 24.1703 | 27.8649 | 28.4404 | 19.0 | 24 |
| 0.6164 | 0.7520 | 29.4866 | 24.0547 | 27.7388 | 28.3416 | 19.0 | 25 |
| 0.6115 | 0.7502 | 29.4866 | 23.9746 | 27.8232 | 28.4227 | 19.0 | 26 |
| 0.6056 | 0.7498 | 29.4866 | 23.9746 | 27.8232 | 28.4227 | 19.0 | 27 |
| 0.6004 | 0.7488 | 29.4451 | 23.7671 | 27.5435 | 28.2982 | 19.0 | 28 |
| 0.5851 | 0.7478 | 29.4451 | 23.7671 | 27.5435 | 28.2982 | 19.0 | 29 |
| 0.5777 | 0.7496 | 29.4866 | 23.9746 | 27.8232 | 28.4227 | 19.0 | 30 |
| 0.5751 | 0.7486 | 29.4866 | 23.9746 | 27.8232 | 28.4227 | 19.0 | 31 |
| 0.5730 | 0.7485 | 29.4866 | 23.9746 | 27.8232 | 28.4227 | 19.0 | 32 |
| 0.5487 | 0.7499 | 29.4962 | 24.0563 | 27.8422 | 28.4356 | 19.0 | 33 |
| 0.5585 | 0.7517 | 29.4962 | 24.0563 | 27.8422 | 28.4356 | 19.0 | 34 |
| 0.5450 | 0.7538 | 29.4962 | 24.0563 | 27.8422 | 28.4356 | 19.0 | 35 |
| 0.5427 | 0.7509 | 29.4962 | 24.0563 | 27.8422 | 28.4356 | 19.0 | 36 |
| 0.5287 | 0.7500 | 29.4962 | 24.0563 | 27.8422 | 28.4356 | 19.0 | 37 |
| 0.5231 | 0.7486 | 29.4962 | 24.0563 | 27.8422 | 28.4356 | 19.0 | 38 |
| 0.5155 | 0.7523 | 29.4962 | 24.0563 | 27.8422 | 28.4356 | 19.0 | 39 |
| 0.5105 | 0.7550 | 29.4962 | 24.0563 | 27.8422 | 28.4356 | 19.0 | 40 |
| 0.5175 | 0.7557 | 29.6736 | 24.3120 | 28.0332 | 28.5828 | 19.0 | 41 |
| 0.5053 | 0.7560 | 29.6736 | 24.3120 | 28.0332 | 28.5828 | 19.0 | 42 |
| 0.4928 | 0.7548 | 29.6736 | 24.3120 | 28.0332 | 28.5828 | 19.0 | 43 |
| 0.4913 | 0.7568 | 29.6559 | 24.0992 | 27.7417 | 28.4408 | 19.0 | 44 |
| 0.4841 | 0.7574 | 29.6559 | 24.0992 | 27.7417 | 28.4408 | 19.0 | 45 |
| 0.4770 | 0.7583 | 29.6736 | 24.3120 | 28.0332 | 28.5828 | 19.0 | 46 |
| 0.4727 | 0.7581 | 29.6736 | 24.3120 | 28.0332 | 28.5828 | 19.0 | 47 |
| 0.4612 | 0.7623 | 29.6736 | 24.3120 | 28.0332 | 28.5828 | 19.0 | 48 |
| 0.4672 | 0.7612 | 29.6559 | 24.0992 | 27.7417 | 28.4408 | 19.0 | 49 |
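
For context on the ROUGE columns, the sketch below shows how comparable scores can be computed with the Hugging Face `evaluate` library; the assumption that the table's numbers come from this implementation (reported as percentages) is mine, not stated in the card.

```python
import evaluate

rouge = evaluate.load("rouge")
predictions = ["the government announced a new economic policy"]  # model outputs
references = ["government announces new economic policy today"]   # reference summaries
scores = rouge.compute(predictions=predictions, references=references)
# Multiply by 100 to match the scale of the table above (rouge1, rouge2, rougeL, rougeLsum).
print({k: round(v * 100, 4) for k, v in scores.items()})
```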

### Framework versions