---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# my_awesome_opus_books_model

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset (reported as `None` by the Trainer). It achieves the following results on the evaluation set:

- Loss: 1.7142
- Bleu: 0.1327
- Gen Len: 11.4
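As a quick usage sketch, the checkpoint can be loaded with the Transformers auto classes. The repository id `my_awesome_opus_books_model` and the `translate English to French:` task prefix are assumptions; the card does not record the Hub id or the language pair.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed checkpoint id; replace with the actual Hub repo or local output directory.
checkpoint = "my_awesome_opus_books_model"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# T5 expects a task prefix; "translate English to French:" assumes an en->fr setup.
text = "translate English to French: Legumes share resources with nitrogen-fixing bacteria."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```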

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameters used during training were not captured when this card was generated.
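Since the actual values are missing, the block below is only a hedged sketch of how such a run could be configured with `Seq2SeqTrainingArguments`. Every value is a placeholder assumption except `num_train_epochs=100` and the per-epoch evaluation schedule, which are consistent with the training-results table.

```python
from transformers import Seq2SeqTrainingArguments

# All values below are placeholder assumptions except num_train_epochs and the
# per-epoch evaluation schedule, which match the training-results table.
training_args = Seq2SeqTrainingArguments(
    output_dir="my_awesome_opus_books_model",
    learning_rate=2e-5,               # assumption
    per_device_train_batch_size=16,   # assumption
    per_device_eval_batch_size=16,    # assumption
    weight_decay=0.01,                # assumption
    num_train_epochs=100,             # matches the table below
    eval_strategy="epoch",            # eval ran once per epoch ("evaluation_strategy" on older versions)
    predict_with_generate=True,       # required for the Bleu / Gen Len metrics
)
```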

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| No log        | 1.0   | 1    | 10.1215         | 0.0    | 19.0    |
| No log        | 2.0   | 2    | 10.1215         | 0.0    | 19.0    |
| No log        | 3.0   | 3    | 10.1215         | 0.0    | 19.0    |
| No log        | 4.0   | 4    | 9.9493          | 0.0    | 19.0    |
| No log        | 5.0   | 5    | 9.7067          | 0.0    | 19.0    |
| No log        | 6.0   | 6    | 9.5209          | 0.0    | 19.0    |
| No log        | 7.0   | 7    | 9.1640          | 0.0    | 19.0    |
| No log        | 8.0   | 8    | 9.1640          | 0.0    | 19.0    |
| No log        | 9.0   | 9    | 8.9257          | 0.0    | 19.0    |
| No log        | 10.0  | 10   | 8.7095          | 0.0    | 19.0    |
| No log        | 11.0  | 11   | 8.0234          | 0.0    | 19.0    |
| No log        | 12.0  | 12   | 7.6148          | 0.0    | 19.0    |
| No log        | 13.0  | 13   | 7.6148          | 0.0    | 19.0    |
| No log        | 14.0  | 14   | 7.3894          | 0.0    | 19.0    |
| No log        | 15.0  | 15   | 7.1168          | 0.0    | 19.0    |
| No log        | 16.0  | 16   | 6.9173          | 0.0    | 19.0    |
| No log        | 17.0  | 17   | 6.7148          | 0.0    | 19.0    |
| No log        | 18.0  | 18   | 6.3630          | 0.0    | 19.0    |
| No log        | 19.0  | 19   | 6.0068          | 0.0    | 19.0    |
| No log        | 20.0  | 20   | 5.8264          | 0.0    | 19.0    |
| No log        | 21.0  | 21   | 5.6897          | 0.0    | 19.0    |
| No log        | 22.0  | 22   | 5.5416          | 0.0    | 19.0    |
| No log        | 23.0  | 23   | 5.4310          | 0.0    | 19.0    |
| No log        | 24.0  | 24   | 5.3268          | 0.6787 | 19.0    |
| No log        | 25.0  | 25   | 5.2214          | 2.6287 | 19.0    |
| No log        | 26.0  | 26   | 5.0786          | 2.6287 | 19.0    |
| No log        | 27.0  | 27   | 4.9850          | 3.2603 | 19.0    |
| No log        | 28.0  | 28   | 4.9030          | 3.6542 | 19.0    |
| No log        | 29.0  | 29   | 4.8184          | 3.6542 | 19.0    |
| No log        | 30.0  | 30   | 4.7408          | 3.6542 | 19.0    |
| No log        | 31.0  | 31   | 4.6692          | 3.6542 | 19.0    |
| No log        | 32.0  | 32   | 4.5869          | 3.6542 | 19.0    |
| No log        | 33.0  | 33   | 4.4861          | 3.6542 | 19.0    |
| No log        | 34.0  | 34   | 4.3921          | 3.6542 | 19.0    |
| No log        | 35.0  | 35   | 4.3102          | 3.6542 | 19.0    |
| No log        | 36.0  | 36   | 4.2375          | 3.6542 | 19.0    |
| No log        | 37.0  | 37   | 4.1691          | 3.6542 | 19.0    |
| No log        | 38.0  | 38   | 4.1019          | 3.6542 | 19.0    |
| No log        | 39.0  | 39   | 4.0349          | 3.6542 | 19.0    |
| No log        | 40.0  | 40   | 3.9652          | 3.6542 | 19.0    |
| No log        | 41.0  | 41   | 3.8937          | 3.6542 | 19.0    |
| No log        | 42.0  | 42   | 3.8232          | 3.6542 | 19.0    |
| No log        | 43.0  | 43   | 3.7526          | 3.6542 | 19.0    |
| No log        | 44.0  | 44   | 3.6845          | 3.6542 | 19.0    |
| No log        | 45.0  | 45   | 3.6196          | 3.6542 | 19.0    |
| No log        | 46.0  | 46   | 3.5549          | 3.6542 | 19.0    |
| No log        | 47.0  | 47   | 3.4897          | 3.6542 | 19.0    |
| No log        | 48.0  | 48   | 3.4227          | 3.6542 | 19.0    |
| No log        | 49.0  | 49   | 3.3559          | 3.6542 | 19.0    |
| No log        | 50.0  | 50   | 3.2901          | 3.6542 | 19.0    |
| No log        | 51.0  | 51   | 3.2237          | 3.6542 | 19.0    |
| No log        | 52.0  | 52   | 3.1568          | 3.6542 | 19.0    |
| No log        | 53.0  | 53   | 3.0880          | 3.6542 | 19.0    |
| No log        | 54.0  | 54   | 3.0184          | 3.6542 | 19.0    |
| No log        | 55.0  | 55   | 2.9428          | 3.6542 | 19.0    |
| No log        | 56.0  | 56   | 2.8787          | 3.6542 | 19.0    |
| No log        | 57.0  | 57   | 2.8177          | 3.6542 | 19.0    |
| No log        | 58.0  | 58   | 2.7606          | 3.6542 | 19.0    |
| No log        | 59.0  | 59   | 2.7053          | 3.6542 | 19.0    |
| No log        | 60.0  | 60   | 2.6458          | 3.6542 | 19.0    |
| No log        | 61.0  | 61   | 2.5915          | 3.6542 | 19.0    |
| No log        | 62.0  | 62   | 2.5416          | 3.6542 | 19.0    |
| No log        | 63.0  | 63   | 2.4929          | 3.6542 | 19.0    |
| No log        | 64.0  | 64   | 2.4465          | 3.6542 | 19.0    |
| No log        | 65.0  | 65   | 2.4007          | 3.6542 | 19.0    |
| No log        | 66.0  | 66   | 2.3560          | 3.6542 | 19.0    |
| No log        | 67.0  | 67   | 2.3136          | 3.6542 | 19.0    |
| No log        | 68.0  | 68   | 2.2712          | 3.6542 | 19.0    |
| No log        | 69.0  | 69   | 2.2313          | 3.6542 | 19.0    |
| No log        | 70.0  | 70   | 2.1924          | 3.6542 | 19.0    |
| No log        | 71.0  | 71   | 2.1563          | 3.6542 | 19.0    |
| No log        | 72.0  | 72   | 2.1213          | 3.6542 | 19.0    |
| No log        | 73.0  | 73   | 2.0885          | 3.6542 | 19.0    |
| No log        | 74.0  | 74   | 2.0577          | 3.6542 | 19.0    |
| No log        | 75.0  | 75   | 2.0293          | 3.6542 | 19.0    |
| No log        | 76.0  | 76   | 2.0023          | 3.6542 | 19.0    |
| No log        | 77.0  | 77   | 1.9762          | 3.6542 | 19.0    |
| No log        | 78.0  | 78   | 1.9514          | 3.6542 | 19.0    |
| No log        | 79.0  | 79   | 1.9288          | 3.6542 | 19.0    |
| No log        | 80.0  | 80   | 1.9076          | 3.6542 | 19.0    |
| No log        | 81.0  | 81   | 1.8876          | 3.6542 | 19.0    |
| No log        | 82.0  | 82   | 1.8691          | 3.6542 | 19.0    |
| No log        | 83.0  | 83   | 1.8520          | 3.6542 | 19.0    |
| No log        | 84.0  | 84   | 1.8362          | 3.6542 | 19.0    |
| No log        | 85.0  | 85   | 1.8217          | 1.2446 | 15.2    |
| No log        | 86.0  | 86   | 1.8080          | 1.2446 | 15.2    |
| No log        | 87.0  | 87   | 1.7957          | 0.1327 | 11.4    |
| No log        | 88.0  | 88   | 1.7846          | 0.1327 | 11.4    |
| No log        | 89.0  | 89   | 1.7743          | 0.1327 | 11.4    |
| No log        | 90.0  | 90   | 1.7651          | 0.1327 | 11.4    |
| No log        | 91.0  | 91   | 1.7569          | 0.1327 | 11.4    |
| No log        | 92.0  | 92   | 1.7493          | 0.1327 | 11.4    |
| No log        | 93.0  | 93   | 1.7426          | 0.1327 | 11.4    |
| No log        | 94.0  | 94   | 1.7367          | 0.1327 | 11.4    |
| No log        | 95.0  | 95   | 1.7320          | 0.1327 | 11.4    |
| No log        | 96.0  | 96   | 1.7273          | 0.1327 | 11.4    |
| No log        | 97.0  | 97   | 1.7235          | 0.1327 | 11.4    |
| No log        | 98.0  | 98   | 1.7200          | 0.1327 | 11.4    |
| No log        | 99.0  | 99   | 1.7170          | 0.1327 | 11.4    |
| No log        | 100.0 | 100  | 1.7142          | 0.1327 | 11.4    |
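The Bleu and Gen Len columns come from generation-based evaluation. The card does not include the metric code, so the following is only a sketch of a typical `compute_metrics` for this kind of run, using the `evaluate` library's sacrebleu metric; the `t5-small` tokenizer checkpoint is an assumption.

```python
import numpy as np
import evaluate
from transformers import AutoTokenizer

# Assumed base tokenizer; the compute_metrics actually used for this run is not shown in the card.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
sacrebleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Labels are padded with -100; swap that for the pad token before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = sacrebleu.compute(
        predictions=decoded_preds,
        references=[[label] for label in decoded_labels],
    )
    # Average number of non-pad tokens in the generated sequences.
    gen_len = np.mean([np.count_nonzero(p != tokenizer.pad_token_id) for p in preds])
    return {"bleu": result["score"], "gen_len": gen_len}
```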

### Framework versions